Episodic memory multimodal learning for robot sensorimotor map building and navigation

Wei Hong Chin, Yuichiro Toda, Naoyuki Kubota, Chu Kiong Loo, Manjeevan Seera

Research output: Contribution to journal › Article › Research › peer-review


Abstract

In this paper, an unsupervised learning model of episodic memory is proposed. The proposed model, enhanced episodic memory adaptive resonance theory (EEM-ART), categorizes and encodes a robot's experiences of its environment and generates a cognitive map. EEM-ART consists of multilayer ART networks that extract novel events and encode spatio-temporal connections as episodes by incrementally generating cognitive neurons. The model links episodes to construct a sensorimotor map that the robot uses to continuously perform path planning and goal navigation. Experimental results on a mobile robot indicate that EEM-ART can process multiple sensory sources, learning events and encoding episodes simultaneously. The model overcomes perceptual aliasing and localizes the robot by recalling encoded episodes with a new anticipation function, and it generates a sensorimotor map that connects episodes so tasks can be executed continuously with little to no human intervention.
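The abstract gives no implementation details, but the mechanism it names (ART networks that categorize sensory input and recruit a new cognitive neuron whenever no existing category resonates) can be illustrated with a minimal fuzzy ART sketch. The Python below is an illustrative assumption, not the paper's code: the class name, parameter values, and complement coding follow standard fuzzy ART practice rather than EEM-ART specifics.

import numpy as np

class FuzzyART:
    """Minimal fuzzy ART layer: categorizes complement-coded inputs and
    creates a new category neuron when no existing one resonates.
    Illustrative sketch only; not the EEM-ART implementation."""

    def __init__(self, input_dim, vigilance=0.75, choice=0.001, lr=1.0):
        self.rho = vigilance      # match threshold for resonance
        self.alpha = choice       # small positive choice parameter
        self.beta = lr            # learning rate
        self.dim = input_dim
        self.weights = []         # one weight vector per category neuron

    def _complement_code(self, x):
        x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
        return np.concatenate([x, 1.0 - x])

    def learn(self, x):
        """Present one input; return the index of the resonating category."""
        i = self._complement_code(x)
        # Rank existing categories by the choice function T_j = |i ^ w| / (alpha + |w|)
        scores = [np.sum(np.minimum(i, w)) / (self.alpha + np.sum(w))
                  for w in self.weights]
        for j in np.argsort(scores)[::-1]:
            w = self.weights[j]
            match = np.sum(np.minimum(i, w)) / np.sum(i)
            if match >= self.rho:  # resonance: move the prototype toward i ^ w
                self.weights[j] = self.beta * np.minimum(i, w) + (1 - self.beta) * w
                return j
        # No category resonates: recruit a new neuron (incremental growth)
        self.weights.append(i.copy())
        return len(self.weights) - 1

# Example: categorize a stream of normalized sensor vectors
# art = FuzzyART(input_dim=4)
# labels = [art.learn(v) for v in sensor_vectors]

In EEM-ART, several such layers would be stacked so that the categories of one layer (events) feed the next (episodes); that stacking, the anticipation function, and the sensorimotor map are described in the full paper.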

Original language: English
Pages (from-to): 210-220
Number of pages: 11
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 11
Issue number: 2
DOIs
Publication status: Published - Jun 2019
Externally published: Yes

Keywords

  • Adaptive resonance theory (ART)
  • episodic memory
  • robot navigation
