Self Organizing Neural Place Codes for Vision Based Robot Navigation

Kaustubh Chokshi, Stefan Wermter, Christo Panchev, K Burn

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review

Abstract

Autonomous robots must be able to navigate independently within an environment. In the animal brain, so-called place cells fire when the animal is at a particular location in its environment. We present a model of place cells based on self-organising maps. The aim of this paper is to show how image invariance can improve the performance of the neural place codes and make the model more robust to noise. The paper also demonstrates that localisation can be learned without a pre-defined map being given to the robot by humans, and that after training a robot can localise itself within a learned environment.
Original language: English
Title of host publication: 2004 IEEE International Joint Conference on Neural Networks
Subtitle of host publication: Proceedings: Budapest, Hungary, 25-29 July 2004
Publisher: IEEE
Pages: 2501-2506
Number of pages: 6
Volume: 4
ISBN (Print): 0-7803-8359-1
Publication status: Published - 2004
Externally published: Yes
Event: 2004 IEEE International Joint Conference on Neural Networks - Budapest, Hungary
Duration: 25 Jul 2004 - 29 Jul 2004

Conference

Conference: 2004 IEEE International Joint Conference on Neural Networks
Abbreviated title: IJCNN
Country/Territory: Hungary
City: Budapest
Period: 25/07/04 - 29/07/04
