We present a spiking neural network architecture for recognising human language instructions and controlling a robot. The network is based on a model of a leaky integrate-and-fire (LIAF) spiking neuron with active dendrites and dynamic synapses (ADDS) [1,2,3]. The architecture comprises several main modules that associate information across modalities: an auditory system that recognises single spoken words, a visual system that recognises objects of different colours and shapes, a motor control system for navigation, and a working memory. The main focus of this presentation is the working memory module, whose functions are the sequential processing of words from a language instruction, the representation of tasks and goals, and the cross-modal association of objects and actions. We test the model with a robot whose goal is to recognise and execute language instructions. The work demonstrates the potential of spiking neurons for processing spatio-temporal patterns, and the experiments present spiking neural networks as a paradigm that can be applied to modelling word-level sequence detectors for robot instructions.
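The base model named in the abstract, the leaky integrate-and-fire neuron, can be sketched in a few lines. This is a generic textbook LIF simulation under assumed parameter values (time constant, threshold, reset), not the ADDS model from the paper, which additionally includes active dendrites and dynamic synapses.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron on a sampled input
    current trace; return the list of spike times.

    All parameter values are illustrative assumptions, not taken
    from the paper.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += (dt / tau) * (-(v - v_rest) + i_in)
        if v >= v_thresh:            # threshold crossing: emit a spike
            spike_times.append(step * dt)
            v = v_reset              # reset membrane potential after firing
    return spike_times
```

A constant suprathreshold input produces regular spiking, while a subthreshold input produces none; the leak term is what distinguishes this model from a pure integrator.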
Title of host publication: Biomimetic Neural Learning for Intelligent Robots
Editors: Stefan Wermter, Gunter Palm, Mark Elshaw
Place of publication: Berlin
Number of pages: 29
Publication status: Published - 2005
Series: Lecture Notes in Computer Science