A Hybrid Neural Emotion Recogniser for Human-Robotic Agent Interaction

Alexandru Traista, Mark Elshaw

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review

1 Citation (Scopus)


This paper presents a hybrid neural approach to emotion recognition from speech, which combines feature selection using principal component analysis (PCA) with unsupervised neural clustering through a self-organising map (SOM). Given the importance that is associated with emotions in humans, it is unlikely that robots will be accepted as anything more than machines if they do not express and recognise emotions. In this paper, we describe the performance of an unsupervised approach to emotion recognition that achieves performance similar to current supervised intelligent approaches. Performance, however, drops when the system is tested on samples recorded with a low-cost microphone from a male volunteer who was not in the training set. Through the use of an unsupervised neural approach, it is possible to go beyond basic binary classification of emotions to consider the similarity between emotions and whether speech can express multiple emotions at the same time.
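The two-stage pipeline described in the abstract — PCA for feature selection followed by SOM clustering — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 10-dimensional synthetic "speech features", the two-cluster toy data, the map size, and the training schedule are all assumptions made for the example.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Toy stand-ins for acoustic feature vectors (e.g. pitch/energy statistics)
# drawn from two well-separated synthetic "emotion" clusters in 10-D.
calm = rng.normal(0.0, 0.5, size=(50, 10))
angry = rng.normal(3.0, 0.5, size=(50, 10))
X = np.vstack([calm, angry])

# --- Stage 1: PCA feature selection (keep the top 3 components) ---
Xc = X - X.mean(axis=0)                  # centre the data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T                        # projected features, shape (100, 3)

# --- Stage 2: a minimal 1-D self-organising map over the PCA features ---
n_units = 4
W = rng.normal(size=(n_units, 3))        # codebook (weight) vectors
epochs = 30
for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)      # decaying learning rate
    for z in Z:
        # best-matching unit: closest codebook vector to this sample
        bmu = int(np.argmin(np.linalg.norm(W - z, axis=1)))
        for j in range(n_units):
            # Gaussian neighbourhood: units near the BMU on the map
            # are pulled towards the sample more strongly
            h = np.exp(-((j - bmu) ** 2) / 2.0)
            W[j] += lr * h * (z - W[j])

# Label each sample by its best-matching unit; the two synthetic
# emotions should dominate different regions of the map.
labels = np.array([int(np.argmin(np.linalg.norm(W - z, axis=1))) for z in Z])
majority_calm = Counter(labels[:50]).most_common(1)[0][0]
majority_angry = Counter(labels[50:]).most_common(1)[0][0]
print(majority_calm != majority_angry)
```

Because the SOM preserves topology, nearby map units respond to similar inputs, which is what lets this approach express graded similarity between emotions rather than a hard binary decision.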
Original language: English
Title of host publication: Engineering Applications of Neural Networks. EANN 2012
Editors: Chrisina Jayne, Shigang Yue, Lazaros Iliadis
Place of Publication: Berlin
Publisher: Springer Verlag
Number of pages: 10
ISBN (Electronic): 978-3-642-32909-8
ISBN (Print): 978-3-642-32908-1
Publication status: Published - 2012
Event: 13th International Conference on Engineering Applications of Neural Networks - London, United Kingdom
Duration: 20 Sep 2012 - 23 Sep 2012
Conference number: 13


Conference: 13th International Conference on Engineering Applications of Neural Networks
Abbreviated title: EANN 2012
Country: United Kingdom


  • Emotion recognition
  • Social robot interaction
  • Unsupervised neural learning
