Smartphone Based Human Activity and Postural Transition Classification with Deep Stacked Autoencoder Networks

Luke Hicks, Yih-Ling Hedley, Mark Elshaw, Abdulrahman Altahhan, Vasile Palade

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)

Abstract

Human activity recognition (HAR) is a prominent research area that has attracted considerable interest in recent years.
Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2016
Editors: Alessandro E. P. Villa, Paolo Masulli, Antonio Javier Pons Rivero
Place of publication: Switzerland
Publisher: Springer Verlag
Pages: 535-536
Volume: 9887
ISBN (Print): 978-3-319-44780-3, 978-3-319-44781-0
DOIs: https://doi.org/10.1007/978-3-319-44781-0
Publication status: Published - 2016
Event: The 25th International Conference on Artificial Neural Networks - Barcelona, Spain
Duration: 6 Sep 2016 – 9 Sep 2016

Conference

Conference: The 25th International Conference on Artificial Neural Networks
Abbreviated title: ICANN 2016
Country: Spain
City: Barcelona
Period: 6/09/16 – 9/09/16

Bibliographical note

The full text is available from http://dx.doi.org/10.1007/978-3-319-44781-0


Cite this

    Hicks, L., Hedley, Y-L., Elshaw, M., Altahhan, A., & Palade, V. (2016). Smartphone Based Human Activity and Postural Transition Classification with Deep Stacked Autoencoder Networks. In A. E. P. Villa, P. Masulli, & A. J. P. Rivero (Eds.), Artificial Neural Networks and Machine Learning – ICANN 2016 (Vol. 9887, pp. 535-536). Switzerland: Springer Verlag. https://doi.org/10.1007/978-3-319-44781-0