Automatic Music Playlist Generation Using Affective Technologies

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review


This paper discusses how human emotion could be quantified using contextual and physiological
information gathered from a range of sensors, and how these data could then be used to
automatically generate music playlists. I begin by discussing existing affective systems that
automatically generate playlists based on human emotion, and then consider current work in audio
description analysis. A system is proposed that measures human emotion from contextual and
physiological data using a range of sensors, from temperature and light through to EDA
(electrodermal activity) and ECG (electrocardiogram). The concluding section describes the
progress achieved so far, which includes defining datasets using a conceptual design,
microprocessor electronics, and data acquisition using MATLAB. Lastly, there is a brief
discussion of future plans to develop this research.
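The mapping the abstract proposes, from physiological and contextual sensor readings to an emotion estimate that drives playlist selection, can be sketched in a few lines. The weights, field names, and the arousal/valence quadrant labels below are illustrative assumptions for this sketch, not values taken from the paper.

```python
# Minimal sketch: combine normalised sensor readings (each scaled 0..1)
# into a crude arousal/valence estimate, then map that estimate to a
# playlist mood. All weights and thresholds here are hypothetical.

def estimate_affect(eda, heart_rate, light, temperature):
    """Physiological signals drive arousal; contextual signals drive valence."""
    arousal = 0.5 * eda + 0.5 * heart_rate       # EDA and ECG-derived heart rate
    valence = 0.6 * light + 0.4 * temperature    # ambient light and temperature
    return arousal, valence

def playlist_mood(arousal, valence):
    """Simple quadrant mapping on the arousal/valence plane."""
    if arousal >= 0.5:
        return "energetic" if valence >= 0.5 else "tense"
    return "content" if valence >= 0.5 else "calm"

mood = playlist_mood(*estimate_affect(eda=0.8, heart_rate=0.7,
                                      light=0.9, temperature=0.6))
print(mood)  # high arousal, high valence -> "energetic"
```

A fuzzy-logic version, as the keywords suggest, would replace the hard thresholds with membership functions so that readings near a boundary blend two moods rather than switching abruptly between them.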
Original language: English
Title of host publication: Internet Technologies and Applications (ITA), 2013
Editors: Rich Pickering, Stuart Cunningham, Nigel Houlden, Denise Oram, Vic Grout, Julie Mayers
Publisher: North East Wales Institute
ISBN (Print): 9780946881819
Publication status: Published - 2013
Externally published: Yes
Event: Fifth International Conference on Internet Technologies & Applications - Wrexham, United Kingdom
Duration: 10 Sep 2013 – 13 Sep 2013
Conference number: 5


Conference: Fifth International Conference on Internet Technologies & Applications
Abbreviated title: ITA13
Country/Territory: United Kingdom


Keywords:
  • Digital music
  • Playlist generation
  • Affective computing
  • Fuzzy logic
  • Neural nets


