An Interactive Music Playlist Generator that Responds to User Emotion and Context

Darryl Griffiths, Stuart Cunningham, Jonathan Weinel

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding

Abstract

This paper demonstrates the mechanisms of a music recommendation system, and its accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual’s emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon ‘how the user is feeling’ and ‘what the user is doing’ by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature, in conjunction with ambient light, temperature and Global Positioning System (GPS) data, could be used, to a degree, to infer one’s current situation and corresponding mood.
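
The abstract describes this sensor-driven inference only at a high level. The short Python sketch below is purely illustrative of the idea, mapping one hypothetical snapshot of physiological and contextual readings to a coarse context and arousal estimate; all field names, units and thresholds are assumptions, and the fixed rules stand in for the machine-learning models the paper's system actually uses.

from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    """One hypothetical reading from the device sensors (units assumed)."""
    heart_rate_bpm: float     # physiological
    skin_temp_c: float        # physiological
    ambient_light_lux: float  # contextual: ambient light level
    ambient_temp_c: float     # contextual: ambient temperature
    speed_m_s: float          # contextual: speed derived from successive GPS fixes

def infer_state(s: SensorSnapshot) -> tuple[str, str]:
    """Toy rule-based stand-in for the paper's machine-learning inference.

    Returns a (context, arousal) pair, e.g. ('running outdoors', 'high arousal').
    """
    # Locomotion from GPS-derived speed, using rough walking/running thresholds.
    if s.speed_m_s > 2.5:
        context = "running outdoors" if s.ambient_light_lux > 1000 else "running indoors"
    elif s.speed_m_s > 0.5:
        context = "walking"
    else:
        context = "stationary"
    # Arousal proxy: elevated heart rate or skin temperature suggests high arousal.
    arousal = "high arousal" if s.heart_rate_bpm > 100 or s.skin_temp_c > 34.5 else "low arousal"
    return context, arousal

print(infer_state(SensorSnapshot(120.0, 35.0, 20000.0, 18.0, 3.2)))
# -> ('running outdoors', 'high arousal')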

At present, this interactive music playlist generator can demonstrate, conceptually, how a playlist may be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell’s circumplex model of affect (1980). Context refers to environmental, locomotion and activity concepts, which are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail in Sections 2 and 3, respectively. Section 4 demonstrates how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion of the development of this research, refer to Griffiths et al. (2013a, 2013b, 2015); see Teng et al. (2013) and Yang et al. (2008) for related work in these broader research areas.
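
To make the arousal-valence interaction concrete, here is a minimal, hypothetical sketch of playlist formation over such a space: songs carry assumed (valence, arousal) annotations in [-1, 1], and a playlist is built from those nearest, by Euclidean distance, to the point the user selects in the GUI. The song annotations and the nearest-neighbour selection are illustrative assumptions, not necessarily the method the system uses.

import math

# Hypothetical song annotations: (valence, arousal) coordinates in [-1, 1],
# following Russell's circumplex model (valence = pleasantness, arousal = energy).
LIBRARY = {
    "Song A": (0.8, 0.7),    # happy / energetic
    "Song B": (0.6, -0.5),   # content / calm
    "Song C": (-0.7, 0.6),   # tense / agitated
    "Song D": (-0.5, -0.6),  # sad / subdued
    "Song E": (0.2, 0.1),    # near-neutral
}

def playlist_for(valence: float, arousal: float, length: int = 3) -> list[str]:
    """Return the `length` songs closest to the selected point in affect space."""
    def distance(item: tuple[str, tuple[float, float]]) -> float:
        _, (v, a) = item
        return math.hypot(v - valence, a - arousal)
    return [title for title, _ in sorted(LIBRARY.items(), key=distance)[:length]]

# A selection near 'excited' (positive valence, high arousal):
print(playlist_for(0.7, 0.6))  # -> ['Song A', 'Song E', 'Song B']

In the full interface, the context sliders described above could then act as additional filters or re-weightings over such a ranked list.
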
Original language: English
Title of host publication: Electronic Visualisation and the Arts (EVA 2016)
Publisher: British Computer Society
Pages: 275-276
Number of pages: 2
DOIs: 10.14236/ewic/EVA2016.53
Publication status: Published - 2016
Externally published: Yes
Event: Electronic Visualisation and the Arts (EVA 2016) - London, United Kingdom
Duration: 12 Jul 2016 - 14 Jul 2016

Conference

Conference: Electronic Visualisation and the Arts (EVA 2016)
Country: United Kingdom
City: London
Period: 12/07/16 - 14/07/16


Bibliographical note

The eWiC Series gives free access to the proceedings of workshops and conferences on the broadest possible range of computing topics.

Cite this

Griffiths, D., Cunningham, S., & Weinel, J. (2016). An Interactive Music Playlist Generator that Responds to User Emotion and Context. In Electronic Visualisation and the Arts (EVA 2016) (pp. 275-276). British Computer Society. https://doi.org/10.14236/ewic/EVA2016.53

@inproceedings{d82b1f70a61e4e04ad7e3fc56a939d75,
title = "An Interactive Music Playlist Generator that Responds to User Emotion and Context",
abstract = "This paper aims to demonstrate the mechanisms of a music recommendation system, and accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual’s emotion or context. This interactive music playlist generator has been designed as part of a broader system, Intended for mobile devices, which aims to suggest music based upon ‘how the user is feeling’ and ‘what the user is doing’ by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature in conjunction with ambient light, temperature and global positioning satellite (GPS) could be used to a degree to infer one’s current situation and corresponding mood.At present, this interactive music playlist generator has the ability to conceptually demonstrate how a playlist can be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell’s circumplex model of affect (1980).Context refers to environmental, locomotion and activity concepts, and are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail next in Sections 2 and 3, respectively. Section 4 will demonstrate how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion in terms of the development of this research, refer to (Griffiths et al. 2013a, 2013b, 2015). Moreover, refer to Teng et al. (2013) and Yang et al. (2008) for related work in these broader research areas.",
author = "Darryl Griffiths and Stuart Cunningham and Jonathan Weinel",
note = "The eWiC Series gives free access to the proceedings of workshops and conferences on the broadest possible range of computing topics.",
year = "2016",
doi = "10.14236/ewic/EVA2016.53",
language = "English",
pages = "275--276",
booktitle = "Electronic Visualisation and the Arts (EVA 2016)",
publisher = "British Computer Society",

}
