
EMOTIFY - Emotion Classification In Songs

Multi-label Emotion Classification in Songs

@kaggle.yash9439_emotify_emotion_classificaiton_in_songs

About this Dataset

EMOTIFY - Emotion Classification In Songs

The dataset consists of 400 song excerpts (1 minute long) in 4 genres (rock, classical, pop, electronic). The annotations were collected using the GEMS scale (Geneva Emotional Music Scales) [1]. Each participant could select at most three items from the scale (the emotions that they felt strongly while listening to the song). Below is the description of the emotional categories as found in the game.
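For orientation, here is a minimal sketch of how the per-participant annotations might be inspected, assuming a single CSV with one row per (participant, song) response and a 0/1 column per GEMS category. The file name and column names are placeholders, not the dataset's documented schema, so check the actual CSV header before running this.

```python
import pandas as pd

# The nine GEMS categories; column names below are assumptions,
# not the dataset's documented schema.
GEMS_CATEGORIES = [
    "amazement", "solemnity", "tenderness", "nostalgia", "calmness",
    "power", "joyful_activation", "tension", "sadness",
]

# Hypothetical file name -- replace with the actual annotations CSV.
df = pd.read_csv("data.csv")

# Each row is one participant's response to one excerpt; the GEMS
# columns are 0/1 flags, with at most three set per row.
selected_per_response = df[GEMS_CATEGORIES].sum(axis=1)
print(selected_per_response.max())  # should not exceed 3 if the schema matches
```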

The annotations produced by the game are spread unevenly among the songs, which is caused by the design of both the experiment and the game. Participants could skip songs and switch between genres, and they were encouraged to do so, because an induced emotional response does not automatically occur on every music listening occasion. Therefore, genres that were less popular among our particular sample of participants received fewer annotations, and the same happened to less popular songs. Moreover, for the purposes of analysis we split the songs into two subsets. On average, each song in one subset is annotated by 48 people, and each song in the other by 16.
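Because the annotations are spread unevenly, a common first step is to aggregate them per song, for example into label frequencies. A sketch under the same assumed schema as above, with hypothetical "track_id" and "genre" identifier columns:

```python
import pandas as pd

# Same assumed schema as the previous sketch: one row per
# (participant, song) response with 0/1 GEMS columns.
GEMS_CATEGORIES = [
    "amazement", "solemnity", "tenderness", "nostalgia", "calmness",
    "power", "joyful_activation", "tension", "sadness",
]

df = pd.read_csv("data.csv")  # hypothetical file name

# Fraction of a song's annotators who selected each emotion -- one way
# to turn uneven per-participant labels into per-song multi-label targets.
per_song = df.groupby(["track_id", "genre"])[GEMS_CATEGORIES].mean()

# Annotations per song: uneven by design, as described above.
counts = df.groupby("track_id").size()
print(counts.describe())
print(per_song.head())
```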

Please cite the following if you use this dataset:
A. Aljanaki, F. Wiering, R. C. Veltkamp. Studying emotion induced by music through a crowdsourcing game. Information Processing & Management, 2015.
