Year
2012
Document Type
Conference Paper
Title
Emotion recognition in Western popular music
Title Supplement
The role of melodic structure
Abstract
Background: Music Emotion Recognition (MER) involves modelling the relationship between musical features and expressed emotion. Previous work in this field has concentrated on the extraction of spectrally derived acoustical and psychoacoustical features. However, this approach has reached a 'glass ceiling' with respect to the accuracy with which MER algorithms can identify musical emotion (Aucouturier & Pachet 2004).

Aims: This paper adopts a wider view of emotional expression in music by considering the musical communication process (Miell, MacDonald & Hargreaves 2005; Juslin & Timmers 2010). Higher-level structural elements of music, specifically melodic structure, are incorporated into the feature extraction process. A study is introduced in which participants use a two-dimensional, time-continuous measurement methodology to rate the emotion expressed by musical pieces. The musical stimuli are then analyzed using feature extraction algorithms, and a statistical analysis of these measures is performed with the aim of identifying correlations between melodic structural features and expressed emotion.

Method: We presented 30 subjects with 20 specially prepared musical stimuli consisting of clips of Western contemporary music. To control for possible effects of familiarity and personal associations, novel, unreleased tracks were chosen. Vocal melodies were retained, but the lyrics were replaced with wordless vocables, nonsense syllables such as 'la' and 'do' used in a variety of musical styles (Sadie, Tyrrell & Levy 2001). Subjects rated the stimuli using Russell's two-dimensional circumplex model (Russell 1980). The ratings were statistically summarized to produce an overall arousal and valence value for each stimulus, and these measurements were then correlated with the extracted melodic structural features.

Results: Structural vocal melody features correlated with both arousal and valence measures. Repeated notes and note density correlated positively with arousal. Average melodic interval and melodic range correlated positively with valence, indicating that large intervallic movement and a wide overall pitch range expressed happier, positively valenced emotions.

Conclusions: The results of this analysis indicate that structural vocal melodic features contribute to the expressed emotion of Western popular music, regardless of lyrical content. These findings suggest that an increase in accuracy may be achieved by including structural features in existing MER algorithms. In particular, they point to better discrimination of the valence dimension, a well-known problem in existing MER systems (Yang, Lin, Su & Chen 2008; Eerola, Lartillot & Toiviainen 2009).
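To make the analysis concrete, the following Python sketch is a hypothetical illustration rather than the authors' implementation: it assumes the vocal melodies are available as symbolic (onset, MIDI pitch) note lists, that the time-continuous ratings have already been summarized to one arousal or valence value per clip, and that Pearson's r is the correlation measure. The feature definitions (e.g. repeated notes as the proportion of zero-semitone intervals) and all toy data are plausible readings of the feature names, not the paper's algorithms.

```python
# Illustrative sketch only: compute the four melodic structural features named
# in the abstract from a symbolic note list, then correlate each feature with a
# per-clip emotion summary. Feature definitions and data are assumptions.
from statistics import mean
from scipy.stats import pearsonr

def melodic_features(notes, clip_seconds):
    """notes: list of (onset_seconds, midi_pitch) for one clip's vocal melody."""
    pitches = [pitch for _, pitch in notes]
    intervals = [abs(b - a) for a, b in zip(pitches, pitches[1:])]
    return {
        "note_density": len(notes) / clip_seconds,                     # notes per second
        "repeated_notes": sum(i == 0 for i in intervals) / max(len(intervals), 1),
        "avg_interval": mean(intervals) if intervals else 0.0,         # semitones
        "melodic_range": max(pitches) - min(pitches),                  # semitones
    }

# Toy data: three clips' melodies and made-up per-clip arousal summaries.
clips = [
    [(0.0, 60), (0.5, 60), (1.0, 62), (1.5, 64)],
    [(0.0, 55), (0.7, 67), (1.4, 59), (2.1, 72)],
    [(0.0, 64), (1.0, 64), (2.0, 64)],
]
arousal = [0.2, 0.6, -0.1]

features = [melodic_features(notes, clip_seconds=30.0) for notes in clips]
for name in ("note_density", "repeated_notes", "avg_interval", "melodic_range"):
    r, p = pearsonr([f[name] for f in features], arousal)
    print(f"{name}: r={r:.2f}, p={p:.2f}")
```

In a study of the scale described here, the same loop would simply run over 20 clips and over both arousal and valence summaries; nothing in the sketch depends on the toy values shown.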
Keyword(s)