TY - JOUR
T1 - Examining Emotion Perception Agreement in Live Music Performance
AU - Yang, Simin
AU - Reed, Courtney
AU - Chew, Elaine
AU - Barthet, Mathieu
PY - 2021/6/30
Y1 - 2021/6/30
N2 - Current music emotion recognition (MER) systems rely on emotion data averaged across listeners and over time to infer the emotion expressed by a musical piece, often neglecting time- and listener-dependent factors. These limitations can restrict the efficacy of MER systems and cause misjudgements. In a live music concert setting, fifteen audience members annotated perceived emotion in the valence-arousal space over time using a mobile application. Analyses of inter-rater reliability yielded widely varying levels of agreement in the perceived emotions. A follow-up lab study was conducted to uncover the reasons for this variability: twenty-one listeners annotated their perceived emotions while listening to a recording of the original performance and offered open-ended explanations. Thematic analysis reveals salient features and interpretations that describe the underlying cognitive processes. Some results confirm known findings from music perception and MER studies. Novel findings highlight the importance of less frequently discussed musical attributes, such as musical structure, performer expression, and stage setting, as perceived across different modalities. Musicians attribute emotion changes to musical harmony, structure, and performance technique more than non-musicians do. We suggest that listener-informed musical features can benefit MER in addressing emotion perception variability by providing reasons for listener similarities and idiosyncrasies.
KW - music and emotion
KW - music perception
KW - inter-rater reliability
KW - individual factors
KW - live performance
KW - music emotion recognition
KW - music information retrieval
KW - computational modeling of human behavior
UR - https://bit.ly/BabajanianTrio
DO - 10.1109/TAFFC.2021.3093787
M3 - Article
SN - 1949-3045
SP - 1
EP - 1
JO - IEEE Transactions on Affective Computing
JF - IEEE Transactions on Affective Computing
ER -