Decoding music’s amplitude envelope from neural and cardiovascular signals

Emily Graber*, Mateusz Solinski, Elaine Chew

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Poster abstract › peer-review

Abstract

Introduction

Objectively analyzing realistic music experiences remains a challenge in music cognition, with research focusing primarily on electroencephalographic (EEG) responses. One approach to determining whether the brain is processing and tracking a continuous stimulus such as music is to reconstruct, or decode, the stimulus from the neural data. Both speech and music have been successfully decoded from EEG responses using regularized linear modelling techniques in which lagged response data are weighted and summed to estimate the stimulus (Zuk et al., 2021).
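
As a minimal sketch of this backward-modelling idea (not the authors' exact pipeline), the decoder below builds a design matrix of time-lagged response samples and fits ridge-regression weights that map them onto the stimulus envelope; the function names and the regularization value are illustrative assumptions.

```python
import numpy as np

def lagged_design(response, lags):
    """Stack time-shifted copies of the response channels.

    response : (n_samples, n_channels) array of neural/physiological data
    lags     : integer sample shifts; convention: shifted[t] = response[t - lag]
    Returns an (n_samples, n_channels * n_lags) design matrix.
    """
    n, c = response.shape
    cols = []
    for lag in lags:
        shifted = np.zeros((n, c))
        if lag >= 0:
            shifted[lag:] = response[:n - lag]
        else:
            shifted[:lag] = response[-lag:]
        cols.append(shifted)
    return np.hstack(cols)

def fit_ridge_decoder(response, stimulus, lags, lam=100.0):
    """Closed-form ridge solution: w = (X'X + lam*I)^(-1) X'y."""
    X = lagged_design(response, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ stimulus)

def decode(response, weights, lags):
    """Reconstruct the stimulus envelope from held-out response data."""
    return lagged_design(response, lags) @ weights
```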

Here, we investigate whether music can be decoded from cardiovascular signals such as inter-heartbeat intervals (RR intervals), which are mediated by the autonomic nervous system and have been shown to change in response to music (Koelsch & Jancke, 2015). To our knowledge, this decoding method has not yet been applied to electrocardiographic (ECG) responses to continuous music stimuli. Using a continuous listening paradigm with acoustic music, we aim to determine whether cardiovascular signals, alone or together with EEG, can be used to model and predict music amplitude envelopes over time, and how this compares with existing EEG-only approaches.

Methods

EEG and ECG signals were recorded from 24 participants as they listened to two 15-minute sets of music excerpts, one from the Baroque genre (Glenn Gould - Bach Goldberg Variations) and one from the Contemporary Classical genre (Alexandros Markeas - Improv). The stimuli were re-rendered acoustically on a reproducing piano.

To analyze these data, RR intervals were extracted from the ECG signals and interpolated to create a continuous cardiovascular data channel. Decoding models were computed from the EEG channels alone, from the cardiovascular channel alone, and from their combination, over lags from –100 to 400 ms, using the mTRF Toolbox (Crosse et al., 2016). The models were used to predict the music amplitude envelopes, and prediction accuracies were assessed by correlating the predicted envelopes with the recorded envelopes under 10-fold cross-validation (leave one fold out per iteration) within each subject, for each music genre.
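
The sketch below illustrates these preprocessing and evaluation steps in Python rather than in the MATLAB mTRF Toolbox actually used; the sampling rate, interpolation choice, and fold scheme are assumptions, and fit_ridge_decoder and decode refer to the helpers in the earlier sketch.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.stats import pearsonr

def rr_to_channel(rr_ms, fs=64.0):
    """Resample an irregular RR-interval series onto a uniform time grid.

    rr_ms : successive inter-beat intervals in milliseconds
    fs    : target rate in Hz (assumed; matched to the envelope/EEG rate)
    """
    beat_times = np.cumsum(rr_ms) / 1000.0              # beat onsets, seconds
    t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    return t, interp1d(beat_times, rr_ms, kind="cubic")(t)

def crossval_decoding_r(response, envelope, fs=64.0, lam=100.0, n_folds=10):
    """Leave-one-fold-out evaluation: train a decoder on all but one
    contiguous segment, reconstruct the envelope of the held-out segment,
    and score the reconstruction with Pearson's r (averaged over folds)."""
    lags = np.arange(int(-0.100 * fs), int(0.400 * fs) + 1)  # -100 to 400 ms
    folds = np.array_split(np.arange(len(envelope)), n_folds)
    rs = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([f for j, f in enumerate(folds) if j != k])
        w = fit_ridge_decoder(response[train], envelope[train], lags, lam)
        rs.append(pearsonr(decode(response[test], w, lags), envelope[test])[0])
    return float(np.mean(rs))
```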

Results

For the Baroque genre, the amplitude envelope of the music could be decoded from (1) the EEG data alone; (2) a combination of the EEG and RR intervals; and (3) the RR intervals alone. Moreover, the mean decoding accuracy given by Pearson’s r using RR intervals alone [M=.120, SD=.056] or in combination with EEG [M=.123, SD=.058] was significantly better than that obtained using EEG data alone [M=.063, SD=.032; RR-EEG: t(21)=4.33, p=0.0009; MIX-EEG: t(21)=5.52, p=…].
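
The comparisons reported above are paired tests across participants; the sketch below shows how such comparisons can be computed, using placeholder per-subject accuracy arrays drawn from the reported means and standard deviations (the 21 degrees of freedom imply 22 paired observations).

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects = 22  # t(21) implies 22 paired observations

# Placeholder per-subject mean decoding accuracies (Pearson's r); in
# practice these come from the cross-validation step for each model type.
r_eeg = rng.normal(0.063, 0.032, n_subjects)
r_rr  = rng.normal(0.120, 0.056, n_subjects)
r_mix = rng.normal(0.123, 0.058, n_subjects)

t_rr,  p_rr  = ttest_rel(r_rr,  r_eeg)   # RR-only vs. EEG-only
t_mix, p_mix = ttest_rel(r_mix, r_eeg)   # combined vs. EEG-only
print(f"RR-EEG:  t({n_subjects - 1})={t_rr:.2f}, p={p_rr:.4f}")
print(f"MIX-EEG: t({n_subjects - 1})={t_mix:.2f}, p={p_mix:.4f}")
```
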
Discussion

We show that stimulus-decoding methods developed for EEG can be applied to cardiovascular signals; however, modelling success may depend on the specific stimuli or on the consistency of the physiological responses to them. The observed difference between Baroque and Contemporary modelling could arise from more variable responses to the Contemporary music.

Conclusion

Overall, this work proposes an objective way to analyze continuous music listening experiences. Preliminary results suggest that ECG outperforms EEG for reconstructing the music amplitude envelope, particularly for the Baroque music stimuli. Additionally, ECG decoding may provide insight into how consistently or how well listeners understand and internalize the music feature being modelled.

Original language: English
Title of host publication: e-Proceedings of the 27th International Conference on Music Perception and Cognition and the 7th Conference of the Asia-Pacific Society for the Cognitive Sciences of Music
Editors: Minoru Tsuzaki, Makiko Sadakata, Shimpei Ikegami, Toshie Matsui, Masahiro Okano, Haruka Shoda
Place of Publication: Tokyo, Japan
Number of pages: 1
Volume: 17
Publication status: Published - 2023
Event: International Conference on Music Perception and Cognition - Nihon University, Tokyo, Japan
Duration: 24 Aug 2023 – 28 Aug 2023
Conference number: 17
https://jsmpc.org/ICMPC17/

Conference

Conference: International Conference on Music Perception and Cognition
Abbreviated title: ICMPC
Country/Territory: Japan
City: Tokyo
Period: 24/08/2023 – 28/08/2023
Internet address: https://jsmpc.org/ICMPC17/

Keywords

  • music perception
  • music cognition
  • neuroscience
  • cardiovascular science
  • cognitive science
