Abstract
Introduction
Objectively analyzing realistic music experiences remains a challenge in music cognition, with research focusing primarily on electroencephalographic (EEG) responses. One approach to determining whether the brain is processing and tracking a continuous stimulus like music is to reconstruct it, or decode it, from the neural data. Both speech and music have been successfully decoded from EEG responses using regularized linear modelling techniques where lagged response data is weighted and summed to estimate the stimulus (Zuk et al., 2021).
Here, we investigate whether music can be decoded from cardiovascular signals, such as inter-heartbeat intervals (RR intervals), which are mediated by the autonomic nervous system and have been shown to change in response to music (Koelsch & Jäncke, 2015). To our knowledge, this decoding method has not yet been applied to electrocardiographic (ECG) responses to continuous music stimuli. Using a continuous listening paradigm with acoustic music, we aim to determine whether cardiovascular signals, alone or together with EEG, can be used to model and predict music amplitude envelopes over time, and how this compares with existing EEG-only approaches.
Methods
EEG and ECG signals were recorded from 24 participants as they listened to two 15-minute sets of music excerpts, one from the Baroque genre (Bach's Goldberg Variations, performed by Glenn Gould) and one from the Contemporary Classical genre (Alexandros Markeas, Improv). The stimuli were re-rendered acoustically on a reproducing piano.
To analyze these data, RR intervals were extracted from the ECG signals and interpolated to form a uniformly sampled cardiovascular data channel. Decoding models were computed from this channel, alone and together with the EEG channels, over lags from –100 to 400 ms using the mTRF Toolbox (Crosse et al., 2016). The models were used to predict the music amplitude envelopes, and prediction accuracies were assessed by correlating the predicted envelopes with the recorded envelopes under 10-fold (leave-one-fold-out) cross-validation within each subject, for each music genre.
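The backward (stimulus-reconstruction) model described above can be sketched as a ridge-regularized regression on time-lagged response samples. This is a minimal illustration, not the mTRF Toolbox implementation; the lag range (in samples rather than milliseconds), the ridge parameter, and the toy signals are assumptions for demonstration only.

```python
import numpy as np

def lag_matrix(response, lags):
    """Stack time-lagged copies of a (time, channels) response array.

    Each column block holds the response shifted by one lag (in samples);
    samples shifted past the edges are zero-padded.
    """
    n_times, n_chans = response.shape
    X = np.zeros((n_times, n_chans * len(lags)))
    for i, lag in enumerate(lags):
        src = np.roll(response, -lag, axis=0)
        if lag > 0:
            src[-lag:] = 0.0     # zero the wrapped-around tail
        elif lag < 0:
            src[:-lag] = 0.0     # zero the wrapped-around head
        X[:, i * n_chans:(i + 1) * n_chans] = src
    return X

def fit_decoder(response, envelope, lags, lam=1.0):
    """Backward model: ridge weights mapping lagged response samples
    to the stimulus amplitude envelope."""
    X = lag_matrix(response, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ envelope)

def predict_envelope(response, weights, lags):
    return lag_matrix(response, lags) @ weights

# Toy check: decode an envelope from a noisy, delayed copy of itself.
rng = np.random.default_rng(0)
env = rng.standard_normal(2000)
resp = np.roll(env, 5)[:, None] + 0.1 * rng.standard_normal((2000, 1))
lags = list(range(0, 11))        # 0-10 samples; the study used -100 to 400 ms
w = fit_decoder(resp, env, lags, lam=1e-2)
pred = predict_envelope(resp, w, lags)
r = np.corrcoef(pred, env)[0, 1]  # prediction accuracy as Pearson's r
```

Decoding accuracy is then the Pearson correlation between the predicted and recorded envelopes, computed per cross-validation fold, as in the analysis above.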
Results
For the Baroque genre, the amplitude envelope of the music could be decoded from (1) the EEG data alone; (2) a combination of the EEG and RR intervals; and, (3) from the RR intervals alone. Moreover, the mean decoding accuracy given by Pearson’s r using RR intervals alone [M=.120, SD=.056] or in combination with EEG [M=.123, SD=.058] was significantly better than that obtained using EEG data alone [M=.063, SD=.032; RR-EEG: t(21)=4.33, p=0.0009; MIX-EEG: t(21)=5.52, p
Discussion
We show that decoding methods developed for EEG can be applied to cardiovascular signals; however, modelling success may depend on the specific stimuli or on the consistency of the physiological responses to them. The observed difference between the Baroque and Contemporary modelling results could stem from more variable responses to the Contemporary music.
Conclusion
Overall, this work proposes an objective way to analyze continuous music listening experiences. Preliminary results suggest that ECG outperforms EEG for reconstructing the music amplitude envelope, particularly for the Baroque music stimuli. Additionally, ECG decoding may provide insight into how consistently, or how well, listeners track and internalize the music feature being modelled.
Original language | English |
---|---|
Title of host publication | e-Proceedings of the 27th International Conference on Music Perception and Cognition and the 7th Conference of the Asia-Pacific Society for the Cognitive Sciences of Music |
Editors | Minoru Tsuzaki, Makiko Sadakata, Shimpei Ikegami, Toshie Matsui, Masahiro Okano, Haruka Shoda |
Place of Publication | Tokyo, Japan |
Number of pages | 1 |
Volume | 17 |
Publication status | Published - 2023 |
Event | International Conference on Music Perception and Cognition, Nihon University, Tokyo, Japan. Duration: 24 Aug 2023 → 28 Aug 2023. Conference number: 17. https://jsmpc.org/ICMPC17/ |
Conference
Conference | International Conference on Music Perception and Cognition |
---|---|
Abbreviated title | ICMPC |
Country/Territory | Japan |
City | Tokyo |
Period | 24/08/2023 → 28/08/2023 |
Keywords
- music perception
- music cognition
- neuroscience
- cardiovascular science
- cognitive science