Annotation and Analysis of Recorded Piano Performances on the Web

Lawrence Fyfe*, Daniel Bedoya, Elaine Chew

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Advancing knowledge and understanding of performed music is hampered by a lack of annotation data on music expressivity. To enable large-scale collection and exploration of performance annotations, the authors have created a workflow built around CosmoNote, a Web-based citizen science tool for annotating the musical structures created by the performer and experienced by the listener in expressive piano performances. In CosmoNote, annotators listen to recorded performances while viewing synchronized music visualization layers, including the audio waveform, the recorded notes, extracted audio features such as loudness and tempo, and score features such as harmonic tension. Annotators can zoom into a specific part of a performance to see its visualizations and hear the audio from just that part. Performed musical structures are annotated using boundaries of varying strengths, regions, comments, and note groups. By analyzing the annotations collected with CosmoNote, performance decisions can be modeled and analyzed to further the understanding of expressive choices in musical performances and to discover the vocabulary of performed musical structures.
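The abstract names four annotation types (boundaries with strengths, regions, comments, and note groups) attached to a recorded performance. The following is a minimal sketch of how such annotation data might be structured; all class and field names here are illustrative assumptions, not CosmoNote's actual schema or API.

```python
from dataclasses import dataclass, field
from typing import List

# NOTE: this is a hypothetical data model for illustration only;
# it does not reflect CosmoNote's real internal representation.

@dataclass
class Boundary:
    """A segmentation boundary at a point in the performance."""
    time_s: float   # position in the recording, in seconds
    strength: int   # e.g. 1 (weak) .. 4 (strong)

@dataclass
class Region:
    """A labeled time span, e.g. a phrase or section."""
    start_s: float
    end_s: float
    label: str = ""

@dataclass
class Comment:
    """A free-text remark anchored to a time point."""
    time_s: float
    text: str

@dataclass
class NoteGroup:
    """A set of recorded note identifiers grouped by the annotator."""
    note_ids: List[int] = field(default_factory=list)

@dataclass
class PerformanceAnnotations:
    """All annotations one annotator makes for one recorded performance."""
    boundaries: List[Boundary] = field(default_factory=list)
    regions: List[Region] = field(default_factory=list)
    comments: List[Comment] = field(default_factory=list)
    note_groups: List[NoteGroup] = field(default_factory=list)

    def boundaries_in(self, start_s: float, end_s: float) -> List[Boundary]:
        """Boundaries falling within a zoomed-in part of the performance."""
        return [b for b in self.boundaries if start_s <= b.time_s <= end_s]

ann = PerformanceAnnotations(
    boundaries=[Boundary(12.5, 3), Boundary(47.0, 1), Boundary(90.2, 4)],
)
print(len(ann.boundaries_in(0.0, 60.0)))  # boundaries in the first minute → 2
```

The `boundaries_in` helper mirrors the zooming behavior described in the abstract: restricting the view to one part of the performance restricts the annotations considered to that same time window.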
Original language: English
Pages (from-to): 962-978
Journal: JAES Journal of the Audio Engineering Society
Volume: 70
Issue number: 11
DOIs
Publication status: Published - Nov 2022

Keywords

  • music representation
  • music annotation
  • citizen science
  • musical prosody
  • performance science
