DOI: 10.1145/3123514.3123535

Recognition of Piano Pedalling Techniques Using Gesture Data

Published: 23 August 2017

Abstract

This paper presents a study of piano pedalling technique recognition on the sustain pedal, utilising gesture data collected with a novel measurement system. The recognition comprises two separate tasks: onset/offset detection and classification. The onset and offset times of each pedalling technique were computed using signal processing algorithms. Features were then extracted from every segment during which the pedal was pressed, and the segments were classified by pedalling technique using machine learning methods. We compared a Support Vector Machine (SVM) and a hidden Markov model (HMM) for classification. Recognition results can be represented by customised pedalling notations and visualised in a score following system.
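
As a rough illustration of the pipeline summarised above (and not the authors' implementation), the sketch below performs threshold-based onset/offset detection on a one-dimensional pedal-position signal, extracts a few toy features from each pressed segment, and trains a scikit-learn SVM on synthetic data. The 0.5 threshold, the feature set, the synthetic signals and the placeholder labels are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's actual method): threshold-based onset/offset
# detection on a 1-D pedal-position signal, simple per-segment features, and an
# SVM classifier via scikit-learn. Threshold, features and data are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def detect_segments(pedal_signal, threshold=0.5):
    """Return (onset, offset) sample indices where the pedal is pressed."""
    pressed = pedal_signal > threshold
    edges = np.diff(pressed.astype(int))
    onsets = np.where(edges == 1)[0] + 1
    offsets = np.where(edges == -1)[0] + 1
    if pressed[0]:
        onsets = np.insert(onsets, 0, 0)
    if pressed[-1]:
        offsets = np.append(offsets, len(pedal_signal))
    return list(zip(onsets, offsets))


def extract_features(pedal_signal, segment):
    """Toy per-segment features: duration, mean depth, max depth, mean |velocity|."""
    start, end = segment
    x = pedal_signal[start:end]
    velocity = np.abs(np.diff(x)).mean() if len(x) > 1 else 0.0
    return [end - start, x.mean(), x.max(), velocity]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic pedal signals standing in for the sensor data described in the paper.
    signals = [np.clip(np.convolve(rng.random(500) > 0.98, np.ones(60), "same"), 0, 1)
               + 0.05 * rng.standard_normal(500) for _ in range(20)]
    X, y = [], []
    for i, sig in enumerate(signals):
        for seg in detect_segments(sig):
            X.append(extract_features(sig, seg))
            y.append(i % 2)  # placeholder labels for two hypothetical pedalling techniques
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    print(clf.predict(X[:5]))
```

An HMM-based comparison could be run over the same per-segment features (for instance with the hmmlearn package), but is omitted here for brevity.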

Cited By

  • (2023) Visual Representations to Stimulate New Musicking Strategies in Live Coding. Organised Sound, 1-13. DOI: 10.1017/S1355771823000389. Online publication date: 22-Aug-2023.
  • (2021) Instrument Playing Technique Recognition: A Greek Music Use Case. Proceedings of the Worldwide Music Conference 2021, 124-136. DOI: 10.1007/978-3-030-74039-9_13. Online publication date: 13-Apr-2021.

Information

Published In

AM '17: Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences
August 2017
337 pages
ISBN:9781450353731
DOI:10.1145/3123514

In-Cooperation

  • Queen Mary, University of London

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 August 2017

Author Tags

  1. Piano pedalling techniques
  2. machine learning in musical performance
  3. musical gesture recognition

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology
  • Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption
  • European Commission H2020 research and innovation grant AudioCommons

Conference

AM '17: Audio Mostly 2017
August 23 - 26, 2017
London, United Kingdom

Acceptance Rates

AM '17 Paper Acceptance Rate: 54 of 77 submissions (70%)
Overall Acceptance Rate: 177 of 275 submissions (64%)
