Abstract
This paper examines whether hand movements account for expression in piano performance. A pianist played a chord 100 times for each of 12 performance expressions, formed by combining three articulations (tenuto, heavy staccato, and light staccato) with four dynamic levels. The coordinates of landmarks on her right fingers, wrist, elbow, and shoulder, estimated by MediaPipe Pose and Hands as she played the chord (12 × 100 times), were used to train and test a machine learning model. The model reached a test accuracy of 0.99, with per-expression F1-scores of 0.94–1.00, suggesting a relationship between performance expressions and hand movements. Moreover, when the player unintentionally produced a different type of expression, her landmark coordinates were close to those recorded when she had deliberately played that type of expression.
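As a rough illustration of the kind of pipeline the abstract describes, the sketch below collects right-hand and right-arm landmark coordinates with MediaPipe Hands and Pose and trains a placeholder classifier over the 12 expression labels. The file names, the `clips_and_labels` list, the per-clip feature averaging, and the random-forest model are all assumptions for illustration; the paper's actual feature layout and learning model are not specified in this abstract.

```python
# Illustrative sketch only; NOT the authors' implementation.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

mp_hands = mp.solutions.hands
mp_pose = mp.solutions.pose

# Hypothetical dataset layout: one short video clip per recorded chord,
# labelled 0..11 (3 articulations x 4 dynamic levels).
clips_and_labels = [("chord_0001.mp4", 0)]  # placeholder for the 12 x 100 recordings


def frame_features(bgr_frame, hands, pose):
    """Flat (x, y, z) vector of the 21 hand landmarks plus the right
    shoulder, elbow, and wrist for one frame, or None if detection fails."""
    rgb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB)
    hand_res = hands.process(rgb)
    pose_res = pose.process(rgb)
    if not hand_res.multi_hand_landmarks or not pose_res.pose_landmarks:
        return None
    coords = []
    for lm in hand_res.multi_hand_landmarks[0].landmark:  # fingers + wrist
        coords.extend([lm.x, lm.y, lm.z])
    for idx in (12, 14, 16):  # right shoulder, right elbow, right wrist
        lm = pose_res.pose_landmarks.landmark[idx]
        coords.extend([lm.x, lm.y, lm.z])
    return np.asarray(coords, dtype=np.float32)


def clip_features(video_path, hands, pose):
    """Average the per-frame vectors over one chord clip (a simplification)."""
    cap = cv2.VideoCapture(video_path)
    feats = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        f = frame_features(frame, hands, pose)
        if f is not None:
            feats.append(f)
    cap.release()
    return np.mean(feats, axis=0) if feats else None


X, y = [], []
with mp_hands.Hands(max_num_hands=1) as hands, mp_pose.Pose() as pose:
    for path, label in clips_and_labels:
        vec = clip_features(path, hands, pose)
        if vec is not None:
            X.append(vec)
            y.append(label)

# Placeholder model: the abstract does not state which learner was used.
X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.2, stratify=y)
clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))  # per-class F1-scores
```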
Cite this paper
Oshima, C., Takatsu, T., Nakayama, K. (2024). Examining the Relationship Between Playing a Chord with Expressions and Hand Movements Using MediaPipe. In: Mori, H., Asahi, Y. (eds) Human Interface and the Management of Information. HCII 2024. Lecture Notes in Computer Science, vol 14691. Springer, Cham. https://doi.org/10.1007/978-3-031-60125-5_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-60124-8
Online ISBN: 978-3-031-60125-5