
Adaptive Interface for Mapping Body Movements to Sounds

  • Conference paper
  • In: Computational Intelligence in Music, Sound, Art and Design (EvoMUSART 2018)

Abstract

Contemporary digital musical instruments offer an abundance of means to generate sound. Although they surpass traditional instruments in producing a unique audio-visual act, there is still an unmet need for digital instruments that let performers generate sounds through movement in an intuitive manner. A key factor for an authentic digital music act is low latency between movements (user commands) and the corresponding sounds. Here we present such a low-latency interface that maps the user's kinematic actions onto sound samples. The interface relies on wireless sensor nodes equipped with inertial measurement units and on a real-time algorithm dedicated to the early detection and classification of a variety of movements/gestures performed by a user. The core algorithm is based on approximate inference of a hierarchical generative model with piecewise-linear dynamical components. Importantly, the model's structure is derived from a set of motion gestures. The performance of the Bayesian algorithm was compared against the k-nearest neighbors (k-NN) algorithm, which, in a pre-testing phase, had shown the highest classification accuracy among several existing state-of-the-art algorithms. The proposed probabilistic algorithm outperformed the k-NN algorithm on almost all evaluation metrics.
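The abstract only outlines the algorithm, so the following is a minimal, hypothetical Python sketch of the general idea: each gesture class is given its own linear dynamics (a single-regime simplification of the paper's piecewise-linear components), and a recursive Bayesian filter updates a posterior over gesture classes with every incoming IMU sample, which is what enables early, low-latency classification. The class `GestureFilter`, all parameter names, and the toy dimensions are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: per-gesture linear dynamics
#   x_t = A_g x_{t-1} + b_g + eps,  eps ~ N(0, Q_g)
# with a recursive Bayesian filter over classes g, so a sound can be
# triggered early, before the movement is complete.
import numpy as np
from scipy.stats import multivariate_normal

class GestureFilter:
    def __init__(self, A, b, Q):
        # A: (G, D, D), b: (G, D), Q: (G, D, D) -- one linear model per gesture
        self.A, self.b, self.Q = A, b, Q
        self.G = A.shape[0]
        self.log_post = np.full(self.G, -np.log(self.G))  # uniform prior

    def step(self, x_prev, x_t):
        # accumulate per-class log-likelihood of the new IMU sample
        for g in range(self.G):
            mean = self.A[g] @ x_prev + self.b[g]
            self.log_post[g] += multivariate_normal.logpdf(x_t, mean, self.Q[g])
        self.log_post -= np.logaddexp.reduce(self.log_post)  # normalize
        return np.exp(self.log_post)  # posterior over gesture classes

# usage: trigger a sound sample as soon as the posterior is confident
rng = np.random.default_rng(0)
G, D = 3, 6  # e.g. 3 gestures, 6-D accelerometer+gyroscope feature vector
A = np.stack([np.eye(D) * (0.8 + 0.1 * g) for g in range(G)])
b = rng.normal(size=(G, D)) * 0.01
Q = np.stack([np.eye(D) * 0.05 for _ in range(G)])

f = GestureFilter(A, b, Q)
x_prev = rng.normal(size=D)
for _ in range(20):  # simulate a stream of samples from gesture 1
    x_t = A[1] @ x_prev + b[1] + rng.multivariate_normal(np.zeros(D), Q[1])
    post = f.step(x_prev, x_t)
    x_prev = x_t
    if post.max() > 0.95:
        print("trigger sound sample for gesture", post.argmax())
        break
```

A real system along these lines would learn the dynamics from recorded gesture data and switch between several linear regimes within each gesture; the filter-then-trigger structure is the part that buys low latency.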



Author information

Correspondence to Dimitrije Marković.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Marković, D., Malešević, N. (2018). Adaptive Interface for Mapping Body Movements to Sounds. In: Liapis, A., Romero Cardalda, J., Ekárt, A. (eds) Computational Intelligence in Music, Sound, Art and Design. EvoMUSART 2018. Lecture Notes in Computer Science, vol 10783. Springer, Cham. https://doi.org/10.1007/978-3-319-77583-8_13


  • DOI: https://doi.org/10.1007/978-3-319-77583-8_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-77582-1

  • Online ISBN: 978-3-319-77583-8

  • eBook Packages: Computer Science, Computer Science (R0)
