Hand Postures for Sonification Control

  • Conference paper
Gesture and Sign Language in Human-Computer Interaction (GW 2001)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2298)

Abstract

Sonification is a relatively new technique in human-computer interaction that addresses auditory perception. In contrast to speech interfaces, sonification uses non-verbal sounds to present information. The most common sonification technique is parameter mapping, where a sonic event is generated for each data point and its acoustic attributes are determined from the data values by a mapping function. For acoustic data exploration, this mapping must be adjusted or manipulated by the user. We propose the use of hand postures as a particularly natural and intuitive means of parameter manipulation for this data exploration task. As a demonstration prototype, we developed a hand posture recognition system for gestural control of sound. The presented implementation applies artificial neural networks to identify continuous hand postures from camera images and uses a real-time sound synthesis engine. In this paper, we present our system and first applications of gestural sound control. Techniques for applying gestures to control sonification are proposed, and sound examples are given.
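To make the parameter-mapping idea concrete, here is a minimal sketch in Python: each data point is mapped to the pitch and amplitude of a short sine tone, and the resulting events are rendered to a WAV file. The two-column toy data set, the mapping ranges (200-800 Hz for pitch, 0.1-0.9 for amplitude), and all function names are illustrative assumptions, not the authors' implementation; the system described in the paper instead drives a real-time sound synthesis engine from hand-posture parameters.

```python
# Illustrative sketch of parameter-mapping sonification (not the authors' code):
# one short sine tone per data point, with pitch and amplitude derived from the
# data values by a simple rescaling mapping. Uses only the standard library.
import math
import struct
import wave

SAMPLE_RATE = 44100


def rescale(x, a, b, new_a, new_b):
    """Linearly rescale x from the range [a, b] to [new_a, new_b]."""
    t = 0.0 if b == a else (x - a) / (b - a)
    return new_a + t * (new_b - new_a)


def map_to_event(point, lo, hi):
    """Hypothetical mapping: attribute 0 -> pitch (200-800 Hz),
    attribute 1 -> amplitude (0.1-0.9)."""
    freq = rescale(point[0], lo[0], hi[0], 200.0, 800.0)
    amp = rescale(point[1], lo[1], hi[1], 0.1, 0.9)
    return freq, amp


def render(points, duration=0.15):
    """Render one tone per data point and concatenate the events."""
    lo = [min(p[i] for p in points) for i in range(2)]
    hi = [max(p[i] for p in points) for i in range(2)]
    n = int(duration * SAMPLE_RATE)
    samples = []
    for p in points:
        freq, amp = map_to_event(p, lo, hi)
        for k in range(n):
            t = k / SAMPLE_RATE
            env = math.sin(math.pi * k / n)  # simple fade in/out envelope
            samples.append(amp * env * math.sin(2 * math.pi * freq * t))
    return samples


def write_wav(samples, path="sonification.wav"):
    """Write mono 16-bit PCM samples (floats in [-1, 1]) to a WAV file."""
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))


if __name__ == "__main__":
    data = [(0.1, 3.0), (0.4, 1.5), (0.9, 2.2), (0.7, 0.5)]  # toy data set
    write_wav(render(data))
```

In this offline sketch the mapping is fixed; the paper's point is that such mapping parameters can be supplied continuously by a hand-posture recognition front end, so that the user manipulates the sonification interactively while exploring the data.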



Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hermann, T., Nölker, C., Ritter, H. (2002). Hand Postures for Sonification Control. In: Wachsmuth, I., Sowa, T. (eds) Gesture and Sign Language in Human-Computer Interaction. GW 2001. Lecture Notes in Computer Science, vol 2298. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-47873-6_32

  • DOI: https://doi.org/10.1007/3-540-47873-6_32

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43678-2

  • Online ISBN: 978-3-540-47873-7
