User-centred process for the definition of free-hand gestures applied to controlling music playback


Abstract

Music is a fundamental part of most cultures. Controlling music playback is a common showcase for new interaction techniques and, in particular, for demonstrating and evaluating gesture recognition algorithms. Previous work, however, used gestures that were defined based on intuition, the developers’ preferences, and the capabilities of the respective algorithm. In this paper we propose a refined process for deriving gestures from continuous user feedback. With this process, every result and design decision is validated in the subsequent step, so comprehensive feedback can be collected from each of the conducted user studies. Following the process, we develop a set of free-hand gestures for controlling music playback. The situational context is analysed to shape the usage scenario and derive an initial set of necessary functions. In a subsequent user study, the set of functions is validated and proposals for gestures are collected from participants for each function. Two gesture sets, one of static and one of dynamic gestures, are derived and analysed in a comparative evaluation. The comparative evaluation shows the suitability of the identified gestures and allows further refinement. Our results indicate that validating each design decision, as the proposed process requires, improves the final outcome. By using the process to identify gestures for controlling music playback, we not only show that the refined process can be applied successfully, but also provide a consistent gesture set that can serve as a realistic benchmark for gesture recognition algorithms.
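The paper derives its gesture sets by aggregating gesture proposals elicited from participants. As a rough illustration of that aggregation step, the sketch below (our own, not from the paper) selects, for each playback function, the gesture proposed most often; all function names and gesture labels here are hypothetical.

```python
from collections import Counter

def derive_gesture_set(proposals):
    """For each function, pick the gesture proposed most often.

    proposals maps a playback function to the list of gesture labels
    elicited from participants for that function.
    """
    return {function: Counter(labels).most_common(1)[0][0]
            for function, labels in proposals.items()}

# Hypothetical elicitation data for three playback functions.
proposals = {
    "play/pause": ["open palm", "open palm", "fist", "open palm"],
    "next track": ["swipe right", "swipe right", "point right"],
    "volume up":  ["raise flat hand", "raise flat hand", "circle clockwise"],
}

print(derive_gesture_set(proposals))
# -> {'play/pause': 'open palm', 'next track': 'swipe right',
#     'volume up': 'raise flat hand'}
```

A majority vote is the simplest possible aggregation; ties and near-ties would, as in the process described above, be resolved through validation in the subsequent step.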





Author information

Correspondence to Andreas Löcken.



Cite this article

Löcken, A., Hesselmann, T., Pielot, M. et al. User-centred process for the definition of free-hand gestures applied to controlling music playback. Multimedia Systems 18, 15–31 (2012). https://doi.org/10.1007/s00530-011-0240-2

