
User-defined gesture interaction for in-vehicle information systems

Published in: Multimedia Tools and Applications

Abstract

Gesture elicitation, a technique that emerged from the field of participatory design, has been applied extensively to emerging interaction and sensing technologies in recent years. However, traditional gesture elicitation studies often suffer from gesture disagreement and legacy bias, and may fail to generate optimal gestures for a target system. This paper reports a research project on user-defined gestures for interacting with in-vehicle information systems. The main contribution of our research is a three-stage participatory design method for deriving more reliable gestures than traditional gesture elicitation methods. Using this method, we generated a set of user-defined gestures for secondary tasks in an in-vehicle information system. Drawing on our findings, we developed a set of design guidelines for freehand gesture design and highlight the implications of this work for gesture elicitation across all gestural interfaces.





Acknowledgements

The authors would like to thank the anonymous reviewers for their insightful comments. This work was supported by the National Natural Science Foundation of China under Grants No. 61772564 and No. 61772468.

Author information


Corresponding author

Correspondence to Huiyue Wu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Wu, H., Wang, Y., Liu, J. et al. User-defined gesture interaction for in-vehicle information systems. Multimed Tools Appl 79, 263–288 (2020). https://doi.org/10.1007/s11042-019-08075-1

