
User Evaluation of Hand Gestures for Designing an Intelligent In-Vehicle Interface

  • Conference paper
  • In: Designing the Digital Transformation (DESRIST 2017)

Abstract

Driving a car is a high cognitive-load task requiring full attention behind the wheel. Intelligent navigation, transportation, and in-vehicle interfaces have made the driving experience safer and less demanding. However, existing interaction systems still fall short of the requirements of actual user experience. Hand gestures, as an interaction medium, are natural and less visually demanding while driving. This paper presents a user study with 79 participants to validate mid-air gestures for 18 major in-vehicle secondary tasks. We provide a detailed analysis of 900 mid-air gestures, investigating gesture preferences for in-vehicle tasks, their physical affordances, and driving errors. The outcomes demonstrate that employing mid-air gestures reduces driving errors by up to 50% compared to traditional air-conditioning controls. The results can inform the development of vision-based in-vehicle gestural interfaces.



Author information

Corresponding author: Hessam Jahani


Copyright information

© 2017 Springer International Publishing AG

About this paper

Jahani, H., Alyamani, H.J., Kavakli, M., Dey, A., Billinghurst, M. (2017). User Evaluation of Hand Gestures for Designing an Intelligent In-Vehicle Interface. In: Maedche, A., vom Brocke, J., Hevner, A. (eds) Designing the Digital Transformation. DESRIST 2017. Lecture Notes in Computer Science(), vol 10243. Springer, Cham. https://doi.org/10.1007/978-3-319-59144-5_7


  • DOI: https://doi.org/10.1007/978-3-319-59144-5_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59143-8

  • Online ISBN: 978-3-319-59144-5

  • eBook Packages: Computer Science, Computer Science (R0)
