
Cueing Car Drivers with Ultrasound Skin Stimulation

Conference paper in HCI in Mobility, Transport, and Automotive Systems (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14049)


Abstract

The aim was to use ultrasound skin stimulation (UH) on the palm to convey the system state when interacting through mid-air hand gestures in the automotive context. Participants navigated a horizontal menu using touch, steering-wheel buttons, and hand gestures during simulated driving. The mid-air interaction and UH feedback designs were tested in two studies. The first study showed that participants were able to perceive the menu interaction state from the UH feedback, but the interaction felt inconvenient. After the gestures and the feedback were redesigned for the second study, participants rated the mid-air method on par with the touch-based method, though interaction with the steering-wheel buttons remained the most favored. In conclusion, UH feedback can be used to deliver the interactive system state to drivers, but this approach may add cognitive load and should be further tested in a moving vehicle.
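To make the idea of state-cued UH feedback concrete, the following is a minimal, hypothetical Python sketch of how menu interaction states might be mapped to palm-directed haptic cues. The state names, cue parameters, and the HapticEmitter stub are assumptions introduced here for illustration; they do not reproduce the authors' feedback design, and a real system would drive an ultrasound phased-array SDK rather than the stub shown.

    # Hypothetical sketch (not the authors' implementation): mapping in-vehicle menu
    # interaction states to mid-air ultrasound haptic (UH) cues on the driver's palm.
    # All names and parameter values below are illustrative assumptions.

    from dataclasses import dataclass
    from enum import Enum, auto


    class MenuState(Enum):
        HAND_DETECTED = auto()   # palm entered the interaction zone
        ITEM_FOCUSED = auto()    # a menu item is currently highlighted
        ITEM_SELECTED = auto()   # the focused item was confirmed
        MENU_EXITED = auto()     # hand left the interaction zone


    @dataclass
    class HapticCue:
        pattern: str        # e.g. a steady point, a pulse train, a sweep across the palm
        intensity: float    # normalized 0..1 output strength
        duration_ms: int    # how long the cue is rendered


    # Assumed state-to-cue mapping; the paper's actual design is described in the full text.
    CUES = {
        MenuState.HAND_DETECTED: HapticCue("steady_point", 0.4, 300),
        MenuState.ITEM_FOCUSED:  HapticCue("short_pulse", 0.6, 120),
        MenuState.ITEM_SELECTED: HapticCue("double_pulse", 0.9, 250),
        MenuState.MENU_EXITED:   HapticCue("fading_point", 0.3, 400),
    }


    class HapticEmitter:
        """Stub standing in for an ultrasound phased-array device driver."""

        def render(self, cue: HapticCue) -> None:
            # A real emitter would focus ultrasound at the tracked palm position here.
            print(f"render {cue.pattern}: intensity={cue.intensity}, {cue.duration_ms} ms")


    def on_state_change(state: MenuState, emitter: HapticEmitter) -> None:
        """Fire the haptic cue associated with a menu interaction state."""
        emitter.render(CUES[state])


    if __name__ == "__main__":
        emitter = HapticEmitter()
        for state in (MenuState.HAND_DETECTED, MenuState.ITEM_FOCUSED, MenuState.ITEM_SELECTED):
            on_state_change(state, emitter)

The design choice the sketch highlights is that each discrete interaction state gets its own distinguishable cue, so the driver can track the menu state by touch alone without glancing at a display.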



Acknowledgements

This research was carried out as part of the Adaptive Multimodal In-Vehicle Interaction project (AMICI), which was funded by Business Finland (grant 1316/31/2021).

Author information

Correspondence to Oleg Spakov.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Spakov, O. et al. (2023). Cueing Car Drivers with Ultrasound Skin Stimulation. In: Krömker, H. (eds) HCI in Mobility, Transport, and Automotive Systems. HCII 2023. Lecture Notes in Computer Science, vol 14049. Springer, Cham. https://doi.org/10.1007/978-3-031-35908-8_16


  • DOI: https://doi.org/10.1007/978-3-031-35908-8_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35907-1

  • Online ISBN: 978-3-031-35908-8

  • eBook Packages: Computer Science, Computer Science (R0)
