
Natural, intuitive finger based input as substitution for traditional vehicle control

Published: 30 November 2011

Abstract

Both the amount and the dynamics of content to be displayed in a car are increasing steadily, forcing manufacturers to switch to customizable screens integrated into the dashboard and center console instead of dozens to hundreds of individual controls. In addition, new requirements such as Internet access in the car or web services accessible while driving invalidate rudimentary display formats. Traditional forms of interaction such as buttons or knobs are unsuited to responding to dynamic content shown on digital screens, demanding new mechanisms for distraction-free yet effective user (driver) input. We address this problem by introducing a novel sensing device that allows for natural, contactless, and eyes-free operation by relating finger movements in the area of the gearshift to screen coordinates. To assess the quality of this interface, two hypotheses were formulated: (i) that such a device allows for natural, intuitive mouse-pointer control in a manner similar to traditional forms of input, and (ii) that the interface is insusceptible to varying workload conditions of the driver. Results from the experiments revealed that, with respect to the first hypothesis, proximity sensing in a two-dimensional plane is a viable approach to directly controlling a mouse cursor on a screen integrated into the dashboard. A generally valid conclusion on the assumption that the index of performance of the interface does not change with varying workload (hypothesis ii) cannot be drawn. To simulate different workload conditions, a dual-task signal-response setting was used.
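
The evaluation relies on Fitts' law: the index of performance (throughput) relates the difficulty of a pointing task to the time needed to complete it. The following Python sketch illustrates the general idea under stated assumptions; it is not the authors' implementation, and names such as sensor_to_screen, the calibration bounds, and the 800x480 screen size are illustrative only. It maps a raw 2D proximity reading taken above the gearshift to screen coordinates and computes throughput from target distance, target width, and movement time.

# Illustrative sketch only (assumed names and values, not the authors' implementation):
# map a 2D proximity-sensor reading to dashboard-screen coordinates and compute the
# Fitts' law index of performance (throughput) used to compare pointing performance.
import math

def sensor_to_screen(x_raw, y_raw,
                     sensor_min=(0.0, 0.0), sensor_max=(1.0, 1.0),
                     screen_size=(800, 480)):
    """Linearly map a calibrated 2D sensing plane onto screen pixels (assumed calibration)."""
    sx = (x_raw - sensor_min[0]) / (sensor_max[0] - sensor_min[0])
    sy = (y_raw - sensor_min[1]) / (sensor_max[1] - sensor_min[1])
    # Clamp so out-of-range readings keep the cursor on the screen.
    px = min(max(sx, 0.0), 1.0) * (screen_size[0] - 1)
    py = min(max(sy, 0.0), 1.0) * (screen_size[1] - 1)
    return px, py

def fitts_throughput(distance, width, movement_time):
    """Throughput IP = ID / MT in bits/s, with ID = log2(D/W + 1) (Shannon formulation)."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time

# Example: a 300 px cursor movement to a 40 px wide target completed in 1.2 s.
print(round(fitts_throughput(300, 40, 1.2), 2), "bits/s")  # ~2.57 bits/s

Comparing mean throughput between the single-task condition and the dual-task (added workload) condition is one direct way to examine hypothesis (ii).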

      Published In

      AutomotiveUI '11: Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
      November 2011
      190 pages
      ISBN: 9781450312318
      DOI: 10.1145/2381416

      Sponsors

      • ICT&S: ICT&S Center, University of Salzburg

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 30 November 2011


      Author Tags

      1. capacitive proximity sensing
      2. cognitive workload
      3. driver-vehicle feedback
      4. natural interaction
      5. user-configurable screens

      Qualifiers

      • Research-article

      Conference

      AutomotiveUI '11
      Sponsor:
      • ICT&S

      Acceptance Rates

      Overall Acceptance Rate 248 of 566 submissions, 44%


      Cited By

      • (2024) The Content-Specific Display: Between Medium and Metaphor. Designs, 8(6), 109. DOI: 10.3390/designs8060109. Online publication date: 25-Oct-2024.
      • (2022) A Design Space for Human Sensor and Actuator Focused In-Vehicle Interaction Based on a Systematic Literature Review. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(2), 1-51. DOI: 10.1145/3534617. Online publication date: 7-Jul-2022.
      • (2020) From Controls on the Steering Wheel to Controls on the Finger. Companion Publication of the 2020 ACM Designing Interactive Systems Conference, 299-304. DOI: 10.1145/3393914.3395851. Online publication date: 6-Jul-2020.
      • (2020) Evaluation of different interface designs for human-machine interaction in vehicles. Multimedia Tools and Applications. DOI: 10.1007/s11042-020-08920-8. Online publication date: 6-May-2020.
      • (2019) Automotive multimodal human-machine interface. The Handbook of Multimodal-Multisensor Interfaces, 477-522. DOI: 10.1145/3233795.3233809. Online publication date: 1-Jul-2019.
      • (2019) Social car: The research of interaction design on the driver's communication system. Concurrency and Computation: Practice and Experience, 32(6). DOI: 10.1002/cpe.5139. Online publication date: 17-Jan-2019.
      • (2017) VoGe: A Voice and Gesture System for Interacting with Autonomous Cars. SAE Technical Paper Series. DOI: 10.4271/2017-01-0068. Online publication date: 28-Mar-2017.
      • (2017) Eliminating Driving Distractions: Human-Computer Interaction with Built-In Applications. IEEE Vehicular Technology Magazine, 12(1), 20-29. DOI: 10.1109/MVT.2016.2625331. Online publication date: Mar-2017.
      • (2015) A leap for touch. Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 39-46. DOI: 10.1145/2799250.2799273. Online publication date: 1-Sep-2015.
      • (2014) Gesturing on the Steering Wheel. Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 1-8. DOI: 10.1145/2667317.2667414. Online publication date: 17-Sep-2014.
