
Multi-touch steering wheel for in-car tertiary applications using infrared sensors

Published: 07 March 2014

Abstract

This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications, such as buttons and touch displays, have several operating problems; for example, drivers have to consciously move their hands to them because they are fixed at specific positions. We therefore developed a steering wheel on which any touch position can serve as an operating position. The system recognizes hand gestures at any position on the steering wheel using 120 infrared (IR) sensors embedded in the wheel and arranged in an array surrounding the whole rim. A Support Vector Machine (SVM) is used to learn and recognize the different gestures from the data obtained from the sensors. The recognized gestures are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to examine whether the proposed system can recognize flick gestures at three positions on the wheel; on average, 92% of flick gestures were recognized.
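
The abstract does not spell out how the readings from the 120-sensor ring are turned into gesture labels, so the following is a minimal sketch of one plausible pipeline, assuming each gesture sample is a short window of IR frames flattened into a feature vector and classified with scikit-learn's SVC. The window length, kernel choice, feature layout, and the placeholder training data are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch: classifying steering-wheel gestures from a ring of
    # 120 IR sensors with an SVM (scikit-learn). Window length, kernel, and
    # training data are hypothetical; the paper does not specify them.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    GESTURES = ["flick", "click", "tap", "stroke", "twist"]  # classes from the abstract
    N_SENSORS = 120   # IR sensors arrayed around the wheel (from the paper)
    WINDOW = 10       # assumed number of sensor frames per gesture sample

    def to_feature_vector(frames: np.ndarray) -> np.ndarray:
        """Flatten a (WINDOW, N_SENSORS) block of IR readings into one feature vector."""
        assert frames.shape == (WINDOW, N_SENSORS)
        return frames.reshape(-1)

    # X_train: one flattened window per recorded gesture; y_train: its label.
    # Random placeholder data is used here only to make the sketch runnable.
    rng = np.random.default_rng(0)
    X_train = rng.random((50, WINDOW * N_SENSORS))
    y_train = rng.choice(GESTURES, size=50)

    # RBF-kernel SVM with feature scaling; the kernel and parameters are assumptions.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X_train, y_train)

    # At runtime, classify the most recent window of sensor frames.
    live_frames = rng.random((WINDOW, N_SENSORS))
    print(clf.predict([to_feature_vector(live_frames)])[0])

Trained on labeled recordings of the five gestures performed at many grip positions, such a classifier would be position-independent in the way the abstract describes; the placeholder data above exists only to keep the sketch self-contained.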





    Information

    Published In

    AH '14: Proceedings of the 5th Augmented Human International Conference
    March 2014
    249 pages
    ISBN: 9781450327619
    DOI: 10.1145/2582051

    Sponsors

    • MEET IN KOBE 21st Century

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 07 March 2014


    Author Tags

    1. automobile
    2. gesture recognition
    3. infrared sensor
    4. interaction design
    5. multi-touch
    6. torus interface

    Qualifiers

    • Research-article

    Conference

    AH '14
    Sponsor:
    • MEET IN KOBE 21st Century

    Acceptance Rates

    Overall Acceptance Rate 121 of 306 submissions, 40%


    Cited By

    • (2023) Effects of Urgency and Cognitive Load on Interaction in Highly Automated Vehicles. Proceedings of the ACM on Human-Computer Interaction 7(MHCI), 1-20. DOI: 10.1145/3604254. Online publication date: 13-Sep-2023.
    • (2023) Active signal-generating spacer-fabric-type continuous touch/pressure sensor. Smart Materials and Structures 32(3), 035029. DOI: 10.1088/1361-665X/acb471. Online publication date: 13-Feb-2023.
    • (2021) SwiVR-Car-Seat. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(4), 1-26. DOI: 10.1145/3494968. Online publication date: 30-Dec-2021.
    • (2021) Enhancing Interactions for In-Car Voice User Interface with Gestural Input on the Steering Wheel. 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 59-68. DOI: 10.1145/3409118.3475126. Online publication date: 9-Sep-2021.
    • (2020) waveSense. Proceedings of the Augmented Humans International Conference, 1-7. DOI: 10.1145/3384657.3384790. Online publication date: 16-Mar-2020.
    • (2020) A Cascaded Multimodal Natural User Interface to Reduce Driver Distraction. IEEE Access 8, 112969-112984. DOI: 10.1109/ACCESS.2020.3002775. Online publication date: 2020.
    • (2020) User interface for in-vehicle systems with on-wheel finger spreading gestures and head-up displays. Journal of Computational Design and Engineering. DOI: 10.1093/jcde/qwaa052. Online publication date: 19-Jun-2020.
    • (2019) Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study. Multimodal Technologies and Interaction 3(2), 33. DOI: 10.3390/mti3020033. Online publication date: 9-May-2019.
    • (2019) The intuitive grasp interface: design and evaluation of micro-gestures on the steering wheel for driving scenario. Universal Access in the Information Society. DOI: 10.1007/s10209-019-00647-0. Online publication date: 5-Apr-2019.
    • (2018) The Handlebar as an Input Field: Evaluating Finger Gestures Designed for Bicycle Riders. Advances in Human Aspects of Transportation, 648-659. DOI: 10.1007/978-3-319-93885-1_59. Online publication date: 28-Jun-2018.
