DOI: 10.1145/2993148.2993153 · ICMI '16 Conference Proceedings · Research article

Comparison of three implementations of HeadTurn: a multimodal interaction technique with gaze and head turns

Published: 31 October 2016

Abstract

The best way to construct user interfaces for smart glasses is not yet known. We investigated the use of eye tracking in this context in two experiments. Eye and head movements were combined so that one can select an object to interact with by looking at it, and then change a setting of that object by turning the head horizontally. We compared three different techniques for mapping head turns to scrolling a list of numbers, with and without haptic feedback. We found that haptic feedback had no noticeable effect on objective metrics, but it sometimes improved the user experience. Direct mapping of head orientation to list position is fast and easy to understand, but the signal-to-noise ratio of eye and head position measurement limits the usable range. The technique with a constant rate of change after crossing a head-angle threshold was simple and functional, but slow when the rate of change is adjusted to suit beginners. Finally, making the rate of change depend on the head angle tends to lead to fairly long task completion times, although in theory it offers a good combination of speed and accuracy.
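As a rough illustration only (not the authors' implementation), the three head-turn-to-scroll mappings described in the abstract can be sketched as follows. All constants (dead-zone threshold, gains, rates) are hypothetical placeholders, not values from the paper:

```python
import math

# Hypothetical parameters, chosen for illustration only.
DEADZONE_DEG = 5.0      # assumed head-angle threshold before scrolling starts
DIRECT_GAIN = 0.5       # assumed list items per degree (direct mapping)
CONSTANT_RATE = 2.0     # assumed items per second once past the threshold
ANGLE_RATE_GAIN = 0.2   # assumed items/s per degree beyond the threshold

def direct_mapping(head_angle_deg: float) -> float:
    """Technique 1: head orientation maps directly to a list offset."""
    return DIRECT_GAIN * head_angle_deg

def constant_rate(head_angle_deg: float, dt: float) -> float:
    """Technique 2: scroll at a fixed rate once the head crosses the threshold.

    Returns the list offset accumulated during a frame of duration dt.
    """
    if abs(head_angle_deg) < DEADZONE_DEG:
        return 0.0
    return math.copysign(CONSTANT_RATE * dt, head_angle_deg)

def angle_dependent_rate(head_angle_deg: float, dt: float) -> float:
    """Technique 3: scroll rate grows with head angle beyond the threshold."""
    excess = abs(head_angle_deg) - DEADZONE_DEG
    if excess <= 0:
        return 0.0
    return math.copysign(ANGLE_RATE_GAIN * excess * dt, head_angle_deg)
```

The sketch mirrors the trade-offs reported above: the direct mapping's range is bounded by how far the head can comfortably turn, while the two rate-based techniques trade speed for control via their rate parameters.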

References

[1]
Deepak Akkil, Jari Kangas, Jussi Rantala, Poika Isokoski, Oleg Špakov, and Roope Raisamo. 2015. Glance Awareness and Gaze Interaction in Smartwatches. In Proc. of CHI EA ’15. ACM, 1271–1276.
[2]
Douglas A. Craig and Hung T. Nguyen. 2005. Wireless Real-Time Head Movement System Using a Personal Digital Assistant (PDA) for Control of a Power Wheelchair. In Proc. of EMBS ’05. IEEE, 772–775.
[3]
Andrew Crossan, Mark McGill, Stephen Brewster, and Roderick Murray-Smith. 2009. Head Tilting for Interaction in Mobile Contexts. In Proc. of MobileHCI ’09. ACM, Article 6, 10 pages.
[4]
Heiko Drewes, Alexander De Luca, and Albrecht Schmidt. 2007. Eye-gaze Interaction for Mobile Phones. In Proc. of Mobility ’07. ACM, 364–371.
[5]
Zhangfang Hu, Lin Li, Yuan Luo, Yi Zhang, and Xing Wei. 2010. A novel intelligent wheelchair control approach based on head gesture recognition. In Proc. of ICCASM ’10, Vol. 6. IEEE, 159–163.
[6]
Dan Witzner Hansen, David J. C. MacKay, John Paulin Hansen, and Mads Nielsen. 2004. Eye Tracking off the Shelf. In Proc. of ETRA ’04. ACM, 58–58.
[7]
Robert J. K. Jacob. 1991. The Use of Eye Movements in Human-computer Interaction Techniques: What You Look at is What You Get. ACM Trans. Inf. Syst. 9, 2 (1991), 152–169.
[8]
Shahram Jalaliniya, Diako Mardanbegi, and Thomas Pederson. 2015. MAGIC Pointing for Eyewear Computers. In Proc. of ISWC ’15. ACM, 155–158.
[9]
Jari Kangas, Deepak Akkil, Jussi Rantala, Poika Isokoski, Päivi Majaranta, and Roope Raisamo. 2014. Gaze Gestures and Haptic Feedback in Mobile Devices. In Proc. CHI ’14. ACM, 435–438.
[10]
Viktoria A. Kettner and Jeremy I.M. Carpendale. 2013. Developing gestures for no and yes: Head shaking and nodding in infancy. Gesture 13, 2 (2013), 193–209.
[11]
Manu Kumar, Andreas Paepcke, and Terry Winograd. 2007. EyePoint: Practical Pointing and Selection Using Gaze and Keyboard. In Proc. of CHI ’07. ACM, 421–430.
[12]
Edmund LoPresti, David M. Brienza, Jennifer Angelo, Lars Gilbertson, and Jonathan Sakai. 2000. Neck Range of Motion and Use of Computer Head Controls. In Proc. of Assets ’00. ACM, 121–128.
[13]
Kristian Lukander, Sharman Jagadeesan, Huageng Chi, and Kiti Müller. 2013. OMG!: A New Robust, Wearable and Affordable Open Source Mobile Gaze Tracker. In Proc. of MobileHCI ’13. ACM, 408–411.
[14]
Päivi Majaranta, Ulla-Kaija Ahola, and Oleg Špakov. 2009. Fast Gaze Typing with an Adjustable Dwell Time. In Proc. of CHI ’09. ACM, 357–360.
[15]
Päivi Majaranta and Kari-Jouko Räihä. 2002. Twenty Years of Eye Typing: Systems and Design Issues. In Proc. of ETRA ’02. ACM, 15–22.
[16]
Diako Mardanbegi, Dan Witzner Hansen, and Thomas Pederson. 2012. Eye-based Head Gestures. In Proc. of ETRA ’12. ACM, 139–146.
[17]
Takashi Nagamatsu, Michiya Yamamoto, and Hiroshi Sato. 2010. MobiGaze: Development of a Gaze Interface for Handheld Mobile Devices. In Proc. of CHI EA ’10. ACM, 3349–3354.
[18]
Tomi Nukarinen, Jari Kangas, Oleg Špakov, Poika Isokoski, Deepak Akkil, Jussi Rantala, and Roope Raisamo. 2016. Evaluation of HeadTurn - An Interaction Technique Using the Gaze and Head Turns. Manuscript accepted for publication in NordiCHI 2016.
[20]
Jeff Pelz, Mary Hayhoe, and Russ Loeber. 2001. The coordination of eye, head, and hand movements in a natural task. Experimental Brain Research 139, 3 (2001), 266–277.
[21]
David Rozado, T. Moreno, Javier San Agustin, Francisco B. Rodriguez, and Pablo Varona. 2015. Controlling a Smartphone Using Gaze Gestures as the Input Mechanism. Human–Computer Interaction 30, 1 (2015), 34–63.
[22]
Linda E. Sibert and Robert J. K. Jacob. 2000. Evaluation of Eye Gaze Interaction. In Proc. of CHI ’00. ACM, 281–288.
[23]
Sophie Stellmach and Raimund Dachselt. 2012. Look & Touch: Gaze-supported Target Acquisition. In Proc. of CHI ’12. ACM, 2981–2990.
[24]
Sophie Stellmach and Raimund Dachselt. 2013. Still Looking: Investigating Seamless Gaze-supported Selection, Positioning, and Manipulation of Distant Targets. In Proc. of CHI ’13. ACM, 285–294.
[25]
Oleg Špakov. 2012. Comparison of Eye Movement Filters Used in HCI. In Proc. of ETRA ’12. ACM, 281–284.
[26]
Oleg Špakov, Poika Isokoski, and Päivi Majaranta. 2014. Look and Lean: Accurate Head-assisted Eye Pointing. In Proc. of ETRA ’14. ACM, 35–42.
[27]
Oleg Špakov and Päivi Majaranta. 2012. Enhanced Gaze Interaction Using Simple Head Gestures. In Proc. of UbiComp ’12. ACM, 705–710.


        Published In

        ICMI '16: Proceedings of the 18th ACM International Conference on Multimodal Interaction
        October 2016
        605 pages
ISBN: 9781450345569
DOI: 10.1145/2993148


Publisher

Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. Gaze interaction
        2. haptic feedback
        3. head moves


        Acceptance Rates

        Overall Acceptance Rate 453 of 1,080 submissions, 42%


Cited By

• Ensuring a Robust Multimodal Conversational User Interface During Maintenance Work. In Proceedings of Mensch und Computer 2021, 79–91. DOI: 10.1145/3473856.3473871. 5 September 2021.
• Headbang: Using Head Gestures to Trigger Discrete Actions on Mobile Devices. In Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, 1–10. DOI: 10.1145/3379503.3403538. 5 October 2020.
• Interactive Auditory Mediated Reality. In Proceedings of the 2020 ACM Designing Interactive Systems Conference, 2035–2050. DOI: 10.1145/3357236.3395493. 3 July 2020.
• HeadReach: Using Head Tracking to Increase Reachability on Mobile Touch Devices. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–12. DOI: 10.1145/3313831.3376868. 21 April 2020.
• Visual Attention and Haptic Control: A Cross-Study. In 2019 IEEE Fifth International Conference on Multimedia Big Data (BigMM), 111–117. DOI: 10.1109/BigMM.2019.00-36. September 2019.
• Systematic Literature Review on User Logging in Virtual Reality. In Proceedings of the 22nd International Academic Mindtrek Conference, 110–117. DOI: 10.1145/3275116.3275123. 10 October 2018.
• Estimating Head Motion from Egocentric Vision. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, 342–346. DOI: 10.1145/3242969.3242982. 2 October 2018.