DOI: 10.1145/2971485.2971490
research-article

Evaluation of HeadTurn: An Interaction Technique Using the Gaze and Head Turns

Published: 23 October 2016

Abstract

Smart glasses equipped with eye tracking technology could be utilized to develop natural interaction techniques, for example to conveniently interact with an electronic appliance in the environment from a distance. We describe a technique, HeadTurn, that allows a user to look at a device and then control it by turning the head to the left or right. We evaluated HeadTurn using an interface that linked head turning to increasing or decreasing a number shown on a display; the task was to adjust the number to a given value. We studied the optimal rate at which the number should change once the head turn angle exceeded a predefined threshold. We varied the rate of change of the number (217, 290, and 435 ms per change) and the feedback (visual only, haptic+visual). In the haptic condition, a 20 millisecond vibration was given through a vibrating eyeglass frame with each number change. Participants completed number selections faster with shorter intervals but also overshot the target more often. Seven out of 12 participants preferred the middle rate of change (i.e., 290 ms), and there were no statistically significant differences in task completion times. The optimal rate of change thus appears to be a compromise between faster selection and overshooting. Haptic feedback made the interaction slightly faster, but the difference was not significant. Overall, participants rated their experience with the technique as positive.
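The interaction the abstract describes reduces to a simple control loop: while the head turn angle stays past a threshold, a value steps up or down at a fixed interval, optionally with a short haptic pulse per step. The following Python sketch illustrates that loop under stated assumptions; the callback names and the 10-degree threshold are hypothetical, and only the 290 ms interval and 20 ms vibration come from the study conditions. It is not the authors' implementation.

```python
import time

# Illustrative sketch of the HeadTurn control loop (not the authors' code).
TURN_THRESHOLD_DEG = 10.0   # assumed head-turn angle that activates value changes
CHANGE_INTERVAL_S = 0.290   # 290 ms per change: the rate most participants preferred
VIBRATION_S = 0.020         # 20 ms haptic pulse per change (haptic+visual condition)

def head_turn_loop(read_head_angle, render_value, vibrate=None, value=0):
    """Step `value` up or down while the head stays turned past the threshold.

    read_head_angle(): current head yaw in degrees (assumed: positive = right).
    render_value(v):   updates the number shown on the display.
    vibrate(seconds):  optional; drives the eyeglass-frame actuator.
    Runs until interrupted.
    """
    last_change = time.monotonic()
    while True:
        angle = read_head_angle()
        now = time.monotonic()
        if abs(angle) > TURN_THRESHOLD_DEG and now - last_change >= CHANGE_INTERVAL_S:
            value += 1 if angle > 0 else -1   # assumed mapping: right increases, left decreases
            render_value(value)
            if vibrate is not None:
                vibrate(VIBRATION_S)          # one short pulse per number change
            last_change = now
        time.sleep(0.01)                      # poll the head tracker at ~100 Hz
```

Varying CHANGE_INTERVAL_S reproduces the study's three rate conditions (0.217, 0.290, 0.435 s), and passing or omitting the vibrate callback switches between the haptic+visual and visual-only feedback conditions.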





    Published In

    NordiCHI '16: Proceedings of the 9th Nordic Conference on Human-Computer Interaction
    October 2016
    1045 pages
    ISBN:9781450347631
    DOI:10.1145/2971485
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 October 2016


    Author Tags

    1. Gaze tracking
    2. gaze-based interaction
    3. haptic feedback
    4. head moves

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    NordiCHI '16

    Acceptance Rates

NordiCHI '16 Paper Acceptance Rate: 58 of 231 submissions, 25%
Overall Acceptance Rate: 379 of 1,572 submissions, 24%

    Article Metrics

    • Downloads (Last 12 months)46
    • Downloads (Last 6 weeks)6
    Reflects downloads up to 27 Feb 2025

Cited By
    • (2024) Hands-free Selection in Scroll Lists for AR Devices. Proceedings of Mensch und Computer 2024, 323-330. DOI: 10.1145/3670653.3670671. Online publication date: 1-Sep-2024.
    • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8, ETRA, 1-20. DOI: 10.1145/3655601. Online publication date: 28-May-2024.
    • (2024) RimSense. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 4, 1-24. DOI: 10.1145/3631456. Online publication date: 12-Jan-2024.
    • (2024) Designing Upper-Body Gesture Interaction with and for People with Spinal Muscular Atrophy in VR. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642884. Online publication date: 11-May-2024.
    • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 331-341. DOI: 10.1109/VR58804.2024.00055. Online publication date: 16-Mar-2024.
    • (2022) Augmenting Ear Accessories for Facial Gesture Input Using Infrared Distance Sensor Array. Electronics 11, 9, 1480. DOI: 10.3390/electronics11091480. Online publication date: 5-May-2022.
    • (2022) ClenchClick. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, 3, 1-26. DOI: 10.1145/3550327. Online publication date: 7-Sep-2022.
    • (2022) "I Don't Want People to Look At Me Differently". Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3491102.3517552. Online publication date: 29-Apr-2022.
    • (2022) Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3491102.3501964. Online publication date: 29-Apr-2022.
    • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105. Online publication date: Mar-2022.
