DOI: 10.1145/2875194.2875209

Feedback for Smooth Pursuit Gaze Tracking Based Control

Published: 25 February 2016

Abstract

Smart glasses, like Google Glass or Microsoft HoloLens, can be used as interfaces that expand human perceptual, cognitive, and actuation capabilities in many everyday situations. Conventional manual interaction techniques, however, are not convenient with smart glasses, whereas eye trackers can be built into the frames. This makes gaze tracking a natural input technology for smart glasses. Little is known, however, about interaction techniques for gaze-aware smart glasses. This paper adds to this knowledge by comparing feedback modalities (visual, auditory, haptic, none) in a continuous adjustment technique for smooth pursuit gaze tracking. Smooth-pursuit-based gaze tracking has been shown to be a flexible, calibration-free method for spontaneous interaction. Continuous adjustment, in turn, is needed in many everyday situations, such as adjusting the volume of a sound system or the intensity of a light source. We measured user performance and preference in a task where participants matched the shades of two gray rectangles. The results showed no statistically significant differences in performance between the modalities, but clear user preference and higher acceptability for haptic and audio feedback.
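The page carries no code, but a common way to realize pursuit-based continuous adjustment, in the spirit of the Pursuits-style techniques this work builds on, is to correlate recent gaze samples against the trajectories of moving on-screen targets: while the correlation with an "increase" or "decrease" target stays above a threshold, the controlled value (here, the gray shade from the study's matching task) is nudged in that direction. The following is a minimal, hypothetical Python sketch, not the authors' implementation; the class name and the window, threshold, and step parameters are illustrative assumptions.

```python
# Minimal sketch of correlation-based smooth pursuit control.
# Hypothetical code: class name, window size, threshold, and step are
# illustrative assumptions, not the authors' implementation.
from collections import deque

import numpy as np


def pearson(a, b):
    """Pearson correlation of two equally long 1-D sample sequences."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    if a.std() == 0.0 or b.std() == 0.0:
        return 0.0  # a constant sequence cannot be "pursued"
    return float(np.corrcoef(a, b)[0, 1])


class PursuitAdjuster:
    """Adjusts a scalar value while the gaze follows one of two moving
    targets ("increase"/"decrease"). Assumes targets move along both
    axes, e.g. on circular trajectories as in Orbits-style widgets."""

    def __init__(self, window=30, threshold=0.8, step=0.01):
        self.window = window        # sliding window length in samples
        self.threshold = threshold  # minimum correlation counted as pursuit
        self.step = step            # value change per matched sample
        self.gaze = {"x": deque(maxlen=window), "y": deque(maxlen=window)}
        self.trails = {name: {"x": deque(maxlen=window),
                              "y": deque(maxlen=window)}
                       for name in ("increase", "decrease")}
        self.value = 0.5            # controlled quantity, e.g. a gray shade

    def update(self, gaze_xy, target_xy):
        """Feed one gaze sample and the targets' current positions
        (dict name -> (x, y)); returns the adjusted value in [0, 1]."""
        self.gaze["x"].append(gaze_xy[0])
        self.gaze["y"].append(gaze_xy[1])
        for name, trail in self.trails.items():
            trail["x"].append(target_xy[name][0])
            trail["y"].append(target_xy[name][1])
        if len(self.gaze["x"]) < self.window:
            return self.value  # wait until the window is full
        for name, trail in self.trails.items():
            # Pursuit is detected only when BOTH axes correlate strongly.
            corr = min(pearson(self.gaze["x"], trail["x"]),
                       pearson(self.gaze["y"], trail["y"]))
            if corr > self.threshold:
                delta = self.step if name == "increase" else -self.step
                self.value = float(np.clip(self.value + delta, 0.0, 1.0))
        return self.value
```

In each frame the application would pass the latest gaze sample and target positions to update() and render the returned shade; in the study's terms, the span during which the correlation stays above the threshold is plausibly where the compared visual, auditory, or haptic feedback would be delivered to confirm that the adjustment is engaged.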

Published In

AH '16: Proceedings of the 7th Augmented Human International Conference 2016
February 2016
258 pages
ISBN: 9781450336802
DOI: 10.1145/2875194

In-Cooperation

  • University of Geneva

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gaze tracking
  2. interactive eye-wear
  3. smooth pursuit
  4. wearable computing

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

AH '16: Augmented Human International Conference 2016
February 25 - 27, 2016
Geneva, Switzerland

Acceptance Rates

AH '16 paper acceptance rate: 21 of 138 submissions, 15%
Overall acceptance rate: 121 of 306 submissions, 40%

