DOI: 10.1145/1983302.1983303

Designing gaze-supported multimodal interactions for the exploration of large image collections

Published: 26 May 2011

Abstract

While eye tracking is becoming increasingly relevant as a promising input channel, applications that use gaze control in a natural way are still rather limited. Although several researchers have pointed out the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated, and time-consuming dwell-time activations limit this potential. To overcome this, we present a gaze-supported fisheye lens in combination with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. Following a user-centered design approach, we elicited how users would use these input combinations. Based on this feedback, we designed a prototype system for interacting with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study of our prototype provided further insights into how users experienced the proposed gaze-supported interaction techniques.
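To make the described interaction concrete, the following is a minimal Python sketch of the gaze-plus-touch pattern the abstract outlines: gaze continuously steers a fisheye lens over the image collection, while an explicit tap on the handheld device, rather than a dwell timeout, confirms a selection. All names here (Lens, fisheye_scale, selection_loop, and the gaze_tracker/touch_device APIs) are hypothetical placeholders for illustration, not the authors' implementation.

import math
from dataclasses import dataclass

@dataclass
class Lens:
    """A fisheye lens whose focus follows the current gaze point."""
    x: float = 0.0
    y: float = 0.0
    radius: float = 120.0        # lens extent in pixels
    magnification: float = 2.5   # magnification at the lens focus

def fisheye_scale(lens, px, py):
    """Scale applied to a point, with a simple linear falloff toward the
    lens border; points at or beyond the lens radius stay at scale 1.0."""
    d = math.hypot(px - lens.x, py - lens.y)
    if d >= lens.radius:
        return 1.0
    t = 1.0 - d / lens.radius
    return 1.0 + (lens.magnification - 1.0) * t

def selection_loop(gaze_tracker, touch_device, lens, pick_item):
    """One gaze-supported selection cycle.

    Assumed (hypothetical) device APIs:
      gaze_tracker.sample() -> (x, y), the current gaze point on the display
      touch_device.tapped() -> True when the user taps the mobile device
      pick_item(x, y)       -> the image under the lens focus, or None
    """
    while True:
        # Gaze only steers the lens; it never triggers an action by
        # itself, which avoids the Midas Touch problem.
        lens.x, lens.y = gaze_tracker.sample()

        # A deliberate tap replaces the time-consuming dwell activation.
        if touch_device.tapped():
            return pick_item(lens.x, lens.y)

The design choice this sketch illustrates is the decoupling of looking from acting: gaze supplies continuous pointing information, while the manual channel carries the discrete, intentional trigger, so no action can be issued by gaze alone.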




Published In

NGCA '11: Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
May 2011
71 pages
ISBN:9781450306805
DOI:10.1145/1983302

Sponsors

  • Blekinge Institute of Technology - School of Computing
  • Tobii

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye tracking
  2. gaze control
  3. multimodal interaction

Qualifiers

  • Research-article


Conference

NGCA '11
Sponsor:
  • School of Computing, BTH


Article Metrics

  • Downloads (Last 12 months)27
  • Downloads (Last 6 weeks)5
Reflects downloads up to 16 Feb 2025


Cited By

  • (2024) Attention-Aware Visualization: Tracking and Responding to User Perception Over Time. IEEE Transactions on Visualization and Computer Graphics 31(1):1017-1027. DOI: 10.1109/TVCG.2024.3456300. Online publication date: 9-Sep-2024.
  • (2024) Robust Object Selection in Spontaneous Gaze-Controlled Application Using Exponential Moving Average and Hidden Markov Model. IEEE Transactions on Human-Machine Systems 54(5):485-498. DOI: 10.1109/THMS.2024.3413781. Online publication date: Oct-2024.
  • (2024) From Feature to Gaze: A Generalizable Replacement of Linear Layer for Gaze Estimation. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 1409-1418. DOI: 10.1109/CVPR52733.2024.00140. Online publication date: 16-Jun-2024.
  • (2024) Vicsgaze: a gaze estimation method using self-supervised contrastive learning. Multimedia Systems 30(6). DOI: 10.1007/s00530-024-01458-x. Online publication date: 2-Nov-2024.
  • (2023) Pactolo Bar: An Approach to Mitigate the Midas Touch Problem in Non-Conventional Interaction. Sensors 23(4):2110. DOI: 10.3390/s23042110. Online publication date: 13-Feb-2023.
  • (2023) Exploring 3D Interaction with Gaze Guidance in Augmented Reality. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pages 22-32. DOI: 10.1109/VR55154.2023.00018. Online publication date: Mar-2023.
  • (2022) VRDoc: Gaze-based Interactions for VR Reading Experience. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pages 787-796. DOI: 10.1109/ISMAR55827.2022.00097. Online publication date: Oct-2022.
  • (2022) Object Selection Using LSTM Networks for Spontaneous Gaze-Based Interaction. 2022 14th International Conference on Information Technology and Electrical Engineering (ICITEE), pages 1-5. DOI: 10.1109/ICITEE56407.2022.9954104. Online publication date: 18-Oct-2022.
  • (2021) Gaze-Supported 3D Object Manipulation in Virtual Reality. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pages 1-13. DOI: 10.1145/3411764.3445343. Online publication date: 6-May-2021.
  • (2021) A Review on Opportunities and Challenges of Machine Learning and Deep Learning for Eye Movements Classification. 2021 IEEE International Biomedical Instrumentation and Technology Conference (IBITeC), pages 65-70. DOI: 10.1109/IBITeC53045.2021.9649434. Online publication date: 20-Oct-2021.
