Gaze-based interaction in various environments

Published: 31 October 2008

Abstract

The analysis of cognitive processes during human-machine and human-human interaction requires various tracking technologies. Gaze is a particularly important cue for inferring a user's intentions and current mental state. To capture this data, a framework was developed that combines a highly accurate head-mounted gaze tracker with a low-latency head-tracking method. Integrating it into diverse experimental environments demands an easy-to-use calibration method for multiple working areas as well as the implementation of numerous interfaces. A calibration procedure was therefore integrated in which the user simply looks at known fixation points. First results of a brief user study conducted with the proposed framework are also presented.
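
The abstract mentions a calibration step in which the user simply fixates a set of known points. The paper itself provides no code; purely as an illustrative sketch (not the authors' implementation), one common way to realize such a fixation-point calibration is to fit a polynomial mapping from raw gaze measurements to working-area coordinates by least squares. The function names, the 2D raw-gaze representation, and the second-order polynomial form below are all assumptions.

import numpy as np

def fit_gaze_calibration(raw_gaze, targets):
    # raw_gaze: (N, 2) raw gaze measurements recorded while the user fixates
    # the calibration points; targets: (N, 2) known point coordinates on the
    # working area. Returns a (6, 2) coefficient matrix.
    x, y = raw_gaze[:, 0], raw_gaze[:, 1]
    # Design matrix with constant, linear, and second-order terms.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    # Least-squares fit for both output coordinates at once.
    coeffs, _, _, _ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs

def map_gaze(raw_point, coeffs):
    # Map a single raw gaze sample into working-area coordinates.
    x, y = raw_point
    return np.array([1.0, x, y, x * y, x**2, y**2]) @ coeffs

In such a sketch one would typically collect several samples per fixation point (for example, a 3x3 grid per working area), average them, fit one mapping per working area, and then apply map_gaze to every incoming gaze sample.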

Published In

VNBA '08: Proceedings of the 1st ACM workshop on Vision networks for behavior analysis
October 2008
116 pages
ISBN: 9781605583136
DOI: 10.1145/1461893
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. area-of-interest
  2. calibration
  3. gaze tracking
  4. gaze-in-space
  5. stereo camera system

Qualifiers

  • Research-article

Conference

MM08: ACM Multimedia Conference 2008
October 31, 2008
Vancouver, British Columbia, Canada
