Research article
DOI: 10.1145/2401836.2401852

Sensing visual attention using an interactive bidirectional HMD

Published: 26 October 2012

Abstract

This paper presents a novel system for sensing attentional behavior in Augmented Reality (AR) environments by analyzing eye movements. The system is based on lightweight, head-mounted optical see-through glasses containing bidirectional microdisplays, which support image display and eye tracking on a single chip. The sensing and interaction application has been developed in the European project ARtSENSE in order to (1) detect museum visitors' attention to and interest in artworks as well as in the presented AR content, (2) present appropriate personalized information as augmented overlays based on the detected attention, and (3) allow museum visitors to interact with the system or the AR content by gaze. In this paper we present a novel algorithm for pupil estimation in low-resolution eye-tracking images and show first results on attention estimation by eye-movement analysis and on gaze-based interaction with the system.
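
The page reproduces only the abstract, not the pupil-estimation algorithm itself. Purely as an illustrative sketch of the kind of processing the abstract describes (segmenting the dark pupil in a low-resolution eye image and estimating its center), the snippet below combines Otsu thresholding with a largest-blob centroid step in Python/OpenCV. The function name, the test-image path, and the OpenCV 4 API usage are assumptions made for this example; it is not the authors' algorithm.

    import cv2

    def estimate_pupil_center(eye_gray):
        """Rough pupil-center estimate for a low-resolution grayscale eye image.

        Returns (cx, cy) in pixel coordinates, or None if no candidate is found.
        Illustrative sketch only; not the algorithm from the paper.
        """
        # Smooth to suppress sensor noise, which dominates at low resolutions.
        blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)

        # The pupil is typically the darkest large region; Otsu picks a
        # threshold separating it from iris and sclera without manual tuning.
        _, mask = cv2.threshold(blurred, 0, 255,
                                cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

        # Keep the largest dark blob as the pupil candidate (OpenCV >= 4 API).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        pupil = max(contours, key=cv2.contourArea)

        # The blob centroid approximates the pupil center.
        m = cv2.moments(pupil)
        if m["m00"] == 0:
            return None
        return m["m10"] / m["m00"], m["m01"] / m["m00"]

    if __name__ == "__main__":
        img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)  # hypothetical test image
        if img is not None:
            print(estimate_pupil_center(img))

A gaze-interaction layer of the kind described in the abstract would then map this center, via a per-user calibration, to a point of regard on the display and trigger actions on dwell; that mapping is not shown here.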




Information

Published In

Gaze-In '12: Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction
October 2012
88 pages
ISBN:9781450315166
DOI:10.1145/2401836
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 October 2012


Qualifiers

  • Research-article


Conference

ICMI '12: International Conference on Multimodal Interaction
October 26, 2012
Santa Monica, California

Acceptance Rates

Overall acceptance rate: 19 of 21 submissions (90%)


Article Metrics

  • Downloads (last 12 months): 15
  • Downloads (last 6 weeks): 1

Reflects downloads up to 20 Jan 2025


Cited By

  • (2023) Gaze-Based Human–Computer Interaction for Museums and Exhibitions: Technologies, Applications and Future Perspectives. Electronics, 12(14):3064. DOI: 10.3390/electronics12143064. Online publication date: 13-Jul-2023.
  • (2021) Embodied Interaction on Constrained Interfaces for Augmented Reality. Springer Handbook of Augmented Reality, pp. 239-271. DOI: 10.1007/978-3-030-67822-7_10. Online publication date: 16-Dec-2021.
  • (2020) UbiPoint. Proceedings of the 11th ACM Multimedia Systems Conference, pp. 190-201. DOI: 10.1145/3339825.3391870. Online publication date: 27-May-2020.
  • (2019) Design and development of a spatial mixed reality touring guide to the Egyptian museum. Multimedia Tools and Applications. DOI: 10.1007/s11042-019-08026-w. Online publication date: 6-Aug-2019.
  • (2018) Interaction Methods for Smart Glasses: A Survey. IEEE Access, 6:28712-28732. DOI: 10.1109/ACCESS.2018.2831081. Online publication date: 2018.
  • (2016) Designing a Willing-to-Use-in-Public Hand Gestural Interaction Technique for Smart Glasses. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 4203-4215. DOI: 10.1145/2858036.2858436. Online publication date: 7-May-2016.
  • (2013) Real-time 3D gaze analysis in mobile applications. Proceedings of the 2013 Conference on Eye Tracking South Africa, pp. 75-78. DOI: 10.1145/2509315.2509333. Online publication date: 29-Aug-2013.
  • (2013) Robust hand tracking in realtime using a single head-mounted RGB camera. Proceedings of the 15th International Conference on Human-Computer Interaction: Interaction Modalities and Techniques, Volume Part IV, pp. 252-261. DOI: 10.1007/978-3-642-39330-3_27. Online publication date: 21-Jul-2013.
