DOI: 10.1145/3411763.3451621

A Novel Gaze Gesture Sensor for Smart Glasses Based on Laser Self-Mixing

Published: 08 May 2021

Abstract

The integration of gaze gesture sensors into next-generation smart glasses will improve usability and enable new interaction concepts. However, consumer smart glasses place additional requirements on gaze gesture sensors, such as low power consumption, high integration capability, and robustness to ambient illumination. We propose a novel gaze gesture sensor based on laser feedback interferometry (LFI), which measures both the rotational velocity of the eye and the sensor’s distance to the eye. This sensor delivers a unique and novel set of features at an outstanding sample rate, allowing a gaze gesture not only to be predicted but also anticipated. To take full advantage of the unique sensor features and the high sampling rate, we propose a novel gaze symbol classification algorithm based on single samples. With a mean F1-score of 93.44 %, our algorithm shows exceptional classification performance.




Published In

CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021
2965 pages
ISBN: 9781450380959
DOI: 10.1145/3411763

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gaze interaction
  2. laser feedback interferometry (LFI) sensor
  3. smart glasses interaction

Qualifiers

  • Poster
  • Research
  • Refereed limited

Conference

CHI '21

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%


Cited By

  • (2023) Static Laser Feedback Interferometry-Based Gaze Estimation for Wearable Glasses. IEEE Sensors Journal 23(7), 7558–7569. https://doi.org/10.1109/JSEN.2023.3250714. Online publication date: 1-Apr-2023.
  • (2022) U-HAR. Proceedings of the ACM on Human-Computer Interaction 6(ETRA), 1–19. https://doi.org/10.1145/3530884. Online publication date: 13-May-2022.
  • (2022) A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry. Proceedings of the ACM on Human-Computer Interaction 6(ETRA), 1–18. https://doi.org/10.1145/3530881. Online publication date: 13-May-2022.
  • (2021) A CNN-based Human Activity Recognition System Combining a Laser Feedback Interferometry Eye Movement Sensor and an IMU for Context-aware Smart Glasses. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(4), 1–24. https://doi.org/10.1145/3494998. Online publication date: 30-Dec-2021.
