research-article

A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry

Published: 13 May 2022

Abstract

Robust and highly integrated eye-tracking is a key technology for improving the resolution of near-eye display technologies for augmented reality (AR) glasses, such as focus-free retinal projection, as it enables display enhancements like foveated rendering. Furthermore, eye-tracking sensors enable novel ways to interact with the user interfaces of AR glasses, thus improving the user experience compared to other wearables. In this work, we present a novel approach to track the user's eye by scanned laser feedback interferometry sensing. The main advantages over modern video-oculography (VOG) systems are the seamless integration of the eye-tracking sensor and its excellent robustness to ambient light at significantly lower power consumption. We further present an algorithm to track the bright pupil signal captured by our sensor with significantly lower computational effort than VOG systems. We evaluate a prototype to prove the high robustness against ambient light and achieve a gaze accuracy of 1.62°, which is comparable to other state-of-the-art scanned laser eye-tracking sensors. The outstanding robustness and high integrability of the proposed sensor will pave the way for everyday eye-tracking in consumer AR glasses.
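The abstract's sensing principle, laser feedback interferometry (self-mixing), encodes the velocity of a moving target in a Doppler beat frequency f_D = 2v/λ superimposed on the laser's output power. The sketch below is purely illustrative and not taken from the paper: the wavelength, sampling rate, and processing chain are assumptions, and a real sensor would use the scanned-beam geometry and the authors' bright-pupil tracking algorithm rather than this toy FFT peak search.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not from the paper): a self-mixing
# signal is approximated as a noisy sinusoid at the Doppler beat frequency
# f_D = 2 * v / lambda, and the velocity is recovered via an FFT peak search,
# a typical first step in laser feedback interferometry processing.

WAVELENGTH = 850e-9   # m, assumed near-infrared VCSEL wavelength
FS = 1e6              # Hz, assumed sampling rate
N = 4096              # samples per measurement window

def doppler_frequency(velocity):
    """Beat frequency produced by a target moving at `velocity` (m/s)."""
    return 2.0 * velocity / WAVELENGTH

def simulate_smi_signal(velocity, snr_db=20.0, seed=0):
    """Toy self-mixing signal: a noisy sinusoid at the Doppler frequency."""
    rng = np.random.default_rng(seed)
    t = np.arange(N) / FS
    signal = np.cos(2.0 * np.pi * doppler_frequency(velocity) * t)
    noise = rng.normal(scale=10 ** (-snr_db / 20.0), size=N)
    return signal + noise

def estimate_velocity(samples):
    """Recover velocity from the dominant FFT peak of the beat signal."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(N)))
    freqs = np.fft.rfftfreq(N, d=1.0 / FS)
    f_peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return f_peak * WAVELENGTH / 2.0

if __name__ == "__main__":
    v_true = 0.05  # m/s, a hypothetical slow surface velocity
    v_est = estimate_velocity(simulate_smi_signal(v_true))
    print(f"true {v_true:.4f} m/s, estimated {v_est:.4f} m/s")
```

With the assumed window length, the frequency resolution is FS/N ≈ 244 Hz, so the velocity estimate lands within a fraction of a millimetre per second of the true value; the ambient-light robustness claimed in the abstract follows from the fact that only light coherent with the laser cavity produces this beat signal.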


Supplemental Material

S6_A_Highly_Integrated.mp4 (mp4, 114.2 MB)



Published in

Proceedings of the ACM on Human-Computer Interaction, Volume 6, Issue ETRA
May 2022, 198 pages
EISSN: 2573-0142
DOI: 10.1145/3537904

Copyright © 2022 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States
