Abstract
Robust and highly integrated eye-tracking is a key technology for improving the resolution of near-eye display technologies for augmented reality (AR) glasses, such as focus-free retinal projection, as it enables display enhancements like foveated rendering. Furthermore, eye-tracking sensors enable novel ways to interact with the user interfaces of AR glasses, thus improving the user experience compared to other wearables. In this work, we present a novel approach to tracking the user's eye by scanned laser feedback interferometry sensing. The main advantages over modern video-oculography (VOG) systems are the seamless integration of the eye-tracking sensor and its excellent robustness to ambient light at significantly lower power consumption. We further present an algorithm to track the bright-pupil signal captured by our sensor with significantly lower computational effort than VOG systems require. We evaluate a prototype to demonstrate the high robustness against ambient light and achieve a gaze accuracy of 1.62°, which is comparable to other state-of-the-art scanned laser eye-tracking sensors. The outstanding robustness and high integrability of the proposed sensor pave the way for everyday eye-tracking in consumer AR glasses.
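The abstract mentions tracking a bright-pupil signal along laser scanlines at low computational cost. As a minimal illustration of why such tracking can be cheap, the sketch below locates the pupil center on a single scanline as the midpoint of the contiguous above-threshold region. This is a hypothetical toy example under assumed signal behavior, not the paper's actual algorithm; the function name, threshold value, and synthetic data are all invented for illustration.

```python
import numpy as np

def pupil_center_1d(samples, threshold):
    """Estimate the pupil position along one scanline as the midpoint
    of the above-threshold ("bright pupil") samples.
    Hypothetical illustration only, not the sensor's real pipeline."""
    bright = np.flatnonzero(samples > threshold)
    if bright.size == 0:
        return None  # no bright-pupil response on this scanline
    return 0.5 * (bright[0] + bright[-1])

# Synthetic scanline: low reflectance everywhere except a bright-pupil band.
line = np.full(100, 0.1)
line[40:61] = 0.9
print(pupil_center_1d(line, 0.5))  # -> 50.0
```

A simple 1D threshold-and-midpoint scan like this touches each sample once, which hints at how a scanned-laser signal can be processed with far less work than segmenting full video frames.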
A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry