ABSTRACT
In this paper, we show how recordings of gaze movements (via eye tracking) and brain activity (via electroencephalography) can be combined to provide implicit selection in a graphical user interface. This implicit selection requires no manual intervention by the user. In our approach, we formulate implicit selection as a classification problem, describe the employed features and classification setup, and introduce the experimental setup used to collect evaluation data. With a fully online-capable setup, we achieve an F_0.2-score of up to 0.74 for temporal localization and a spatial localization accuracy of more than 0.95.
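The F_0.2-score weights precision much more heavily than recall, which fits a selection interface where spurious activations are plausibly more disruptive than missed ones. The following is a minimal sketch of how such a gaze-plus-EEG classification setup could be evaluated with this metric; the early feature fusion, the SVM classifier, and all function and variable names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (not the paper's implementation): score a binary
# "selection event" classifier on concatenated gaze and EEG features
# per time window, using the F_0.2 metric (precision-heavy).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import fbeta_score

def evaluate_selection_classifier(gaze_feats, eeg_feats, labels,
                                  train_idx, test_idx):
    """gaze_feats, eeg_feats: (n_windows, n_features) arrays per time window;
    labels: 1 = selection event in the window, 0 = none (all hypothetical)."""
    X = np.hstack([gaze_feats, eeg_feats])      # early feature fusion
    clf = make_pipeline(StandardScaler(),
                        SVC(kernel="rbf", class_weight="balanced"))
    clf.fit(X[train_idx], labels[train_idx])
    pred = clf.predict(X[test_idx])
    # beta=0.2 makes recall count far less than precision in the score
    return fbeta_score(labels[test_idx], pred, beta=0.2)
```

Under these assumptions, the returned score corresponds to the temporal-localization figure reported in the abstract: a value of 0.74 would indicate high-precision detection of selection windows.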