DOI: 10.1145/2993148.2993199
Research article

Intervention-free selection using EEG and eye tracking

Published: 31 October 2016

ABSTRACT

In this paper, we show how recordings of gaze movements (via eye tracking) and brain activity (via electroencephalography) can be combined to provide an interface for implicit selection in a graphical user interface. This implicit selection works completely without manual intervention by the user. In our approach, we formulate implicit selection as a classification problem, describe the employed features and classification setup, and introduce our experimental setup for collecting evaluation data. With a fully online-capable setup, we achieve an F_0.2-score of up to 0.74 for temporal localization and a spatial localization accuracy of more than 0.95.
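
The F_0.2-score reported above is the standard F_beta measure with beta = 0.2, i.e. F_beta = (1 + beta^2) * P * R / (beta^2 * P + R), which weights precision much more strongly than recall. The following Python sketch is a minimal illustration of how such a score can be computed for a hypothetical per-fixation "selection intent" classifier; it is not the authors' implementation, and the example labels and predictions are made up.

    # Minimal sketch (not the authors' code): scoring a binary
    # "selection intent" decision per fixation with a precision-weighted
    # F_beta measure. With beta = 0.2, precision dominates the score,
    # which penalizes spurious selections more than missed ones.

    def f_beta(precision: float, recall: float, beta: float = 0.2) -> float:
        """F_beta = (1 + beta^2) * P * R / (beta^2 * P + R)."""
        if precision == 0.0 and recall == 0.0:
            return 0.0
        b2 = beta * beta
        return (1.0 + b2) * precision * recall / (b2 * precision + recall)

    # Hypothetical ground truth and predictions (1 = "user intends to
    # select the fixated object", 0 = "no selection intent").
    y_true = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
    y_pred = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]

    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0

    print(f"precision={precision:.2f} recall={recall:.2f} "
          f"F_0.2={f_beta(precision, recall):.2f}")

Such a precision-heavy weighting is a natural choice for intervention-free selection, where a spurious selection is usually more disruptive than a missed one.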


Published in

ICMI '16: Proceedings of the 18th ACM International Conference on Multimodal Interaction
October 2016, 605 pages
ISBN: 9781450345569
DOI: 10.1145/2993148

Copyright © 2016 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 453 of 1,080 submissions, 42%
