DOI: 10.1145/3172944.3172988
IUI '18 short paper

Combining Brain-Computer Interface and Eye Tracking for High-Speed Text Entry in Virtual Reality

Published: 05 March 2018

Abstract

Gaze interaction provides an efficient way for users to communicate and exercise control in virtual reality (VR) presented through head-mounted displays. In gaze-based text-entry systems, eye tracking and brain-computer interfaces (BCIs) are the two most commonly used approaches. This paper presents a hybrid BCI system for text entry in VR that combines steady-state visual evoked potentials (SSVEPs) and eye tracking. The VR user interface presents a 40-target virtual keyboard whose stimuli are encoded with a joint frequency-phase modulation (JFPM) method for SSVEP. Eye position was measured by an eye-tracking accessory in the VR headset, and target-related gaze direction was detected by combining the simultaneously recorded SSVEP and eye-position data. Offline and online experiments indicate that the proposed system can type at a speed of around 10 words per minute, corresponding to an information transfer rate (ITR) of 270 bits per minute. The results further demonstrate the superiority of the hybrid method over single-modality methods for VR applications.
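The reported throughput can be sanity-checked with the standard Wolpaw ITR formula for an N-target speller, and the flicker coding can be sketched as a sampled sinusoid in the style of joint frequency-phase modulation. The function names, the 60 Hz refresh rate, and the 1.2 s selection time below are illustrative assumptions, not values taken from the paper:

```python
import math

def jfpm_luminance(f_hz: float, phase_rad: float, frame: int,
                   refresh_hz: float = 60.0) -> float:
    """Luminance (0..1) of one flickering target at a given video frame,
    using the sampled-sinusoidal form common in JFPM-coded SSVEP spellers:
    each target gets a distinct (frequency, phase) pair."""
    t = frame / refresh_hz
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * f_hz * t + phase_rad))

def itr_bits_per_min(n_targets: int, accuracy: float,
                     selection_time_s: float) -> float:
    """Wolpaw information transfer rate for an N-target selection task."""
    n, p = n_targets, accuracy
    if not 0.0 < p <= 1.0:
        raise ValueError("accuracy must be in (0, 1]")
    bits = math.log2(n)
    if p < 1.0:  # the entropy terms vanish at p == 1
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / selection_time_s)

# 40 targets at perfect accuracy, ~1.2 s per selection
# (roughly 10 words per minute at 5 characters per word)
print(round(itr_bits_per_min(40, 1.0, 1.2), 1))  # → 266.1
```

At 40 targets and about 1.2 s per selection, log2(40) ≈ 5.32 bits per selection times 50 selections per minute gives roughly 266 bits per minute, consistent with the abstract's figures of ~10 words per minute and an ITR of 270 bits per minute.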




Published In

IUI '18: Proceedings of the 23rd International Conference on Intelligent User Interfaces
March 2018
698 pages
ISBN:9781450349451
DOI:10.1145/3172944
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. brain-computer interface
  2. eye tracking
  3. steady-state visual evoked potentials
  4. text entry
  5. virtual reality


Funding Sources

  • National Key Research and Development Plan
  • Beijing S&T planning task
  • National Natural Science Foundation of China

Conference

IUI'18

Acceptance Rates

IUI '18 paper acceptance rate: 43 of 299 submissions (14%)
Overall acceptance rate: 746 of 2,811 submissions (27%)

Cited By

  • (2025) "HCI Systems: Real-Time Detection and Interaction Based on EOG and IOG." IEEE Transactions on Instrumentation and Measurement 74: 1-14. DOI: 10.1109/TIM.2025.3542109
  • (2024) "A comparative study of stereo-dependent SSVEP targets and their impact on VR-BCI performance." Frontiers in Neuroscience 18. DOI: 10.3389/fnins.2024.1367932
  • (2024) "Supporting Text Entry in Virtual Reality with Large Language Models." 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR): 524-534. DOI: 10.1109/VR58804.2024.00073
  • (2024) "Design and Evaluation of Controller-Based Raycasting Methods for Efficient Alphanumeric and Special Character Entry in Virtual Reality." IEEE Transactions on Visualization and Computer Graphics 30, 9: 6493-6506. DOI: 10.1109/TVCG.2024.3349428
  • (2024) "Voice user interfaces for effortless navigation in medical virtual reality environments." Computers & Graphics: 104069. DOI: 10.1016/j.cag.2024.104069
  • (2024) "Design and evaluation of alphabetic and numeric input methods for virtual reality." Computers & Graphics 122. DOI: 10.1016/j.cag.2024.103955
  • (2024) "The role of eye movement signals in non-invasive brain-computer interface typing system." Medical & Biological Engineering & Computing 62, 7: 1981-1990. DOI: 10.1007/s11517-024-03070-7
  • (2024) "Hands-free multi-type character text entry in virtual reality." Virtual Reality 28, 1. DOI: 10.1007/s10055-023-00902-z
  • (2023) "Towards learner performance evaluation in iVR learning environments using eye-tracking and Machine-learning." Comunicar 31, 76. DOI: 10.3916/C76-2023-01
  • (2023) "VEPdgets: Towards Richer Interaction Elements Based on Visually Evoked Potentials." Sensors 23, 22: 9127. DOI: 10.3390/s23229127
