research-article
DOI: 10.1145/3674746.3674760

Combination of Augmented Reality Based Brain-Computer Interface and Eye-Tracking for Control of a Multi-Robot System

Published: 21 August 2024

Abstract

Although autonomous multi-robot systems (MRS) are advancing rapidly, current unmanned clusters struggle to complete many real-world robotic tasks through autonomous intelligence alone. Human involvement is therefore essential for MRS to accomplish their missions in practical applications, and hands-on control methods have commonly been employed for operators to issue MRS control commands. In more complex environments, however, where tasks occupy both of the operator's hands, the hands-on approach cannot provide dependable human-MRS interaction. This study therefore presents an initial exploration of a hands-free control interface that combines augmented reality (AR) with a brain-computer interface (BCI), termed an AR-based FRP-BCI, as an extension of hands-on control. Specifically, the operator fixates on a target in the command menu displayed by the Microsoft HoloLens 2, and command selection is achieved by recognizing the fixation-related potentials (FRPs) in the operator's EEG signals. In an online AR-based FRP-BCI command selection experiment with the MRS, the recognition success rate was 90.67 ± 6.80%. The proposed system enables users to select intended targets more naturally in real-world environments. The results highlight the potential of the hands-free AR-based FRP-BCI to complement traditional manual MRS input devices and to establish a more natural interface.
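The paper provides no code, but the selection mechanism the abstract describes, namely epoching EEG around eye-tracker fixation onsets and classifying the resulting fixation-related potentials as target versus non-target, can be sketched briefly. The snippet below is a minimal illustration, not the authors' implementation: the sampling rate, channel count, epoch window, baseline interval, the extract_frp_epochs helper, and the shrinkage-LDA classifier are all assumptions, and the data are synthetic placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250        # assumed EEG sampling rate (Hz)
N_CH = 8        # assumed number of EEG channels
EPOCH_S = 0.8   # assumed epoch length after fixation onset (s)

def extract_frp_epochs(eeg, fixation_onsets):
    """Cut fixation-onset-locked epochs from continuous EEG.

    eeg: (n_channels, n_samples) continuous recording.
    fixation_onsets: sample indices at which the eye tracker
    reports a fixation starting on a menu item.
    Returns an array of shape (n_epochs, n_channels, n_epoch_samples).
    """
    n = int(EPOCH_S * FS)
    epochs = np.stack([eeg[:, t:t + n] for t in fixation_onsets])
    # Baseline-correct each epoch with the mean of its first 100 ms.
    baseline = epochs[:, :, : int(0.1 * FS)].mean(axis=2, keepdims=True)
    return epochs - baseline

# Synthetic stand-in data: the paper uses real HoloLens 2 gaze events
# and EEG streams; these random arrays are placeholders only.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((N_CH, 60 * FS))
onsets = rng.integers(0, 60 * FS - int(EPOCH_S * FS), size=100)
labels = rng.integers(0, 2, size=100)  # 1 = fixation on the intended command

X = extract_frp_epochs(eeg, onsets).reshape(100, -1)  # flatten epochs

# Shrinkage LDA is a common classifier for ERP/FRP-style features.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

In a real deployment of the kind described, the fixation onsets would come from the HoloLens 2 eye tracker, and a positive classification would dispatch the fixated command to the multi-robot system.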


          Published In

          RobCE '24: Proceedings of the 2024 4th International Conference on Robotics and Control Engineering
          June 2024
          186 pages
          ISBN:9798400716782
          DOI:10.1145/3674746

          Publisher

          Association for Computing Machinery

          New York, NY, United States


          Author Tags

          1. Eye tracking
          2. augmented reality
          3. brain-computer interface
          4. electroencephalograph
          5. fixation related potential
          6. multi-robot system

          Qualifiers

          • Research-article
          • Research
          • Refereed limited

          Funding Sources

• National Natural Science Foundation of China

          Conference

          RobCE 2024
