DOI: 10.1145/3359996.3364707 · VRST '19 Conference Proceedings
A Virtual Reality Simulator for Training Gaze Control of Wheeled Tele-Robots

Published: 12 November 2019

ABSTRACT

People who cannot use their hands may use eye-gaze to interact with robots. Emerging virtual reality head-mounted displays (HMDs) have built-in eye-tracking sensors. Previous studies suggest that users need substantial practice before they can steer a wheeled robot by gaze with an HMD. In this paper, we propose a VR-based simulator for training gaze-controlled robot steering. We present the simulator and preliminary test results.
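The abstract does not specify how gaze is translated into steering commands. A minimal illustrative sketch, assuming a normalized gaze point in the HMD's view (x and y each in [-1, 1]) and a differential-drive wheeled base, might map vertical gaze offset to forward speed and horizontal offset to turning rate, with a central dead zone so the robot holds still when the user looks straight ahead. The function name, parameters, and gains below are hypothetical, not from the paper:

```python
def gaze_to_velocity(gaze_x, gaze_y, max_linear=0.5, max_angular=1.0,
                     dead_zone=0.1):
    """Map a normalized gaze point to (linear, angular) velocity commands.

    gaze_x, gaze_y: gaze offset from the view center, each in [-1, 1].
    Returns linear velocity in m/s and angular velocity in rad/s.
    A dead zone around the center keeps the robot stationary while
    the user looks straight ahead (hypothetical mapping).
    """
    # Looking up/down drives forward/backward.
    linear = 0.0 if abs(gaze_y) < dead_zone else gaze_y * max_linear
    # Looking right (positive x) turns clockwise (negative angular rate).
    angular = 0.0 if abs(gaze_x) < dead_zone else -gaze_x * max_angular
    return linear, angular
```

For example, a gaze at the top-center of the view commands full forward speed with no turn, while a gaze at the center commands a full stop.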
