ABSTRACT
People who cannot use their hands may rely on eye gaze to interact with robots. Emerging virtual reality (VR) head-mounted displays (HMDs) have built-in eye-tracking sensors. Previous studies suggest that users need substantial practice before they can steer wheeled robots by gaze with an HMD. In this paper, we propose a VR-based simulator for training gaze-controlled robot steering. We present the simulator and preliminary test results.
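To make the idea of gaze steering concrete, the following is a minimal sketch of one common control mapping: a gaze point on the HMD viewport is translated into velocity commands for a wheeled robot, with a central dead zone so the robot stays still while the user looks around. The function name, dead-zone size, and speed limits are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch (not the paper's method): mapping a normalized gaze
# point on an HMD viewport to velocity commands for a wheeled robot.
# Parameter values below are assumptions chosen for demonstration.

def gaze_to_velocity(gaze_x, gaze_y, dead_zone=0.15,
                     max_linear=0.5, max_angular=1.0):
    """Convert a gaze point in [-1, 1] x [-1, 1] (viewport centre at 0, 0)
    into a (linear, angular) velocity pair. Gazing near the centre keeps
    the robot still; looking up drives forward, looking sideways turns."""
    linear = 0.0
    angular = 0.0
    if abs(gaze_y) > dead_zone:
        # Ramp speed from 0 at the dead-zone edge to the maximum at +/-1.
        linear = max_linear * (abs(gaze_y) - dead_zone) / (1 - dead_zone)
        linear *= 1 if gaze_y > 0 else -1
    if abs(gaze_x) > dead_zone:
        angular = max_angular * (abs(gaze_x) - dead_zone) / (1 - dead_zone)
        angular *= 1 if gaze_x > 0 else -1
    return linear, angular
```

A training simulator can run exactly this mapping against a simulated robot, which is why practice in VR transfers to the real steering task: the control law is the same, only the environment is virtual.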