Abstract
People with severe speech and motor impairment (SSMI) often find it difficult to manipulate physical objects due to spasticity, yet they are often familiar with eye-pointing-based communication. This article presents a novel eye-gaze-controlled augmented reality human-robot interface that maintains a safe distance between the robot and the operator. We used a bespoke appearance-based eye gaze tracking algorithm and compared two different safe-distance maintenance algorithms. We undertook simulation studies followed by a user trial with end users. Users with severe speech and motor impairment could bring the robotic arm to any designated point within its working envelope in less than 3 minutes.
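The abstract mentions safe-distance maintenance between the robot and the operator. As a rough illustration only (not the paper's actual algorithms), the sketch below shows a minimal speed-and-separation monitoring check in the spirit of ISO/TS 15066: the robot's commanded speed is scaled by the current separation from the operator, with a protective stop inside an inner radius. The distance thresholds are hypothetical example values, not figures from the study.

```python
import math

# Hypothetical thresholds (metres) -- illustrative, not from the paper.
STOP_DISTANCE = 0.20   # protective stop inside this radius
SLOW_DISTANCE = 0.50   # start scaling speed down inside this radius

def distance(p, q):
    """Euclidean distance between two 3D points (metres)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def speed_scale(robot_pos, operator_pos):
    """Return a velocity scaling factor in [0, 1] based on separation.

    0.0 -> protective stop (operator too close)
    1.0 -> full commanded speed (operator far enough away)
    Linear ramp in between.
    """
    d = distance(robot_pos, operator_pos)
    if d <= STOP_DISTANCE:
        return 0.0
    if d >= SLOW_DISTANCE:
        return 1.0
    return (d - STOP_DISTANCE) / (SLOW_DISTANCE - STOP_DISTANCE)
```

In practice the operator's position would come from a depth camera or motion-capture system, and the separation would be computed against the whole robot body rather than a single point; this sketch only conveys the speed-and-separation idea.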
Index Terms
- Comparing Two Safe Distance Maintenance Algorithms for a Gaze-Controlled HRI Involving Users with SSMI