ABSTRACT
Robotics is a trailblazing technology that has found extensive application in assistive aids for individuals with severe speech and motor impairment (SSMI). This article describes the design and development of an eye gaze-controlled user interface for manipulating a robotic arm. In user studies, participants used eye gaze input to select one of two stamp designs and to choose the stamping location on a card via three designated boxes in the user interface. The entire process, from stamp selection to choosing the stamping location, is controlled by eye movements. The user interface also contains a print button that actuates the robotic arm, enabling users to independently create personalized stamped cards. Extensive user trials showed that individuals with severe speech and motor impairment improved with practice, with a 33.2% reduction in average task completion time and a 42.8% reduction in its standard deviation. These results suggest the system's effectiveness and its potential to enhance the autonomy and creativity of individuals with SSMI, contributing to the development of inclusive assistive technologies.
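Gaze-controlled interfaces like the one described typically map fixations to on-screen targets using dwell-time selection: a target (a stamp design or one of the three location boxes) is activated once the gaze rests on it continuously for a threshold duration. The sketch below illustrates that idea only; the region layout, dwell threshold, and all names are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A rectangular on-screen target, e.g. a stamp design or location box."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_select(samples, regions, dwell_time=1.0):
    """Return the name of the first region fixated continuously for
    `dwell_time` seconds, or None if no selection occurs.

    `samples` is an iterable of (timestamp_seconds, x, y) gaze points,
    as a generic eye tracker might stream them.
    """
    current, start = None, None
    for t, x, y in samples:
        hit = next((r for r in regions if r.contains(x, y)), None)
        if hit is not current:
            # Gaze moved to a different target (or off all targets): restart the dwell timer.
            current, start = hit, t
        elif current is not None and t - start >= dwell_time:
            return current.name
    return None

# Hypothetical layout: two stamp designs and one of the location boxes.
regions = [
    Region("stamp_A", 0, 0, 100, 100),
    Region("stamp_B", 200, 0, 300, 100),
    Region("box_left", 0, 200, 100, 300),
]
# A gaze trace that dwells on stamp_A for more than one second.
trace = [(0.0, 50, 50), (0.5, 55, 52), (1.1, 60, 58)]
print(dwell_select(trace, regions))  # stamp_A
```

Once a stamp and a location box are selected this way, a confirmed activation of the print button would trigger the robotic arm's stamping motion.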