ABSTRACT
Current assistive technologies rely on complicated, cumbersome, and expensive equipment that is not user-friendly or portable and often requires extensive fine motor control. Our approach addresses these problems by developing a compact, non-obtrusive, and ergonomic wearable device that measures signals associated with human physiological gestures and generates useful commands to interact with the environment. Our innovation uses machine learning and non-invasive biosensors placed on top of the ears to identify eye movements and facial expressions with over 95% accuracy. Users can control different applications, such as a robot, a powered wheelchair, a cell phone, a smart home, or other Internet of Things (IoT) devices. Combined with a VR headset and hand-gesture recognition devices, users can apply our technology to control a camera-mounted robot (e.g., a telepresence robot, a drone, or any robotic manipulator) and navigate the environment in first-person view simply through eye movements and facial expressions, enabling an intuitive, entirely touch-free mode of interaction. Experimental results show satisfactory performance across applications, suggesting the device can serve as a universal controller to help disabled people interact with the environment and as a health monitoring device that measures other physiological signals.
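The pipeline described above (biosignals from ear-mounted sensors, machine-learning classification, mapped device commands) can be sketched as follows. This is an illustrative example only, not the authors' implementation: the window size, channel count, time-domain features, command names, and the SVM classifier are all assumptions made for the sketch.

```python
# Sketch of a gesture-classification pipeline: windowed biosignals ->
# per-channel features -> classifier -> device command.
# All names and parameters here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features per channel: mean, std, peak-to-peak."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.max(axis=0) - window.min(axis=0),
    ])

# Hypothetical command set for eye movements / facial expressions.
COMMANDS = ["look_left", "look_right", "blink", "clench"]

rng = np.random.default_rng(0)

# Synthetic training data: 2-channel, 100-sample windows, one
# well-separated cluster per gesture class (stand-in for real signals).
X, y = [], []
for label in range(len(COMMANDS)):
    for _ in range(30):
        window = rng.normal(loc=label, scale=0.3, size=(100, 2))
        X.append(extract_features(window))
        y.append(label)

clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))

# At runtime, each incoming window is mapped to a command string
# that could then drive a wheelchair, robot, or IoT device.
test_window = rng.normal(loc=2, scale=0.3, size=(100, 2))
command = COMMANDS[int(clf.predict([extract_features(test_window)])[0])]
```

In a real system the synthetic windows would be replaced by streamed sensor data, and the predicted command would be forwarded to the target device's control interface.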
EXGbuds: Universal Wearable Assistive Device for Disabled People to Interact with the Environment Seamlessly