Abstract
Most electric wheelchairs on the market are joystick-driven and therefore assume that the user can steer the wheelchair with hand motion. This assumption fails for many users who can move only their head, such as quadriplegic patients. This paper presents a vision-based head motion tracking system that enables such patients to control the wheelchair. Our novel approach is to use active rather than passive vision to achieve head motion tracking: the camera is mounted on the user's head instead of being placed in front of it. This makes tracking easier and more accurate, and it enhances resolution, as we demonstrate both theoretically and experimentally. The proposed tracking scheme is then used successfully to control our electric wheelchair, navigating in a real-world environment.
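To make the control idea concrete, here is a minimal, hypothetical sketch of how an estimated head pose (yaw and pitch from the head-mounted camera) could be mapped to wheelchair velocity commands. The dead zone, gains, and 45-degree saturation angle are illustrative assumptions, not values from the paper.

```python
def head_pose_to_command(yaw_deg, pitch_deg, dead_zone=10.0,
                         max_speed=1.0, max_turn=1.0):
    """Map head pose (degrees) to normalized (linear, angular) velocity.

    Pitch (nodding down/up) drives forward/backward speed; yaw
    (turning left/right) drives the turning rate. A dead zone around
    the neutral pose keeps small involuntary head movements from
    moving the chair.
    """
    def scale(angle, limit):
        if abs(angle) < dead_zone:
            return 0.0
        # Linear ramp from the dead-zone edge up to a 45-degree tilt,
        # clamped to the given limit, preserving the sign of the angle.
        s = (abs(angle) - dead_zone) / (45.0 - dead_zone)
        return min(limit, s * limit) * (1.0 if angle > 0 else -1.0)

    return scale(pitch_deg, max_speed), scale(yaw_deg, max_turn)

# A neutral pose yields no motion; a head turn plus a nod yields
# a combined forward-and-turn command.
print(head_pose_to_command(0.0, 0.0))
print(head_pose_to_command(30.0, 20.0))
```

Any real controller would also need smoothing over time and a deliberate gesture (e.g. a sustained pose) to engage or disengage driving, so that ordinary head movements are not interpreted as commands.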
Halawani, A., ur Réhman, S., Li, H. et al. Active vision for controlling an electric wheelchair. Intel Serv Robotics 5, 89–98 (2012). https://doi.org/10.1007/s11370-011-0098-3