ABSTRACT
Recent developments in the manufacturing and marketing of low-power computers, small enough to be "worn" by users and remain almost invisible, have reintroduced the problem of overcoming the outdated paradigm of human-computer interaction based on the use of a keyboard and a mouse. Approaches based on visual tracking seem the most promising, as they require no additional devices (gloves, etc.) and can be implemented with off-the-shelf hardware such as webcams. Unfortunately, highly variable lighting conditions and the computational complexity of most available algorithms make these techniques hard to use in systems where CPU power consumption is a major issue (e.g. wearable computers) and in situations where lighting conditions are critical (outdoors, in the dark, etc.). This paper describes the work carried out at VisiLAB at the University of Messina as part of the VisualGlove Project to develop a real-time, vision-based device able to operate as a substitute for the mouse and other similar input devices. It operates under a wide range of lighting conditions, using a low-cost webcam and running on an entry-level PC. As explained in detail below, particular care has been taken to reduce computational complexity, in an attempt to minimize the resources required for the whole system to work.
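One of the low-complexity matching techniques prominent in the literature this work draws on is comparing edge-point sets with the Hausdorff distance. The sketch below is an illustrative NumPy implementation of that metric, not the authors' actual pipeline; the function names and toy point sets are assumptions made for the example.

```python
import numpy as np

def directed_hausdorff(a, b):
    """Directed Hausdorff distance h(A, B) = max over p in A of min over q in B of ||p - q||."""
    # Pairwise Euclidean distances between every point in a and every point in b.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).max()

def hausdorff(a, b):
    """Symmetric Hausdorff distance H(A, B) = max(h(A, B), h(B, A))."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

# Toy "edge maps": a template point set and the same set translated by (3, 4).
template = np.array([[0, 0], [1, 0], [2, 1], [3, 3]], dtype=float)
shifted = template + np.array([3.0, 4.0])
print(hausdorff(template, shifted))  # -> 5.0
```

In a tracking loop, the template would be a precomputed hand/finger edge model and the candidate sets would come from an edge detector run on each frame; the pose minimizing the distance gives the match. The cost is dominated by the pairwise-distance matrix, which is why practical systems prune candidate locations aggressively.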