Abstract
This paper presents a system that recognizes manipulative hand gestures, such as grasping, moving, and holding objects with both hands, and extending or shortening objects, in a virtual world using contextual information. Contextual information is represented by a state transition diagram, each state of which indicates the gestures possible at the next moment. Image features obtained from extracted hand regions are used to judge state transitions. When using a gesture recognition system, users sometimes move their hands unintentionally. To address this problem, our system includes a rest state in the state transition diagram: all unintentional actions are regarded as taking a rest and are ignored. In addition, the system can recognize collaborative gestures performed with both hands. These are expressed as a single state, so the combinatorial complexity of recognizing each hand's gestures separately is avoided. We have implemented an experimental human interface system, and operational experiments show promising results.
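The core idea of the abstract, a state transition diagram in which each state lists the gestures admissible next and a rest state absorbs unintentional motion, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the gesture names and transition table are assumptions chosen to match the gestures mentioned in the abstract.

```python
# Sketch of a context-based gesture recognizer: each state maps to the
# set of gestures that may follow it. Any observed gesture that is not
# admissible in the current state is treated as unintentional motion
# (a "rest") and ignored. State names here are illustrative.
REST = "rest"

TRANSITIONS = {
    REST:      {"grasp"},
    "grasp":   {"move", "hold", "release"},
    "move":    {"hold", "release"},
    "hold":    {"extend", "shorten", "release"},
    "extend":  {"hold", "release"},
    "shorten": {"hold", "release"},
    "release": {REST},
}

def step(state, gesture):
    """Advance the recognizer by one observation.

    If the observed gesture is admissible from the current state,
    transition to it; otherwise keep the current state, i.e. the
    action is regarded as unintentional and ignored.
    """
    if gesture in TRANSITIONS[state]:
        return gesture
    return state  # unintentional action: ignored
```

For example, a "move" observed while in the rest state is ignored, whereas the same observation after a "grasp" triggers a transition; this is how context disambiguates identical image features.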
He is currently with System Engineering Research Institute, Taejon, Korea.
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
Cite this paper
Jo, KH., Kuno, Y., Shirai, Y. (1997). Context-based recognition of manipulative hand gestures for human computer interaction. In: Chin, R., Pong, TC. (eds) Computer Vision — ACCV'98. ACCV 1998. Lecture Notes in Computer Science, vol 1352. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-63931-4_238
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-63931-2
Online ISBN: 978-3-540-69670-4