
Model-based video tracking for gestural interaction


Abstract

Among the many techniques for interacting with 3D environments, gesture-based input appears promising. However, because computing hardware has long lacked the necessary power, such interfaces have had to be built either on standard tracking devices or on limited image-based video tracking algorithms. As computing power keeps increasing, more complex video analysis, such as real-time model-based tracking, is now within reach. Using a model-based approach for unencumbered input gives us the advantage of extracting a low-level hand description that is useful for building natural interfaces. The algorithm we developed relies on a 3D polygonal hand model: its pose parametrization is iteratively refined so that its 2D projection matches the input 2D image more closely. Relying on the graphics hardware for fast 2D projection is critical, and adding more cameras helps cope with the occlusion problem.
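
To make the refinement loop concrete, the sketch below shows one plausible way to fit the pose parameters by comparing rendered and observed silhouettes across several cameras. It is a minimal illustration under stated assumptions, not the paper's implementation: render_silhouette, cameras and observed are hypothetical placeholders for the GPU projection of the hand model and the segmented input images.

# A minimal, hypothetical sketch of one model-based refinement step (not the
# authors' implementation). A pose vector (global hand position/orientation plus
# joint angles) is perturbed parameter by parameter so that the silhouette of the
# rendered 3D polygonal hand model overlaps the observed hand silhouette in every
# camera view. render_silhouette, cameras and observed are assumed placeholders.

import numpy as np

def silhouette_error(pose, cameras, observed, render_silhouette):
    # Count pixels where the rendered model silhouette and the segmented input
    # silhouette disagree, summed over all camera views.
    error = 0
    for cam, obs in zip(cameras, observed):
        rendered = render_silhouette(pose, cam)   # e.g. an offscreen GPU render
        error += int(np.count_nonzero(rendered != obs))
    return error

def refine_pose(pose, cameras, observed, render_silhouette, step=0.01, iterations=20):
    # Greedy coordinate descent: keep a perturbation only if it reduces the
    # silhouette mismatch; otherwise try the opposite direction.
    pose = np.asarray(pose, dtype=float).copy()
    best = silhouette_error(pose, cameras, observed, render_silhouette)
    for _ in range(iterations):
        for i in range(pose.size):
            for delta in (step, -step):
                candidate = pose.copy()
                candidate[i] += delta
                err = silhouette_error(candidate, cameras, observed, render_silhouette)
                if err < best:
                    pose, best = candidate, err
                    break
    return pose

Using more than one camera simply adds terms to the silhouette error, which is one way a multi-camera setup can reduce the ambiguity caused by self-occlusion.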




Author information


Correspondence to J.-B. de la Rivière or P. Guitton.


Cite this article

de la Rivière, J.-B., Guitton, P. Model-based video tracking for gestural interaction. Virtual Reality 8, 213–221 (2005). https://doi.org/10.1007/s10055-005-0154-4
