Abstract
In this paper, we present a stereo-based face tracking system that tracks the 3D position and orientation of a user's face in real time, and its application to interaction with a large display. Our tracking system dynamically updates the template images used to track facial features, so it can follow a user's face over a wide range of head rotations. Another advantage of our system is that it does not require the user to manually initialize the tracking process, which is critical for natural and intuitive interaction. Building on this face tracking system, we have implemented several prototype applications that adaptively change the information shown on a large display according to where the user is looking.
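The dynamic template update mentioned above can be illustrated with a minimal sketch. This is hypothetical code, not the authors' implementation: it tracks a feature patch by sum-of-squared-differences matching in a small search window, and whenever the match is good it adopts the current best-matching patch as the new template, so gradual appearance change (such as head rotation) does not break tracking. The function names, window radius, and update threshold are all illustrative assumptions.

```python
def patch(frame, top, left, h, w):
    # Extract an h-by-w sub-image with its top-left corner at (top, left).
    return [row[left:left + w] for row in frame[top:top + h]]

def ssd(a, b):
    # Sum of squared differences between two equally sized patches.
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb))

def track(frames, template, pos, radius=2, update_thresh=5.0):
    # Track the template through `frames`, starting from position `pos`
    # (top-left corner). `radius` bounds the per-frame search window;
    # `update_thresh` is an illustrative score below which the template
    # is refreshed from the current frame (the "dynamic update" step).
    h, w = len(template), len(template[0])
    positions = []
    for frame in frames:
        best, best_pos = None, pos
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                top, left = pos[0] + dy, pos[1] + dx
                if 0 <= top <= len(frame) - h and 0 <= left <= len(frame[0]) - w:
                    score = ssd(patch(frame, top, left, h, w), template)
                    if best is None or score < best:
                        best, best_pos = score, (top, left)
        pos = best_pos
        positions.append(pos)
        if best is not None and best < update_thresh:
            # Dynamic update: adopt the current appearance as the new template.
            template = patch(frame, pos[0], pos[1], h, w)
    return positions

# Toy demo: a 2x2 bright block moves diagonally and dims from 9 to 8 to 7.
# Because the template is refreshed each frame, the dimming feature is
# still matched tightly in later frames.
def make_frame(top, left, value):
    f = [[0] * 6 for _ in range(6)]
    for r in range(top, top + 2):
        for c in range(left, left + 2):
            f[r][c] = value
    return f

frames = [make_frame(2, 2, 8), make_frame(3, 3, 7)]
print(track(frames, [[9, 9], [9, 9]], (1, 1)))  # → [(2, 2), (3, 3)]
```

In a real system the patches would be image regions from the stereo cameras and the matching would typically use normalized correlation, but the update rule — refresh the template only when the match score indicates a confident lock, to avoid drifting onto the background — is the same idea.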
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Nakanishi, Y., Fujii, T., Kitajima, K., Sato, Y., Koike, H. (2002). Vision-Based Face Tracking System for Large Displays. In: Borriello, G., Holmquist, L.E. (eds) UbiComp 2002: Ubiquitous Computing. UbiComp 2002. Lecture Notes in Computer Science, vol 2498. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45809-3_11
DOI: https://doi.org/10.1007/3-540-45809-3_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44267-7
Online ISBN: 978-3-540-45809-8