Abstract
Vision and audio can be used in a complementary way to efficiently create augmented reality tools. This paper describes how sound detection can enhance the efficiency of a computer-vision augmented reality tool such as the MagicBoard. A MagicBoard is an ordinary whiteboard on which a user can handle both real and virtual information, allowing, for example, a copy-and-paste operation on a physical drawing on the board. The electronic part of the MagicBoard consists of a video projector and several concurrently used sensors, such as a camera and microphones. In our system, sound supports a purely vision-based finger tracker, especially during the most critical phase: detecting and localizing a finger tap as a click on the board. The relative phase of the signal caused by a finger tap on the board, as picked up by several microphones, is used to estimate the position of the tap. When several candidate positions remain, which is likely, they are verified by the visual tracker. The resulting system is surprisingly precise and robust.
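The abstract describes the sound-based localization only at a high level. The sketch below illustrates one plausible way to turn relative arrival times of a tap at several microphones into ranked candidate positions on the board, which a visual tracker could then disambiguate. This is a minimal sketch, not the authors' implementation: the function names, the assumed propagation speed and sampling rate, and the grid-search scoring are all assumptions introduced here for illustration.

```python
import numpy as np

# Hypothetical constants; real values would depend on the board material and hardware.
SPEED_IN_BOARD = 1000.0  # assumed propagation speed of the tap wave in the board (m/s)
SAMPLE_RATE = 44100      # assumed audio sampling rate (Hz)

def estimate_delay(sig_a, sig_b):
    """Estimate the relative delay (seconds) between two microphone signals
    from the peak of their cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # lag in samples
    return lag / SAMPLE_RATE

def candidate_positions(mic_positions, signals, grid):
    """Score each candidate board position in `grid` by how well the predicted
    pairwise arrival-time differences match the measured ones, and return the
    positions sorted from best to worst match. Ambiguous cases (several
    positions with similar scores) would then be resolved by the visual tracker."""
    pairs = [(i, j) for i in range(len(mic_positions))
                    for j in range(i + 1, len(mic_positions))]
    measured = {(i, j): estimate_delay(signals[i], signals[j]) for i, j in pairs}
    scores = []
    for pos in grid:
        err = 0.0
        for i, j in pairs:
            predicted = (np.linalg.norm(pos - mic_positions[i])
                         - np.linalg.norm(pos - mic_positions[j])) / SPEED_IN_BOARD
            err += (predicted - measured[(i, j)]) ** 2
        scores.append((err, tuple(pos)))
    return [p for _, p in sorted(scores)]
```

With, say, three or four microphones mounted at known positions on the board frame and a coarse grid of board coordinates, the best-scoring positions form the candidate set that the vision-based finger tracker verifies.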
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Le Gal, C., Ozcan, A.E., Schwerdt, K., Crowley, J.L. (2000). A Sound MagicBoard. In: Tan, T., Shi, Y., Gao, W. (eds) Advances in Multimodal Interfaces — ICMI 2000. ICMI 2000. Lecture Notes in Computer Science, vol 1948. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-40063-X_9
DOI: https://doi.org/10.1007/3-540-40063-X_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-41180-2
Online ISBN: 978-3-540-40063-9