Abstract
This paper presents a technical approach to temporal symbol integration, intended to be generally applicable in unimodal and multimodal user interfaces. It draws its strength from symbolic data representation and an underlying rule-based system, and is embedded in a multiagent system. The core method for temporal integration is motivated by findings from cognitive science research. We discuss its application to a gesture recognition task and to speech-gesture integration in a Virtual Construction scenario. Finally, an outlook on an empirical evaluation is given.
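To make the idea of temporal symbol integration concrete, the following is a minimal illustrative sketch, not the paper's actual system: timestamped symbols from different modalities are paired when their time intervals overlap or lie within a tolerance window, echoing the temporal-proximity criterion motivated by cognitive science. All names here (`Symbol`, `integrate`, `max_gap`) are assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class Symbol:
    modality: str  # e.g. "speech" or "gesture"
    label: str     # symbolic content, e.g. "put" or "pointing"
    start: float   # onset time in seconds
    end: float     # offset time in seconds

def integrate(symbols, max_gap=2.0):
    """Pair speech and gesture symbols whose time intervals
    overlap or lie within max_gap seconds of each other.
    (Illustrative only; the paper's rule-based system is richer.)"""
    speech = [s for s in symbols if s.modality == "speech"]
    gesture = [s for s in symbols if s.modality == "gesture"]
    pairs = []
    for sp in speech:
        for ge in gesture:
            # Temporal gap between the two intervals;
            # negative when they overlap.
            gap = max(sp.start, ge.start) - min(sp.end, ge.end)
            if gap <= max_gap:
                pairs.append((sp.label, ge.label))
    return pairs
```

For example, a spoken "put" at 0.0-0.5 s and a pointing gesture at 0.3-1.0 s would be integrated, while a gesture several seconds away would not.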
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Sowa, T., Fröhlich, M., Latoschik, M.E. (1999). Temporal Symbolic Integration Applied to a Multimodal System Using Gestures and Speech. In: Braffort, A., Gherbi, R., Gibet, S., Teil, D., Richardson, J. (eds) Gesture-Based Communication in Human-Computer Interaction. GW 1999. Lecture Notes in Computer Science, vol 1739. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46616-9_26
Print ISBN: 978-3-540-66935-7
Online ISBN: 978-3-540-46616-1