Abstract
This work contributes an integrated, flexible approach to sign language processing in virtual environments that allows for interactive experimental evaluations with high ecological validity. The initial steps deal with real-time tracking and processing of manual gestures. The motion data is rendered stereoscopically in immersive virtual environments with varying spatial and representational configurations. Besides flexibility, the most important aspect is the seamless integration into a VR-based software system for neuropsychological experiments. Ongoing studies conducted with this system contribute to the understanding of sign language cognition. The system benefits experimenters by providing a controlled, immersive three-dimensional environment that enables experiments on visual depth perception which cannot be realized with video presentations.
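The paper itself publishes no source code; the following minimal C++ sketch is only an illustration of the pipeline the abstract describes (real-time capture of manual gesture data, remapping to a condition-specific spatial and representational configuration, stereoscopic rendering inside the experiment software). All names here (GestureFrame, TrackingSource, DummySource, ExperimentConfig, applyConfig, renderStereo) are hypothetical placeholders, not the authors' API.

```cpp
// Illustrative sketch only: hypothetical types standing in for the
// tracking-to-rendering pipeline described in the abstract.
#include <array>
#include <iostream>
#include <vector>

// One tracked hand posture, reduced to joint positions in metres.
struct GestureFrame {
    std::vector<std::array<float, 3>> jointPositions;
    double timestampSeconds = 0.0;
};

// Abstract source of real-time motion data (e.g. a data glove or optical tracker).
class TrackingSource {
public:
    virtual ~TrackingSource() = default;
    virtual GestureFrame poll() = 0;
};

// Dummy source standing in for real tracking hardware.
class DummySource : public TrackingSource {
    double t_ = 0.0;
public:
    GestureFrame poll() override {
        GestureFrame f;
        f.jointPositions.assign(20, {0.0f, 1.4f, 0.5f}); // 20 joints at chest height
        f.timestampSeconds = (t_ += 1.0 / 60.0);
        return f;
    }
};

// Spatial/representational parameters of one experimental condition.
struct ExperimentConfig {
    float viewerDistanceMetres = 1.2f; // distance of the signer model from the viewer
    bool mirrored = false;             // present gestures mirrored or not
};

// Remap a frame according to the active condition before rendering.
GestureFrame applyConfig(GestureFrame frame, const ExperimentConfig& cfg) {
    for (auto& p : frame.jointPositions) {
        if (cfg.mirrored) p[0] = -p[0];   // mirror along the x axis
        p[2] -= cfg.viewerDistanceMetres; // move the model away from the viewer
    }
    return frame;
}

// Placeholder for the framework's stereoscopic rendering call.
void renderStereo(const GestureFrame& frame) {
    std::cout << "render " << frame.jointPositions.size()
              << " joints at t=" << frame.timestampSeconds << " s\n";
}

int main() {
    DummySource tracker;
    ExperimentConfig condition{1.5f, true}; // one example condition
    for (int i = 0; i < 3; ++i) {           // per-frame loop of the experiment
        renderStereo(applyConfig(tracker.poll(), condition));
    }
}
```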
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Ullrich, S. et al. (2009). A Virtual Reality-Based Framework for Experiments on Perception of Manual Gestures. In: Sales Dias, M., Gibet, S., Wanderley, M.M., Bastos, R. (eds.) Gesture-Based Human-Computer Interaction and Simulation. GW 2007. Lecture Notes in Computer Science, vol. 5085. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-92865-2_19
DOI: https://doi.org/10.1007/978-3-540-92865-2_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-92864-5
Online ISBN: 978-3-540-92865-2