Synthetic Animation of Deaf Signing Gestures

  • Conference paper
Gesture and Sign Language in Human-Computer Interaction (GW 2001)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2298)


Abstract

We describe a method for automatically synthesizing deaf signing animations from a high-level description of signs in terms of the HamNoSys transcription system. Lifelike movement is achieved by combining a simple control model of hand movement with inverse kinematic calculations for placement of the arms. The realism can be further enhanced by mixing the synthesized animation with motion capture data for the spine and neck, to add natural “ambient motion”.
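The inverse kinematic arm placement mentioned in the abstract can be illustrated with the classic closed-form solution for a two-link limb. The sketch below is not the paper's implementation (which drives a full avatar skeleton); it is a minimal planar version, with the function name `two_link_ik`, the coordinate frame, and the single-branch elbow solution all chosen here for illustration.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic inverse kinematics for a planar two-link arm.

    Given a wrist target (x, y) and segment lengths l1 (upper arm)
    and l2 (forearm), return (shoulder, elbow) joint angles in
    radians. elbow = 0 means a fully straight arm.
    """
    d = math.sqrt(x * x + y * y)
    # Clamp unreachable or degenerate targets to the arm's reach.
    d = min(max(d, abs(l1 - l2)), l1 + l2)
    # Law of cosines gives the elbow bend angle.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, corrected for the
    # offset that the bent elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

For a target (1, 1) with unit-length segments this yields a level shoulder and a right-angle elbow. A signing avatar must of course solve the redundant 3D arm (seven degrees of freedom), for which the real-time method of Tolani and Badler cited below is representative.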

References

  1. D. Connolly. Extensible Markup Language (XML). World Wide Web Consortium, 2000.

  2. R. Elliott, J. R. W. Glauert, J. R. Kennaway, and I. Marshall. The development of language processing support for the ViSiCAST project. In ASSETS 2000: Proc. 4th International ACM Conference on Assistive Technologies, Arlington, Virginia, November 2000, pages 101–108.

  3. S. Gibet and T. Lebourque. High-level specification and animation of communicative gestures. J. Visual Languages and Computing, 12:657–687, 2001. On-line at http://www.idealibrary.com.

  4. A. Glassner. Introduction to animation. In SIGGRAPH 2000 Course Notes. Assoc. Comp. Mach., 2000.

  5. Humanoid Animation Working Group. Specification for a Standard Humanoid, version 1.1. 1999. http://h-anim.org/Specifications/H-Anim1.1/.

  6. A. J. Hanson. Visualizing quaternions. In SIGGRAPH 2000 Course Notes. Assoc. Comp. Mach., 2000.

  7. J. Hodgins and Z. Popović. Animating humans by combining simulation and motion capture. In SIGGRAPH 2000 Course Notes. Assoc. Comp. Mach., 2000.

  8. The VRML Consortium Incorporated. The Virtual Reality Modeling Language: International Standard ISO/IEC 14772-1:1997. 1997. http://www.web3d.org/Specifications/VRML97/.

  9. R. Koenen. Overview of the MPEG-4 Standard. ISO/IEC JTC1/SC29/WG11 N2725, 1999. http://www.cselt.it/mpeg/standards/mpeg-4/mpeg-4.htm.

  10. T. Lebourque and S. Gibet. A complete system for the specification and the generation of sign language gestures. In Gesture-based Communication in Human-Computer Interaction, Lecture Notes in Artificial Intelligence vol.1739. Springer, 1999.

  11. M. Lincoln, S. J. Cox, and M. Nakisa. The development and evaluation of a speech to sign translation system to assist transactions. Int. Journal of Human-Computer Studies, 2001. In preparation.

  12. I. Marshall, F. Pezeshkpour, J. A. Bangham, M. Wells, and R. Hughes. On the real time elision of text. In RIFRA 98: Proc. Int. Workshop on Extraction, Filtering and Automatic Summarization, Tunisia, November 1998. CNRS.

  13. F. Pezeshkpour, I. Marshall, R. Elliott, and J. A. Bangham. Development of a legible deaf-signing virtual human. In Proc. IEEE Conf. on Multi-Media, Florence, volume 1, pages 333–338, 1999.

  14. Z. Popović and A. Witkin. Physically based motion transformation. In Proc. SIGGRAPH '99, pages 11–20. Assoc. Comp. Mach., 1999.

  15. W. T. Powers. Behavior: The Control of Perception. Aldine de Gruyter, 1973.

  16. S. Prillwitz, R. Leven, H. Zienert, T. Hanke, J. Henning, et al. HamNoSys Version 2.0: Hamburg Notation System for Sign Languages-An Introductory Guide. International Studies on Sign Language and the Communication of the Deaf, Volume 5. University of Hamburg, 1989. Version 3.0 is documented on the Web at http://www.sign-lang.uni-hamburg.de/Projects/HamNoSys.html.

  17. D. Tolani and N. I. Badler. Real-time inverse kinematics of the human arm. Presence, 5(4):393–401, 1996.

  18. M. Wells, F. Pezeshkpour, I. Marshall, M. Tutt, and J. A. Bangham. Simon: an innovative approach to signing on television. In Proc. Int. Broadcasting Convention, 1999.

  19. A. Witkin and Z. Popović. Motion warping. In Proc. SIGGRAPH '95, 1995.


Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kennaway, R. (2002). Synthetic Animation of Deaf Signing Gestures. In: Wachsmuth, I., Sowa, T. (eds) Gesture and Sign Language in Human-Computer Interaction. GW 2001. Lecture Notes in Computer Science, vol 2298. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-47873-6_15

  • DOI: https://doi.org/10.1007/3-540-47873-6_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43678-2

  • Online ISBN: 978-3-540-47873-7

  • eBook Packages: Springer Book Archive
