Abstract
Computer-generated animations of American Sign Language (ASL) can improve the accessibility of information, communication, and services for the many deaf adults in the US who have difficulty reading English text. Unfortunately, there are several linguistic aspects of ASL that current automatic generation or translation systems cannot produce (or that are time-consuming for human animators to create). To determine how important such phenomena are to user satisfaction and comprehension of ASL animations, studies were conducted in which native ASL signers evaluated ASL animations with and without: the establishment of spatial reference points around the virtual human signer to represent entities under discussion, pointing pronoun signs, contrastive role shift, and spatial inflection of ASL verbs. Adding these phenomena to ASL animations led to a significant improvement in user comprehension, thereby motivating future research on automating their generation.
Abbreviations
- ASL: American Sign Language
- HCI: Human-computer interaction
- MT: Machine translation
- BSL: British Sign Language
Acknowledgments
This research was supported in part by the US National Science Foundation under award number 0746556, by The City University of New York PSC-CUNY Research Award Program, by Siemens A&D UGS PLM Software through a Go PLM Academic Grant, and by Visage Technologies AB through a free academic license for character animation software. Jonathan Lamberton prepared experimental materials and organized data collection for the ASL animation studies discussed in Sects. 2 and 3.
Cite this article
Huenerfauth, M., Lu, P. Effect of spatial reference and verb inflection on the usability of sign language animations. Univ Access Inf Soc 11, 169–184 (2012). https://doi.org/10.1007/s10209-011-0247-7