DOI: 10.1145/3334480.3375222 · CHI Conference Proceedings · Extended Abstract

Quantifying Sign Avatar Perception: How Imperfect is Insufficient?

Published: 25 April 2020 Publication History

Abstract

The aim of this study was to identify relevant aspects of sign avatar animation and to quantify their effect on human perception, in order to better address user expectations in the future. To this end, 25 users assessed two types of avatar animation: an upper baseline generated from motion-capture data, and a reference generated from machine-learned synthetic data whose absolute positional differences from the baseline lay in the millimeter range. As expected, user ratings of the synthesized references were considerably lower. We therefore computed a variety of signal-specific differences between the two data types and investigated their correlations with the collected user ratings. The results reveal statistically significant interdependencies between avatar movement and perception that are helpful for the generation of any type of virtual avatar. However, they also suggest that it is difficult to determine concrete avatar features with a high influence on user perception under the current study design.
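The paper does not publish its analysis code, but the abstract's approach of computing signal-specific differences between motion-capture and synthesized joint trajectories and correlating them with user ratings can be sketched as follows. This is a minimal illustration with hypothetical function names and toy data, not the authors' actual pipeline; the specific difference measure (mean absolute positional error) and the use of Pearson correlation are assumptions consistent with, but not confirmed by, the abstract.

```python
import numpy as np

def mean_positional_error(mocap, synthesized):
    """Mean absolute per-joint positional difference between two motion
    sequences of shape (frames, joints, 3), e.g. in millimeters."""
    return float(np.mean(np.abs(mocap - synthesized)))

def rating_correlation(signal_diffs, user_ratings):
    """Pearson correlation between per-clip signal differences and the
    mean user rating score each clip received."""
    return float(np.corrcoef(signal_diffs, user_ratings)[0, 1])

# Toy example: a synthesized sequence that deviates from the motion-capture
# baseline by noise in the mm range (positions here in meters).
rng = np.random.default_rng(0)
mocap = rng.normal(size=(100, 20, 3))          # 100 frames, 20 joints, xyz
synth = mocap + rng.normal(scale=0.002, size=mocap.shape)
err = mean_positional_error(mocap, synth)
```

In a study like this one, `mean_positional_error` would be evaluated once per stimulus clip, and the resulting vector of errors correlated against the per-clip mean of the collected rating scores; a significant negative correlation would indicate that larger positional deviations predict lower perceived quality.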

Supplemental Material

MP4 file: supplemental video


Cited By

  • An exploratory user study towards developing a unified, comprehensive assessment apparatus for deaf signers, specifically tailored for signing avatars evaluation: challenges, findings, and recommendations. Multimedia Tools and Applications (2024). DOI:10.1007/s11042-024-20365-x
  • "I Want To Be Able To Change The Speed And Size Of The Avatar": Assessing User Requirements For Animated Sign Language Translation Interfaces. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (2023), 1--7. DOI:10.1145/3544549.3585675


Published In

CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
April 2020
4474 pages
ISBN:9781450368193
DOI:10.1145/3334480

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. avatar animation quality
  2. human perception
  3. sign avatar
  4. user acceptance

Qualifiers

  • Extended-abstract

Conference

CHI '20

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

