Research article · DOI: 10.1145/2971485.2996753

Towards Using Gaze Properties to Detect Language Proficiency

Published: 23 October 2016

Abstract

Humans are inherently skilled at reading subtle physiological cues from other people, for example gaze direction in a conversation. Personal computers, however, have yet to exploit this implicit input modality. In a study with 14 participants, we investigate how a user's gaze can be leveraged in adaptive computer systems. In particular, we examine the impact of different languages on eye movements by presenting simple questions in multiple languages to our participants. We found that fixation duration suffices to ascertain whether a user is highly proficient in a given language. We propose how these findings could be used to implement adaptive visualizations that react implicitly to the user's gaze.
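The core idea reported in the abstract, that fixation duration alone can indicate high proficiency in a language, can be sketched as a simple classifier. The sketch below is illustrative only: the function names, the `(x, y, duration_ms)` fixation representation, and the 220 ms threshold are assumptions for demonstration, not values or methods reported in the paper.

```python
# Illustrative sketch: flag high language proficiency from fixation durations.
# The 220 ms threshold is a hypothetical placeholder, not a value from the paper.

def mean_fixation_duration(fixations):
    """Mean duration in ms over a list of (x, y, duration_ms) fixations."""
    if not fixations:
        raise ValueError("no fixations recorded")
    return sum(d for _, _, d in fixations) / len(fixations)

def is_highly_proficient(fixations, threshold_ms=220.0):
    """Shorter average fixations are taken to suggest fluent reading
    (assumed decision rule with an assumed threshold)."""
    return mean_fixation_duration(fixations) < threshold_ms

# Toy data: fluent readers tend to produce shorter fixations.
fluent = [(10, 5, 180), (40, 5, 200), (80, 6, 190)]     # mean 190 ms
struggling = [(10, 5, 320), (25, 5, 410), (60, 6, 380)]  # mean ~370 ms
print(is_highly_proficient(fluent))      # → True
print(is_highly_proficient(struggling))  # → False
```

An adaptive visualization could poll such a classifier and, for instance, switch to a simpler phrasing or offer translations when the rule reports low proficiency.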


Cited By

  • (2019) Using Eye Tracked Virtual Reality to Classify Understanding of Vocabulary in Recall Tasks. 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR). DOI: 10.1109/AIVR46125.2019.00019. Online publication date: Dec 2019.


Published In

NordiCHI '16: Proceedings of the 9th Nordic Conference on Human-Computer Interaction
October 2016
1045 pages
ISBN:9781450347631
DOI:10.1145/2971485

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Eye tracking
  2. adaptive visualization
  3. pattern recognition

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

NordiCHI '16

Acceptance Rates

NordiCHI '16 paper acceptance rate: 58 of 231 submissions (25%).
Overall acceptance rate: 379 of 1,572 submissions (24%).

Article Metrics

  • Downloads (last 12 months): 2
  • Downloads (last 6 weeks): 1

Reflects downloads up to 03 Mar 2025.

