DOI: 10.1145/3448018.3458018 (ETRA conference proceedings, short paper)

Did you Understand this?: Leveraging Gaze Behavior to Assess Questionnaire Comprehension

Published: 25 May 2021

Abstract

We investigate how problems in understanding text – specifically a word or a sentence – while filling in questionnaires are reflected in gaze behaviour. To identify text comprehension problems that occur while filling in a questionnaire, and their correlation with gaze features, we collected data from 42 participants. In a follow-up study (N=30), we evoked comprehension problems, identified the gaze features they affect, and quantified users’ gaze behaviour. Our findings imply that comprehension problems are reflected in a set of gaze features, namely the number of fixations, the duration of fixations, and the number of regressions. Our findings not only demonstrate the potential of eye tracking for assessing reading comprehension but also pave the way for researchers and designers to build novel questionnaire tools that instantly mitigate problems in reading comprehension.
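The three gaze features the abstract names (number of fixations, fixation duration, and number of regressions) can be extracted from a stream of fixation events. The sketch below is an illustration only, not the authors' pipeline: the fixation tuple format, the line-height threshold, and the regression heuristic (a leftward jump on the same line, or a jump back to an earlier line) are all assumptions.

```python
def gaze_features(fixations, line_height=30):
    """Compute simple reading-comprehension gaze features.

    fixations: list of (x, y, duration_ms) tuples in temporal order,
               in screen coordinates (x grows rightward, y grows downward).
    line_height: assumed pixel threshold separating lines of text.
    """
    n_fixations = len(fixations)
    mean_duration = (
        sum(d for _, _, d in fixations) / n_fixations if n_fixations else 0.0
    )
    # A regression is a fixation that jumps backward relative to the
    # previous one: leftward within the same line, or up to an earlier line.
    regressions = 0
    for (x0, y0, _), (x1, y1, _) in zip(fixations, fixations[1:]):
        same_line = abs(y1 - y0) < line_height
        if (same_line and x1 < x0) or (not same_line and y1 < y0):
            regressions += 1
    return {
        "n_fixations": n_fixations,
        "mean_fixation_ms": mean_duration,
        "n_regressions": regressions,
    }


# Example: four fixations on one line; the third jumps back leftward,
# so it counts as one regression.
feats = gaze_features(
    [(100, 200, 180), (160, 200, 220), (120, 200, 350), (200, 200, 190)]
)
```

In a questionnaire tool along the lines the paper envisions, elevated values of these features over a question's text region could trigger an intervention such as rephrasing or a tooltip; thresholds would have to be calibrated per reader and per text.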


Cited By

  • (2024) Public Security User Interfaces. In Proceedings of the New Security Paradigms Workshop, 56–70. https://doi.org/10.1145/3703465.3703470. Online publication date: 16-Sep-2024.
  • (2023) Predicting Users’ Mental Effort in Drawing Tasks Using Gesture Recognition. In Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 205–207. https://doi.org/10.1145/3581754.3584104. Online publication date: 27-Mar-2023.


    Published In

ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
May 2021, 232 pages
ISBN: 9781450383455
DOI: 10.1145/3448018

    Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. Gaze Behaviour
    2. Questionnaire
    3. Reading Comprehension

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Funding Sources

    • Studienstiftung des deutschen Volkes

    Conference

    ETRA '21

    Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)


