Short Paper
DOI: 10.1145/3379156.3391351

Optimizing user interfaces in food production: gaze tracking is more sensitive for A-B-testing than behavioral data alone

Published: 02 June 2020

Abstract

Eye-tracking data often provide access to information about users' strategies and preferences that extends beyond purely behavioral data. Thanks to modern eye-tracking technology, gaze can be tracked rather unobtrusively in real-world settings. Here we examine the usefulness of gaze tracking with a mobile eye tracker for interface design in an industrial setting, specifically the operation of a food production line. We use a mock task that resembles the actual production task of routine machine operation in its interface usage. We compare two interface designs to each other as well as two levels of user expertise. We find no effects of experience or interface type in the behavioral data; in particular, both user groups needed the same time to complete the task on average. However, the gaze data reveal different strategies: users with high experience in using the interface spend significantly less time looking at the screen, that is, actually interacting with the interface, both in absolute terms and as a fraction of the total time needed to complete the task. This exemplifies how gaze tracking can be utilized to uncover user-dependent strategies that would not be accessible through behavioral data alone.
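
To make the headline measure concrete, here is a minimal sketch of how a screen-dwell fraction could be computed from mobile eye-tracker samples and compared between expertise groups. Everything in it is an assumption for illustration: the function name, the boolean screen-AOI labeling, the per-participant numbers, and the choice of an independent-samples t-test are not taken from the paper.

```python
import numpy as np
from scipy import stats

def screen_dwell_fraction(timestamps, on_screen):
    """Fraction of total task time with gaze inside the screen AOI.

    timestamps : gaze-sample times in seconds (monotonically increasing).
    on_screen  : boolean per sample, True if gaze falls in the screen AOI.
    """
    ts = np.asarray(timestamps, dtype=float)
    hit = np.asarray(on_screen, dtype=bool)
    dt = np.diff(ts)                      # duration covered by each sample
    return float(dt[hit[:-1]].sum() / (ts[-1] - ts[0]))

# Hypothetical per-participant dwell fractions for the two groups
# (the paper reports no raw values here).
novice = [0.61, 0.58, 0.66, 0.63]
expert = [0.42, 0.47, 0.39, 0.45]

# One common choice for an A/B-style group comparison:
t, p = stats.ttest_ind(novice, expert)
print(f"experts vs. novices: t = {t:.2f}, p = {p:.4f}")
```

The same fraction can be computed per interface variant instead of per expertise group, which is the A/B comparison the title refers to; only the grouping of participants changes.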




    Published In

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020, 305 pages
ISBN: 9781450371346
DOI: 10.1145/3379156

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. attention
    2. eye tracking
    3. human-machine interaction
    4. interface design
    5. usability

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    ETRA '20

    Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)

    Cited By

• (2024) Use of Lean Management Methods based on Eye-Tracking Information to make User Interfaces in Production more Human-centered. Procedia CIRP 128, 514–519. DOI: 10.1016/j.procir.2024.04.014
• (2024) A/B testing. Journal of Systems and Software 211:C. DOI: 10.1016/j.jss.2024.112011
• (2023) Understanding iPusnas User Experience Among Students, Workers, and Housewives. In Leveraging Generative Intelligence in Digital Libraries: Towards Human-Machine Collaboration, 12–29. DOI: 10.1007/978-981-99-8088-8_2
• (2022) Opportunities for using eye tracking technology in manufacturing and logistics. Computers and Industrial Engineering 171:C. DOI: 10.1016/j.cie.2022.108444
