DOI: 10.1145/3313831.3376449

Quantification of Users' Visual Attention During Everyday Mobile Device Interactions

Published: 23 April 2020

Abstract

We present the first real-world dataset and quantitative evaluation of the visual attention of mobile device users in situ, i.e. while they use their devices during their everyday routine. Understanding user attention is a core research challenge in mobile HCI, but previous approaches relied on usage logs or self-reports, which are only proxies and consequently reflect attention neither completely nor accurately. Our evaluations are based on Everyday Mobile Visual Attention (EMVA) – a new 32-participant dataset containing around 472 hours of video snippets recorded over more than two weeks in real life using the front-facing camera, together with associated usage logs, interaction events, and sensor data. Using an eye contact detection method, we are the first to quantify the highly dynamic nature of everyday visual attention across users, mobile applications, and usage contexts. We discuss key insights from our analyses that highlight the potential of, and inform the design of, future mobile attentive user interfaces.
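
For illustration only, a minimal sketch of how per-frame eye-contact labels (as produced by an eye contact detector run on front-facing camera video) could be aggregated into simple attention metrics such as attention-span durations and the number of attention shifts away from the screen. The function name, label format, and frame rate below are assumptions made for this sketch, not the authors' published pipeline.

```python
from itertools import groupby

def attention_metrics(eye_contact, fps=5.0):
    """Summarise a per-frame binary eye-contact sequence (illustrative sketch).

    eye_contact: sequence of 0/1 labels, one per front-camera frame
                 (1 = the user looks at the screen); assumed input format.
    fps:         camera frame rate in Hz; assumed value, not taken from the paper.

    Returns (span_durations_s, n_shifts): durations of contiguous on-screen
    attention spans in seconds, and the number of shifts away from the screen.
    """
    # Group consecutive identical labels; each run of 1s is one attention span.
    spans = [sum(1 for _ in run) / fps
             for label, run in groupby(eye_contact) if label == 1]
    # Count transitions from looking (1) to not looking (0).
    n_shifts = sum(1 for prev, cur in zip(eye_contact, eye_contact[1:])
                   if prev == 1 and cur == 0)
    return spans, n_shifts

# Example: 1.0 s of attention, a 0.4 s glance away, then 0.6 s of attention (at 5 fps)
spans, shifts = attention_metrics([1] * 5 + [0] * 2 + [1] * 3, fps=5.0)
print(spans, shifts)  # [1.0, 0.6] 1
```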

Supplementary Material

  • pn5015vfc.zip (ZIP): Video figure captions
  • paper322pvc.sbv (SBV): Preview video captions
  • paper322pv.mp4 (MP4): Preview video
  • pn5015vf.mp4 (MP4): Supplemental video
  • a322-bace-presentation.mp4 (MP4): Presentation video


Information

Published In

CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
April 2020
10688 pages
ISBN:9781450367080
DOI:10.1145/3313831
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 April 2020


Author Tags

  1. attentive user interfaces
  2. eye contact detection
  3. in-the-wild study
  4. mobile devices
  5. visual attention

Qualifiers

  • Research-article

Conference

CHI '20

Acceptance Rates

Overall acceptance rate: 6,199 of 26,314 submissions (24%)


Article Metrics

  • Downloads (last 12 months): 139
  • Downloads (last 6 weeks): 40
Reflects downloads up to 13 Jan 2025


Cited By

  • (2025) Mobile use in an age of interruption: Implications of capacity and structural interference for mobile users. Information & Management 62(1), 104069. https://doi.org/10.1016/j.im.2024.104069. Online publication date: Jan 2025.
  • (2024) Person-Specific Analyses of Smartphone Use and Mental Health: An Intensive Longitudinal Study Over One Year (Preprint). JMIR Formative Research. https://doi.org/10.2196/59875. Online publication date: 24 Apr 2024.
  • (2024) PrivatEyes: Appearance-based Gaze Estimation Using Federated Secure Multi-Party Computation. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-23. https://doi.org/10.1145/3655606. Online publication date: 28 May 2024.
  • (2024) SalChartQA: Question-driven Saliency on Information Visualisations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-14. https://doi.org/10.1145/3613904.3642942. Online publication date: 11 May 2024.
  • (2024) MoodCapture: Depression Detection using In-the-Wild Smartphone Images. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. https://doi.org/10.1145/3613904.3642680. Online publication date: 11 May 2024.
  • (2024) Deep Learning based Eye Tracking on Smartphones for Dynamic Visual Stimuli. Procedia Computer Science 246, 3733-3742. https://doi.org/10.1016/j.procs.2024.09.183. Online publication date: 2024.
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys. https://doi.org/10.1145/3606947. Online publication date: 30 Jun 2023.
  • (2023) Understanding How Deaf and Hard of Hearing Viewers Visually Explore Captioned Live TV News. Proceedings of the 20th International Web for All Conference, 54-65. https://doi.org/10.1145/3587281.3587287. Online publication date: 30 Apr 2023.
  • (2022) Bifurcating Cognitive Attention from Visual Concentration: Utilizing Cooperative Audiovisual Sensing for Demarcating Inattentive Online Meeting Participants. Proceedings of the ACM on Human-Computer Interaction 6(CSCW2), 1-34. https://doi.org/10.1145/3555656. Online publication date: 11 Nov 2022.
  • (2022) Eye-tracking Technologies in Mobile Devices Using Edge Computing: A Systematic Review. ACM Computing Surveys 55(8), 1-33. https://doi.org/10.1145/3546938. Online publication date: 23 Dec 2022.
