
Automatic Generation of Customized Areas of Interest and Evaluation of Observers' Gaze in Portrait Videos

Published: 13 May 2022

Abstract

We present a novel framework for the evaluation of eye-tracking data in portrait videos, including the automatic generation of customized areas of interest (AOIs) based on facial landmarks. In contrast to previous work, our framework allows the user to flexibly create AOIs by grouping the detected landmarks. Moreover, the shape and size of the AOIs can be modified to better fit both the research question and the precision of the eye tracker. The framework serves as an integrated solution that not only generates AOIs but also evaluates viewing behavior, such as overall fixation times, the similarity of scanpaths, and the number of saccades between AOIs. Other functionalities include the visualization of gaze paths and the creation of heatmaps. We demonstrate the benefits of our framework and of user-defined AOI layouts via an exemplary application: the investigation of face-swapping artifacts.
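The article itself includes no code, but the pipeline the abstract describes can be illustrated with a short sketch: group facial landmarks into user-defined AOI polygons, dilate the polygons to match the eye tracker's precision, map gaze samples to AOIs, count transitions between AOIs as a proxy for saccades, and compare scanpaths via string-edit distance. The following Python sketch assumes a 68-point landmark scheme (as produced by, e.g., dlib or OpenFace); the index groups, names, and parameters are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming a 68-point facial landmark scheme.
# Groupings and parameters are illustrative, not the authors' code.
from collections import Counter
from shapely.geometry import MultiPoint, Point

# Hypothetical landmark groupings defining custom AOIs.
AOI_GROUPS = {
    "left_eye":  range(36, 42),
    "right_eye": range(42, 48),
    "nose":      range(27, 36),
    "mouth":     range(48, 68),
}

def build_aois(landmarks, margin_px=15.0):
    """Build one dilated convex-hull polygon per AOI group.

    landmarks: sequence of (x, y) pixel coordinates for one video frame.
    margin_px: dilation radius compensating for eye-tracker imprecision.
    """
    return {
        name: MultiPoint([landmarks[i] for i in idx]).convex_hull.buffer(margin_px)
        for name, idx in AOI_GROUPS.items()
    }

def gaze_to_aoi(gaze_xy, aois):
    """Return the name of the AOI containing a gaze sample, or None."""
    p = Point(gaze_xy)
    for name, poly in aois.items():
        if poly.contains(p):
            return name
    return None

def count_aoi_transitions(labels):
    """Count label changes between consecutive gaze samples as a
    simple proxy for saccades between AOIs."""
    return Counter(
        (a, b) for a, b in zip(labels, labels[1:])
        if a is not None and b is not None and a != b
    )

def scanpath_edit_distance(a, b):
    """Levenshtein distance between two AOI label sequences, one
    common way to quantify scanpath similarity."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]
```

In a video setting, the AOIs would be rebuilt from the tracked landmarks on every frame so that they follow the face; the per-sample AOI labels then feed directly into fixation-time, transition, and scanpath-similarity statistics.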

Supplementary Material

MP4 File (v6etra144.mp4): Supplemental video
MP4 File (S7_Automatic Generation of Customized Areas of Interest.mp4): Conference presentation (ETRA Long Papers) of "Automatic Generation of Customized Areas of Interest and Evaluation of Observers' Gaze in Portrait Videos" by Leslie Woehler, Moritz von Estorff, Susana Castillo, and Marcus Magnor. https://doi.org/10.1145/3530885


Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 6, Issue ETRA
May 2022, 198 pages
EISSN: 2573-0142
DOI: 10.1145/3537904
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 13 May 2022
Published in PACMHCI Volume 6, Issue ETRA


Author Tags

  1. areas of interest
  2. eye tracking
  3. face tracking
  4. gaze analytics

Qualifiers

  • Research-article

Funding Sources

  • L3S Research Center, Hannover
