
Comparative Analysis of Three Different Modalities for Perception of Artifacts in Videos

Published: 14 September 2017

Abstract

This study compares three popular modalities for analyzing perceived video quality: user ratings, eye tracking, and electroencephalography (EEG). We contrast these three modalities for a given video sequence to determine whether there is a gap between what we consciously see and what we implicitly perceive. Participants are shown a video sequence with different artifacts appearing at specific eccentricities in their field of vision: near foveal, middle peripheral, and far peripheral. Our results show distinct differences between what we saccade to (eye tracking), how we consciously rate video quality, and our neural responses (EEG data). Our findings indicate that the measurement of perceived quality depends on the specific modality used.
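
To make the three-way comparison concrete: each artifact onset yields one measurement per modality, and the modalities are then compared on those per-event measures. The sketch below is a minimal illustration in Python/NumPy with synthetic data, not the authors' pipeline; the sampling rate, the 100 px gaze radius, and the 300-500 ms P300 window are illustrative assumptions.

```python
# Minimal sketch: one scalar per modality for the same set of artifact events.
# Synthetic data stands in for real recordings; all thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                      # EEG sampling rate in Hz (assumed)
n_events = 40                 # number of artifact onsets in the sequence

# --- 1) Explicit ratings: one quality score per event (1 = bad, 5 = excellent)
ratings = rng.integers(1, 6, size=n_events)
mean_rating = ratings.mean()

# --- 2) Eye tracking: did the first post-onset fixation land near the artifact?
artifact_px = rng.uniform(0, 1920, size=(n_events, 2))          # artifact positions
gaze_px = artifact_px + rng.normal(0, 120, size=(n_events, 2))  # post-onset gaze
saccade_hit = np.linalg.norm(gaze_px - artifact_px, axis=1) < 100  # 100 px radius
hit_rate = saccade_hit.mean()

# --- 3) EEG: mean amplitude in a post-stimulus window, artifact vs. clean epochs
t = np.arange(-0.2, 0.8, 1 / fs)                 # epoch from -200 ms to +800 ms

def synth_epochs(n, p3_gain):
    """Noisy epochs with a P300-like bump around 400 ms after onset."""
    bump = p3_gain * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return bump + rng.normal(0, 1.0, size=(n, t.size))

artifact_ep = synth_epochs(n_events, p3_gain=2.0)
clean_ep = synth_epochs(n_events, p3_gain=0.0)
win = (t >= 0.3) & (t <= 0.5)                    # P300 measurement window
erp_diff = artifact_ep[:, win].mean() - clean_ep[:, win].mean()

print(f"mean rating        : {mean_rating:.2f}")
print(f"saccade hit rate   : {hit_rate:.2%}")
print(f"ERP amplitude diff : {erp_diff:.2f} (arbitrary units)")
```

Running it prints one summary number per modality; disagreement between the three (e.g., a large ERP difference paired with a low saccade hit rate) is exactly the kind of gap between implicit and explicit perception that the study probes.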

Supplementary Material

tauscher (tauscher.zip)
Supplemental movie, appendix, image, and software files for "Comparative Analysis of Three Different Modalities for Perception of Artifacts in Videos."



Published In

ACM Transactions on Applied Perception, Volume 14, Issue 4
Special Issue SAP 2017
October 2017
63 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/3140462
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 September 2017
Accepted: 01 July 2017
Received: 01 July 2017
Published in TAP Volume 14, Issue 4


Author Tags

  1. EEG
  2. ERP
  3. eye tracking
  4. artifacts
  5. implicit perception
  6. perceptual quality
  7. user rating
  8. video

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Immersive Digital Reality and DFG INST
  • German Science Foundation


Article Metrics

  • Downloads (last 12 months): 9
  • Downloads (last 6 weeks): 2
Reflects downloads up to 03 Mar 2025

Cited By

  • (2023) A deep perceptual framework for affective video tagging through multiband EEG signals modeling. Neural Computing and Applications. DOI: 10.1007/s00521-023-09086-8. Online publication date: 17-Oct-2023.
  • (2023) Neural correlates of affective content: application to perceptual tagging of video. Neural Computing and Applications 35, 11 (7925-7941). DOI: 10.1007/s00521-021-06591-6. Online publication date: 1-Apr-2023.
  • (2022) Affective Video Tagging Framework using Human Attention Modelling through EEG Signals. International Journal of Intelligent Information Technologies 18, 1 (1-18). DOI: 10.4018/IJIIT.306968. Online publication date: 4-Aug-2022.
  • (2022) Automatic Generation of Customized Areas of Interest and Evaluation of Observers' Gaze in Portrait Videos. Proceedings of the ACM on Human-Computer Interaction 6, ETRA (1-14). DOI: 10.1145/3530885. Online publication date: 13-May-2022.
  • (2021) Evaluating Study Design and Strategies for Mitigating the Impact of Hand Tracking Loss. ACM Symposium on Applied Perception 2021 (1-12). DOI: 10.1145/3474451.3476235. Online publication date: 16-Sep-2021.
  • (2021) Towards Understanding Perceptual Differences between Genuine and Face-Swapped Videos. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (1-13). DOI: 10.1145/3411764.3445627. Online publication date: 6-May-2021.
  • (2021) Dynamic Graph Modeling of Simultaneous EEG and Eye-Tracking Data for Reading Task Identification. ICASSP 2021 - IEEE International Conference on Acoustics, Speech and Signal Processing (1250-1254). DOI: 10.1109/ICASSP39728.2021.9414343. Online publication date: 6-Jun-2021.
  • (2018) Analysis of neural correlates of saccadic eye movements. Proceedings of the 15th ACM Symposium on Applied Perception (1-9). DOI: 10.1145/3225153.3225164. Online publication date: 10-Aug-2018.
  • (2018) Proceedings of the 15th ACM Symposium on Applied Perception. Online publication date: 10-Aug-2018.
