DOI: 10.1145/2964284.2967187 · Short paper

What Makes a Good Movie Trailer?: Interpretation from Simultaneous EEG and Eyetracker Recording

Published: 01 October 2016

Abstract

What makes a good movie trailer? Answering this question is difficult because multimedia is complex at both the low level of sensory features and the high level of semantics. Human perception and reaction, however, offer direct evidence for evaluation. Modern electroencephalography (EEG) measures the brain's neural response to external stimuli, while eye tracking captures and helps interpret visual perception and attention. Intuitively, simultaneously recording EEG and eye movements from audiences exposed to multimedia stimuli can bridge the gap between human comprehension and multimedia analysis, and offers a new way to evaluate movie trailers. In this paper, we propose a novel platform that integrates a 256-channel EEG system, an eye tracker, and a video display device into a single system for recording EEG and eye movements simultaneously while participants watch video stimuli. On this platform we designed a novel experiment in which independent and joint features of the EEG and eye-tracking data were mined to evaluate movie trailers. Our analysis reveals features that correspond to trailer quality and video shot changes.
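The abstract describes mining EEG features that correspond to video shot changes, which implies aligning the EEG stream with detected shot boundaries in the video. Purely as an illustration, here is a minimal Python sketch of one way such an alignment might look: flagging shot boundaries from per-frame colour histograms and cutting EEG epochs time-locked to each boundary. The frame rate, sampling rate, threshold, and all function names are assumptions for illustration, not details from the paper.

```python
import numpy as np

# Illustrative assumptions only -- not the paper's actual parameters.
FPS = 25      # assumed video frame rate
EEG_SR = 250  # assumed EEG sampling rate (the paper used a 256-channel system)

def detect_shot_changes(histograms, threshold=0.5):
    """Flag frames whose colour histogram differs sharply from the previous frame."""
    diffs = np.abs(np.diff(histograms, axis=0)).sum(axis=1)
    return np.flatnonzero(diffs > threshold) + 1  # frame indices of cuts

def epochs_around(eeg, cut_frames, pre=0.2, post=0.8):
    """Extract EEG windows around each shot boundary (pre/post in seconds)."""
    out = []
    for f in cut_frames:
        center = int(f / FPS * EEG_SR)          # map frame index to EEG sample
        lo = center - int(pre * EEG_SR)
        hi = center + int(post * EEG_SR)
        if lo >= 0 and hi <= eeg.shape[1]:      # skip epochs that fall off the record
            out.append(eeg[:, lo:hi])
    return np.stack(out) if out else np.empty((0, eeg.shape[0], 0))

# Toy data: 100 frames with an abrupt histogram change at frame 50,
# and 4 s of 8-channel synthetic EEG noise.
rng = np.random.default_rng(0)
hists = np.vstack([np.tile([1.0, 0.0], (50, 1)), np.tile([0.0, 1.0], (50, 1))])
eeg = rng.standard_normal((8, 4 * EEG_SR))

cuts = detect_shot_changes(hists)
epochs = epochs_around(eeg, cuts)
print(cuts, epochs.shape)  # -> [50] (1, 8, 250)
```

Once epochs are stacked this way, event-related features (e.g. average post-cut amplitude per channel) can be compared across trailers, which is the kind of joint EEG/video analysis the abstract alludes to.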



    Published In

    MM '16: Proceedings of the 24th ACM international conference on Multimedia
    October 2016
    1542 pages
    ISBN:9781450336031
    DOI:10.1145/2964284

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. electroencephalography
    2. eye-tracker
    3. multimedia evaluation

    Qualifiers

    • Short-paper

    Funding Sources

    • Tianming Liu

    Conference

    MM '16: ACM Multimedia Conference
    October 15 - 19, 2016
    Amsterdam, The Netherlands

    Acceptance Rates

    MM '16 Paper Acceptance Rate 52 of 237 submissions, 22%;
    Overall Acceptance Rate 2,145 of 8,556 submissions, 25%


    Cited By

    View all
    • (2024) BI-AVAN: A Brain-Inspired Adversarial Visual Attention Network for Characterizing Human Visual Attention From Neural Activity. IEEE Transactions on Multimedia, 26, 11191-11203. DOI: 10.1109/TMM.2024.3443623.
    • (2023) EEG and peripheral markers of viewer ratings: a study of short films. Frontiers in Neuroscience, 17. DOI: 10.3389/fnins.2023.1148205.
    • (2023) Viewer Emotional Response to Webtoon-Based Drama: An EEG Analysis. International Journal of Human-Computer Interaction, 40(24), 8623-8637. DOI: 10.1080/10447318.2023.2285647.
    • (2022) EEGG: An Analytic Brain-Computer Interface Algorithm. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 30, 643-655. DOI: 10.1109/TNSRE.2022.3149654.
    • (2022) Understanding action concepts from videos and brain activity through subjects' consensus. Scientific Reports, 12(1). DOI: 10.1038/s41598-022-23067-2.
    • (2021) Joint Analysis of Eye Blinks and Brain Activity to Investigate Attentional Demand during a Visual Search Task. Brain Sciences, 11(5), 562. DOI: 10.3390/brainsci11050562.
    • (2020) Visual-Texual Emotion Analysis With Deep Coupled Video and Danmu Neural Networks. IEEE Transactions on Multimedia, 22(6), 1634-1646. DOI: 10.1109/TMM.2019.2946477.
    • (2020) Estimation of Interest Levels From Behavior Features via Tensor Completion Including Adaptive Similar User Selection. IEEE Access, 8, 126109-126118. DOI: 10.1109/ACCESS.2020.3007963.
    • (2019) Interest Level Estimation Based on Tensor Completion via Feature Integration for Partially Paired User's Behavior and Videos. IEEE Access, 7, 148576-148585. DOI: 10.1109/ACCESS.2019.2946912.
    • (2018) Favorite Video Classification Based on Multimodal Bidirectional LSTM. IEEE Access, 6, 61401-61409. DOI: 10.1109/ACCESS.2018.2876710.
