DOI: 10.1145/3305367.3327988

EyeHacker: gaze-based automatic reality manipulation

Published: 28 July 2019

Abstract

In this study, we introduce EyeHacker, an immersive virtual reality (VR) system that spatiotemporally mixes live and recorded/edited scenes based on measurements of the users' gaze. The system updates a transition risk in real time using the users' gaze information (i.e., the locus of attention) and the optical flow of the scenes. Scene transitions are allowed when the risk falls below a threshold, which is modulated by the users' head movement (the faster the head movement, the higher the threshold). Using this algorithm and an experience scenario prepared in advance, visual reality can be manipulated without users noticing (i.e., eye hacking). For example, consider a situation in which the objects around the users perpetually disappear and reappear. Users often have the strange feeling that something is wrong and sometimes even discover what happened, but only later; they cannot visually perceive the changes in real time. Furthermore, with other variants of the risk algorithm, the system can implement a variety of experience scenarios, inducing reality confusion.
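To make the gating algorithm concrete, the following is a minimal sketch (in Python with NumPy) of how a gaze- and flow-based transition gate could work. This is not the authors' published implementation: the Gaussian attention window, the specific risk formula (optical-flow magnitude weighted by proximity to the gaze point), and all names and parameter values (transition_risk, allow_transition, sigma, base_threshold, gain) are illustrative assumptions consistent with the description above.

import numpy as np

def transition_risk(gaze_xy, flow_field, sigma=0.15):
    """Estimate how noticeable a scene swap would be right now.

    gaze_xy    -- (x, y) gaze point in normalized [0, 1] screen coordinates
    flow_field -- H x W x 2 array of optical-flow vectors for the current frame
    sigma      -- width of the (assumed) Gaussian attention window around the gaze

    Risk is modeled here as optical-flow magnitude weighted by proximity to
    the locus of attention: strong motion where the user is looking makes a
    transition likely to be detected.
    """
    h, w, _ = flow_field.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs = xs / max(w - 1, 1)            # normalize pixel coordinates to [0, 1]
    ys = ys / max(h - 1, 1)
    d2 = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2
    attention = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian weight around gaze
    flow_mag = np.linalg.norm(flow_field, axis=2)
    return float((attention * flow_mag).sum() / attention.sum())

def allow_transition(risk, head_speed, base_threshold=0.5, gain=0.2):
    """Gate a scene swap: faster head movement raises the threshold
    (self-motion masks visual change), so swaps become easier to hide."""
    return risk < base_threshold + gain * head_speed

# Usage: uniform rightward flow, gaze at screen center, moderate head motion.
flow = np.full((120, 160, 2), 0.3)
risk = transition_risk((0.5, 0.5), flow)
print(allow_transition(risk, head_speed=1.0))

The key design point, per the abstract, is that head speed raises the threshold rather than lowering the risk: fast self-motion masks visual change, so the gate opens more readily while the head is moving.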

Supplementary Material

MP4 File (a12-ito.mp4)

    Information

    Published In

    SIGGRAPH '19: ACM SIGGRAPH 2019 Emerging Technologies
    July 2019
    54 pages
    ISBN: 9781450363082
    DOI: 10.1145/3305367
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. change blindness
    2. gaze measurement
    3. inattentional blindness
    4. substitutional reality

    Qualifiers

    • Abstract

    Conference

    SIGGRAPH '19

    Acceptance Rates

    Overall Acceptance Rate 1,822 of 8,601 submissions, 21%

    Bibliometrics & Citations

    Article Metrics

    • Downloads (last 12 months): 138
    • Downloads (last 6 weeks): 16
    Reflects downloads up to 23 Feb 2025

    Cited By

    • (2023) Using Extended Reality to Study the Experience of Presence. In Virtual Reality in Behavioral Neuroscience: New Insights and Methods, 255-285. DOI: 10.1007/7854_2022_401. Online publication date: 3 Jan 2023.
    • (2022) Parallel Ping-Pong: Exploring Parallel Embodiment through Multiple Bodies by a Single User. In Proceedings of the Augmented Humans International Conference 2022, 121-130. DOI: 10.1145/3519391.3519408. Online publication date: 13 Mar 2022.
    • (2021) To See or Not to See. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(1), 1-25. DOI: 10.1145/3448123. Online publication date: 30 Mar 2021.
    • (2020) SlideFusion. In Special Interest Group on Computer Graphics and Interactive Techniques Conference Emerging Technologies, 1-2. DOI: 10.1145/3388534.3407299. Online publication date: 12 Aug 2020.
