DOI: 10.1145/3123266.3123434
Research article

Is Foveated Rendering Perceivable in Virtual Reality?: Exploring the Efficiency and Consistency of Quality Assessment Methods

Published: 19 October 2017

Abstract

Foveated rendering leverages the human visual system to increase video quality under limited computing resources for Virtual Reality (VR). More specifically, it increases the frame rate and the quality of the foveal vision by lowering the resolution of the peripheral vision. Optimizing foveated rendering systems is, however, not an easy task, because numerous parameters must be carefully chosen, such as the number of layers, the eccentricity degrees, and the resolution of the peripheral region. Furthermore, there is no standard and efficient way to evaluate the Quality of Experience (QoE) of foveated rendering systems. In this paper, we propose a framework to compare the performance of different subjective assessment methods on foveated rendering systems. We consider two performance metrics, efficiency and consistency, based on the perceptual ratio, which is the probability that the foveated rendering is perceivable by users. A regression model is proposed to capture the relationship between human-perceived quality and foveated rendering parameters. Our comprehensive study and analysis reveal several insights: 1) no subjective assessment method is absolutely superior; 2) subjects need more observations to confirm that foveated rendering is imperceptible than to confirm that it is perceptible; 3) subjects barely notice foveated rendering with an eccentricity of 7.5 degrees or more and a peripheral region resolution of 540p or higher; and 4) QoE levels are highly dependent on individuals and scenes. Our findings are crucial for optimizing foveated rendering systems for future VR applications.
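The perceptual ratio described above can be illustrated with a minimal sketch: for each foveated rendering condition, it is the fraction of subjective trials in which users reported perceiving the foveation. The trial data, field names, and condition values below are hypothetical, chosen only to mirror the parameters the abstract mentions (eccentricity and peripheral resolution); the paper's actual experimental protocol and data differ.

```python
from collections import defaultdict

# Hypothetical trial records: (eccentricity_deg, peripheral_resolution, perceived).
# "perceived" is True when the subject reported noticing the foveated rendering.
trials = [
    (5.0, 540, True), (5.0, 540, True), (5.0, 540, False),
    (7.5, 540, False), (7.5, 540, False), (7.5, 540, True),
    (7.5, 720, False), (7.5, 720, False), (7.5, 720, False),
]

def perceptual_ratio(trials):
    """Return, per (eccentricity, resolution) condition, the fraction of
    trials in which the foveated rendering was perceivable."""
    counts = defaultdict(lambda: [0, 0])  # condition -> [perceived, total]
    for ecc, res, perceived in trials:
        counts[(ecc, res)][0] += int(perceived)
        counts[(ecc, res)][1] += 1
    return {cond: p / n for cond, (p, n) in counts.items()}

ratios = perceptual_ratio(trials)
# A ratio near 0 for a condition (e.g. 7.5 degrees + 720p periphery here)
# would indicate the foveation is effectively imperceptible to users.
```

A regression model such as the one the paper proposes would then fit perceived quality as a function of these condition parameters; the ratio computation above is only the measurement step feeding such a model.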




Published In

MM '17: Proceedings of the 25th ACM international conference on Multimedia
October 2017
2028 pages
ISBN:9781450349062
DOI:10.1145/3123266

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. foveated rendering
  2. human perception
  3. quality of experience
  4. virtual reality

Qualifiers

  • Research-article

Funding Sources

  • Ministry of Science and Technology of Taiwan

Conference

MM '17: ACM Multimedia Conference
October 23-27, 2017
Mountain View, California, USA

Acceptance Rates

MM '17 Paper Acceptance Rate 189 of 684 submissions, 28%;
Overall Acceptance Rate 2,145 of 8,556 submissions, 25%


Cited By

  • (2024) Theia: Gaze-driven and Perception-aware Volumetric Content Delivery for Mixed Reality Headsets. Proceedings of the 22nd Annual International Conference on Mobile Systems, Applications and Services, pages 70-84. DOI: 10.1145/3643832.3661858
  • (2024) Individualized foveated rendering with eye-tracking head-mounted display. Virtual Reality, 28(1). DOI: 10.1007/s10055-023-00931-8
  • (2023) Learning GAN-Based Foveated Reconstruction to Recover Perceptually Important Image Features. ACM Transactions on Applied Perception, 20(2):1-23. DOI: 10.1145/3583072
  • (2023) Is Foveated Rendering Perception Affected by Users' Motion? 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pages 1104-1112. DOI: 10.1109/ISMAR59233.2023.00127
  • (2023) VRS-NeRF: Accelerating Neural Radiance Field Rendering with Variable Rate Shading. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pages 243-252. DOI: 10.1109/ISMAR59233.2023.00039
  • (2023) Foveated rendering: A state-of-the-art survey. Computational Visual Media, 9(2):195-228. DOI: 10.1007/s41095-022-0306-4
  • (2022) Modeling the User Experience of Watching 360° Videos with Head-Mounted Displays. ACM Transactions on Multimedia Computing, Communications, and Applications, 18(1):1-23. DOI: 10.1145/3463825
  • (2022) An Effective Foveated 360° Image Assessment Based on Graph Convolution Network. IEEE Access, 10:98165-98178. DOI: 10.1109/ACCESS.2022.3204766
  • (2022) Empirical comparison of spatial experience between photo-based IVE and real space. Architectural Science Review, 66(1):1-16. DOI: 10.1080/00038628.2022.2134089
  • (2022) A survey of challenges and methods for Quality of Experience assessment of interactive VR applications. Journal on Multimodal User Interfaces, 16(3):257-291. DOI: 10.1007/s12193-022-00388-0
