User-Aware Rendering: Merging the Strengths of Device- and User-Perspective Rendering in Handheld AR

Abstract
In handheld AR, users see the augmented scene only through a small screen, so decisions about scene layout and rendering technique are crucial. Traditional device-perspective rendering (DPR) uses the device camera's full field of view, which enables fast scene exploration but ignores what the user sees around the device screen. In contrast, user-perspective rendering (UPR) emulates the feeling of looking through the device like a glass pane, which enhances depth perception but severely limits the field of view in which virtual objects are displayed, impeding scene exploration and search.
We introduce the notion of User-Aware Rendering, which follows the principles of UPR but pretends the device is larger than it actually is, combining the strengths of UPR and DPR. We present two studies showing that User-Aware Rendering imitating a 50% larger device achieves both enhanced depth perception and fast scene exploration in typical search and selection tasks.
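One common way to realize user-perspective rendering is a generalized off-axis projection computed from the tracked eye position and the screen corners in world space (in the style of Kooima's generalized perspective projection). A minimal sketch of how "pretending the device is larger" could then be expressed is to scale the screen corners about their center before building the frustum. All names and the specific corner convention below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def user_aware_projection(eye, pa, pb, pc, near, far, enlarge=1.5):
    """Off-axis projection for a physical screen given its lower-left (pa),
    lower-right (pb), and upper-left (pc) corners in world space, with the
    screen scaled about its center by `enlarge` to imitate a larger device.
    Returns a 4x4 OpenGL-style projection*view matrix."""
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))
    # Scale the virtual screen about its center (enlarge=1.0 is plain UPR).
    center = pa + 0.5 * (pb - pa) + 0.5 * (pc - pa)
    pa, pb, pc = (center + enlarge * (p - center) for p in (pa, pb, pc))
    # Orthonormal screen basis: right, up, and normal pointing toward the eye.
    vr = (pb - pa) / np.linalg.norm(pb - pa)
    vu = (pc - pa) / np.linalg.norm(pc - pa)
    vn = np.cross(vr, vu)
    # Vectors from the eye to the screen corners, and eye-to-plane distance.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)
    # Frustum extents at the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    # Standard off-axis frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0]])
    # Rotate world into the screen's basis, then translate the eye to origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T
```

With `enlarge=1.5` (the 50% larger device evaluated in the paper's studies), points that would project just outside the physical screen under plain UPR remain visible, widening the field of view while preserving the motion-parallax and depth cues of perspective-correct rendering.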