
Virtual Big Heads in Extended Reality: Estimation of Ideal Head Scales and Perceptual Thresholds for Comfort and Facial Cues

Published: 11 January 2023

Abstract

Extended reality (XR) technologies, such as virtual reality (VR) and augmented reality (AR), provide users, their avatars, and embodied agents with a shared platform for collaborating in a spatial context. Traditional face-to-face communication is limited by proximity: another person’s non-verbal embodied cues become more difficult to perceive the farther away one is. Researchers and practitioners have therefore started to explore ways to accentuate or amplify such embodied cues and signals with XR technologies to counteract the effects of distance. In this article, we describe and evaluate the Big Head technique, in which a human’s head in VR/AR is scaled up relative to their distance from the observer as a mechanism for enhancing the visibility of non-verbal facial cues, such as facial expressions or eye gaze. To better understand and explore this technique, we present two complementary human-subject experiments. In the first, a VR study with a head-mounted display, we examined the impact of increased or decreased head scales on participants’ ability to perceive facial expressions, as well as on their sense of comfort and feeling of “uncanniness,” over distances of up to 10 m. We explored two different scaling methods and compared perceptual thresholds and user preferences. The second experiment was performed in an outdoor AR environment with an optical see-through head-mounted display. Participants were asked to estimate facial expressions and eye gaze, and to identify a virtual human, over large distances of 30, 60, and 90 m. In both experiments, our results show significant differences in the minimum, maximum, and ideal head scales for different distances and for tasks related to perceiving faces, facial expressions, and eye gaze; we also found that participants were more comfortable with slightly bigger heads at larger distances. We discuss our findings with respect to the technologies used, along with implications and guidelines for practical applications that aim to leverage XR-enhanced facial cues.
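The two scaling methods compared in the experiments are not spelled out in this abstract, so the following Python sketch is a minimal illustration only, assuming one method applies a fixed magnification and the other scales the head linearly with distance so that it approximately preserves the visual angle it would subtend at a reference distance. All function names and constants (the 0.25 m head height, the 1 m reference distance, and the 10x clamp) are hypothetical, not the authors’ implementation.

import math

# Illustrative sketch of two head-scaling policies for the Big Head technique.
# Names and constants are assumptions for illustration only.

TRUE_HEAD_SIZE_M = 0.25  # assumed vertical extent of a human head

def fixed_scale(magnification: float = 2.0) -> float:
    # Method A (assumed): render the head at a constant multiple of its
    # true size, independent of viewing distance.
    return magnification

def constant_visual_angle_scale(distance_m: float,
                                reference_distance_m: float = 1.0,
                                max_scale: float = 10.0) -> float:
    # Method B (assumed): grow the scale linearly with distance so the head
    # subtends roughly the same visual angle as at the reference distance,
    # clamped to an assumed comfort threshold.
    scale = max(1.0, distance_m / reference_distance_m)
    return min(scale, max_scale)

def visual_angle_deg(scale: float, distance_m: float) -> float:
    # Visual angle (in degrees) subtended by a head rendered at `scale`,
    # viewed from distance_m.
    return 2.0 * math.degrees(math.atan2(scale * TRUE_HEAD_SIZE_M / 2.0, distance_m))

Under these assumptions, an unscaled head at 10 m subtends about 1.4 degrees (visual_angle_deg(1.0, 10.0)), whereas the distance-proportional method renders it at 10x scale, restoring the roughly 14 degrees it would subtend at 1 m; this is the sense in which scaling keeps facial cues legible at a distance.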


Cited By

  • (2023) Gestures vs. Emojis: Comparing Non-Verbal Reaction Visualizations for Immersive Collaboration. IEEE Transactions on Visualization and Computer Graphics 29, 11, 4772–4781. DOI: 10.1109/TVCG.2023.3320254. Online publication date: 2 October 2023.
  • (2023) Visual Facial Enhancements Can Significantly Improve Speech Perception in the Presence of Noise. IEEE Transactions on Visualization and Computer Graphics 29, 11, 4751–4760. DOI: 10.1109/TVCG.2023.3320247. Online publication date: 2 October 2023.

Published In

ACM Transactions on Applied Perception, Volume 20, Issue 1
January 2023
122 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/3584022

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 11 January 2023
Online AM: 10 November 2022
Accepted: 02 November 2022
Revised: 12 September 2022
Received: 05 May 2022
Published in TAP Volume 20, Issue 1


Author Tags

  1. Virtual environments
  2. social virtual reality
  3. outdoor augmented reality
  4. non-verbal communication

Qualifiers

  • Research-article
  • Refereed

Funding Sources

  • National Science Foundation Collaborative Award (University of Central Florida, University of Florida, and Stanford University, respectively)
  • Office of Naval Research Award
  • AdventHealth Endowed Chair in Healthcare Simulation
