DOI: 10.1145/3458709.3458960

Exploring Pseudo Hand-Eye Interaction on the Head-Mounted Display

Published: 11 July 2021

ABSTRACT

Virtual and augmented reality devices and applications enable users to experience a wide variety of simulated real-life situations through first-person visual, auditory, and haptic feedback. However, among the many everyday interactions that have been emulated, the familiar act of touching or rubbing one's eyes has yet to be explored and remains poorly understood. In this paper, we aim to understand the components of natural hand-eye interaction, propose an interaction technique realized in a proof-of-concept prototype head-mounted display, and evaluate the user experience of the prototype through a user study. In addition, we share insights that emerged from the studies, along with suggestions for further development of interaction techniques based on combinations of hardware and software.
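Although the prototype's implementation is not described in this excerpt, a minimal sketch can illustrate the core idea the abstract gestures at: mapping a rub sensed near the HMD's eye region to a transient visual effect, such as the brief blur perceived after rubbing one's eyes. Everything below is an assumption for illustration only; the `PseudoEyeRub` class, the rub-intensity input, and the tuning constants are hypothetical stand-ins, not the authors' design.

```python
import time

# Hypothetical tuning constants; the paper's actual parameters are not
# given in this excerpt.
BLUR_GAIN = 2.0    # blur radius (px) added per unit of rub intensity
BLUR_DECAY = 4.0   # blur radius (px) removed per second after rubbing stops
MAX_BLUR = 12.0    # clamp so the view never degrades completely


class PseudoEyeRub:
    """Maps rub events sensed near the HMD's eye region to a transient
    blur radius, emulating the blurred vision felt after rubbing the eyes."""

    def __init__(self) -> None:
        self.blur_radius = 0.0
        self._last_update = time.monotonic()

    def on_rub(self, intensity: float) -> None:
        # intensity in [0, 1], e.g. from a capacitive pad on the HMD shell.
        self.blur_radius = min(MAX_BLUR, self.blur_radius + BLUR_GAIN * intensity)

    def update(self) -> float:
        # Decay the blur over time so the view gradually sharpens again.
        now = time.monotonic()
        dt = now - self._last_update
        self._last_update = now
        self.blur_radius = max(0.0, self.blur_radius - BLUR_DECAY * dt)
        return self.blur_radius


if __name__ == "__main__":
    rub = PseudoEyeRub()
    rub.on_rub(0.8)        # a firm rub near the eye
    print(rub.update())    # blur radius to feed into the renderer this frame
```

In a real system, the returned radius would drive a blur pass in the HMD's render pipeline each frame; decaying the effect gradually, rather than toggling it, is what would make the feedback read as a bodily after-effect rather than a UI state change.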


Published in

AHs '21: Proceedings of the Augmented Humans International Conference 2021
February 2021, 321 pages

          Copyright © 2021 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Qualifiers

          • research-article
          • Research
          • Refereed limited
