ABSTRACT
Virtual and augmented reality devices and applications enable users to experience a variety of simulated real-life situations through first-person visual, auditory, and haptic feedback. However, among the many everyday interactions that have been emulated, the familiar act of touching or rubbing one's eyes remains unexplored and poorly understood. In this paper, we aim to understand the components of natural hand-eye interaction, propose an interaction technique through a proof-of-concept prototype head-mounted display, and evaluate the user experience of the prototype through a user study. In addition, we share insights that emerged from the studies, along with suggestions for further development of interaction techniques based on combinations of hardware and software.