ABSTRACT
While wearable devices have been developed that incorporate computing, sensing, and display technology into a head-worn package, their input methods are often limited and poorly suited to the natural 3D interaction that Augmented Reality (AR) applications require. In this paper we report on a prototype interface that supports natural 3D free-hand gestures on wearable computers. In addition to using hand gestures for AR interaction, we also explore letting users combine low-resolution 3D hand gestures with high-resolution touch input. We show how this combination could be used in a wearable AR interface and present early pilot study results.
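The combination described above can be illustrated with a minimal sketch (all names and values are hypothetical, not from the paper's implementation): the low-resolution 3D hand position provides coarse placement of a virtual object, while high-resolution 2D touch deltas refine that placement at a finer scale.

```python
# Illustrative sketch: fusing coarse 3D hand tracking with fine 2D touch
# input to position a virtual AR object. The hand sensor supplies a
# low-resolution 3D position (metres); the touch surface supplies
# high-resolution pixel deltas that nudge the position in the x/y plane.

def fuse_input(hand_pos, touch_delta, touch_scale=0.001):
    """Coarse placement from the 3D hand position, fine adjustment from
    the touch delta (1 pixel of drag mapped to 1 mm of movement)."""
    x, y, z = hand_pos
    dx, dy = touch_delta
    return (x + dx * touch_scale, y + dy * touch_scale, z)

# Example: the hand places the object near (0.30, 0.10, 0.50) m, then a
# (40, -25) pixel touch drag adjusts it by a few millimetres.
pos = fuse_input((0.30, 0.10, 0.50), (40, -25))
```

The design point this sketch captures is the division of labour: the free-hand gesture handles the large, imprecise movement, and the touch input handles the precise final adjustment.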
Supplemental Material
Available for download: ZIP file containing a PDF of the accompanying poster.