ABSTRACT
This study is based on a user scenario in which augmented reality targets can be found by scanning the environment with a mobile device and receiving tactile feedback exactly in the direction of the target. To understand how accurately and quickly such targets can be found, we prepared an experimental setup in which a sensor-actuator device consisting of orientation-tracking hardware and a tactile actuator was used. Targets with widths of 5°, 10°, 15°, 20°, and 25°, separated by various distances, were rendered successively in a 90°-wide space, and the task of the test participants was to find them as quickly as possible. The experiment consisted of two conditions: the first provided tactile feedback only when the device pointed at the target, while the second added a further cue indicating proximity to the target. The average target-finding time was 1.8 seconds. The closest targets proved not to be the easiest to find, which we attribute to the adapted scanning velocity causing participants to overshoot them. We also found that our data did not correlate well with Fitts' model, which may have been caused by the non-normal data distribution; after filtering out 30% of the least representative data items, the correlation reached 0.71. Overall, performance did not differ significantly between the conditions. The only significant improvement offered by the close-to-target cue occurred in the tasks where the targets were furthest from each other.
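As context for the Fitts' model correlation mentioned above, the analysis pattern can be sketched as follows. This is a minimal illustration, not the study's actual analysis code: it computes the Shannon formulation of the index of difficulty, ID = log2(A/W + 1), from target amplitude A and width W (here in degrees), fits the linear model MT = a + b·ID by least squares, and reports the Pearson correlation. All variable names and the example data are hypothetical.

```python
import math

def index_of_difficulty(amplitude_deg: float, width_deg: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(amplitude_deg / width_deg + 1)

def fit_fitts(ids, times):
    """Least-squares fit of MT = a + b*ID.

    Returns (a, b, r): intercept, slope, and Pearson correlation
    between the index of difficulty and movement time.
    """
    n = len(ids)
    mean_id = sum(ids) / n
    mean_mt = sum(times) / n
    sxy = sum((x - mean_id) * (y - mean_mt) for x, y in zip(ids, times))
    sxx = sum((x - mean_id) ** 2 for x in ids)
    syy = sum((y - mean_mt) ** 2 for y in times)
    b = sxy / sxx
    a = mean_mt - b * mean_id
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r

# Hypothetical example: four pointing tasks with the widths used in the
# study (degrees) and invented amplitudes and movement times.
widths = [5, 10, 15, 20, 25]
amplitudes = [40, 40, 40, 40, 40]
ids = [index_of_difficulty(a, w) for a, w in zip(amplitudes, widths)]
```

Filtering out the least representative trials before fitting, as described in the abstract, would simply mean dropping the data points whose residuals from this fit are largest and refitting.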