Touching the void: gestures for auditory interfaces

ABSTRACT
Mobile devices offer new possibilities for gesture interaction thanks to their wide range of embedded sensors and their physical form factor. In addition, advanced mobile computing capabilities now make auditory interfaces easier to support. Although various gesture techniques have been proposed for handheld devices, little is known about the acceptability and use of many of these techniques, especially in the context of an auditory interface. In this paper, we take a novel approach to the problem by studying the design space of gestures proposed by end-users for a mobile auditory interface. We discuss the results of this exploratory study in terms of the scope of the proposed gestures, their tangible aspects, and users' preferences. The study delivers initial gesture recommendations for eyes-free auditory interfaces.