Abstract
Multi-modal interaction in ubiquitous computing must be carefully examined to ensure its appropriate use within systems. The importance of this analysis is highlighted through an experimental study demonstrating that one modality can be implicit within another.
© 1999 Springer-Verlag Berlin Heidelberg
Haniff, D.J., Baber, C., Edmondson, W. (1999). Human Factors of Multi-modal Ubiquitous Computing. In: Gellersen, HW. (eds) Handheld and Ubiquitous Computing. HUC 1999. Lecture Notes in Computer Science, vol 1707. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48157-5_42
Print ISBN: 978-3-540-66550-2
Online ISBN: 978-3-540-48157-7