Abstract
Unlike most desktop and laptop computers, mobile interfaces are designed to let users access information easily in a variety of situations, such as while standing, walking, or moving. However, most mobile devices, such as cell phones, have a small keypad and a small display because they must remain compact and lightweight enough to carry in a pocket. As a result, they place a considerable burden on users, who must watch a small display and type on a small keyboard. Moreover, such devices are not designed to provide implicit or awareness information. In this paper, we describe the features of a body-worn projector, which can project information into the user's peripheral vision, and a body-worn camera, which can recognize the user's posture and estimate the user's behavior, and we argue that together they form a suitable interface for providing awareness, implicit, and even explicit information. Finally, we propose two mobile interfaces: a palm-top display for glanceable information and floor projection from a lumbar-mounted projector.
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Sakata, N., Konishi, T., Nishida, S. (2009). Mobile Interfaces Using Body Worn Projector and Camera. In: Shumaker, R. (eds) Virtual and Mixed Reality. VMR 2009. Lecture Notes in Computer Science, vol 5622. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02771-0_12
DOI: https://doi.org/10.1007/978-3-642-02771-0_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-02770-3
Online ISBN: 978-3-642-02771-0
eBook Packages: Computer Science (R0)