Abstract
In this paper, we propose a body-attached system that captures a person's experience as a sequence of audio/visual information. The proposed system consists of two cameras (an infrared (IR) camera and a wide-angle color camera) and a microphone. The IR camera image is used to estimate the user's head motion. The wide-angle color camera captures frontal-view images, from which an image region approximately corresponding to the user's view is selected according to the estimated head motion. The selected image and head-motion data are stored in a storage device together with the audio data. Compared with systems based on head-mounted cameras, the proposed system is easier to put on and take off and is visually less obtrusive to third parties. Using the proposed system, we can simultaneously record audio data, images in the user's view, and head gestures (nodding, shaking, etc.). These data carry significant information for recording and analyzing human activities and can be used in a wide range of application domains, such as digital diaries and interaction analysis. Experimental results show the effectiveness of the proposed system.
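As a rough illustration of the view-selection step described above, the following sketch shows how a crop window approximating the user's gaze might be extracted from a wide-angle frame, given head pan/tilt estimates from the IR camera. This is not the paper's implementation: the function name, the linear angle-to-pixel mapping, and all camera parameters are our own assumptions.

```python
import numpy as np

def select_view_region(frame: np.ndarray,
                       pan_deg: float,
                       tilt_deg: float,
                       hfov_deg: float = 120.0,
                       view_size: tuple = (320, 240)) -> np.ndarray:
    """Crop a sub-image approximating the user's view from a wide-angle frame.

    Hypothetical sketch: assumes a simple linear angle-to-pixel mapping,
    not the calibration actually used in the paper.
    """
    h, w = frame.shape[:2]
    px_per_deg = w / hfov_deg              # assumed linear mapping
    cx = w / 2 + pan_deg * px_per_deg      # shift crop center by head pan
    cy = h / 2 - tilt_deg * px_per_deg     # shift crop center by head tilt
    vw, vh = view_size
    # Clamp the crop window to the frame boundaries.
    x0 = int(np.clip(cx - vw / 2, 0, w - vw))
    y0 = int(np.clip(cy - vh / 2, 0, h - vh))
    return frame[y0:y0 + vh, x0:x0 + vw]

# Example: head turned 15 degrees right, 5 degrees up.
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
view = select_view_region(frame, pan_deg=15.0, tilt_deg=5.0)
print(view.shape)  # (240, 320, 3)
```

In practice the mapping from head pose to image coordinates would come from camera calibration rather than a fixed pixels-per-degree factor, and the head-pose estimate itself would be produced by tracking the IR camera image, as the abstract describes.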
Cite this paper
Yamazoe, H., Utsumi, A., Tetsutani, N., Yachida, M. (2004). A Novel Wearable System for Capturing User View Images. In: Sebe, N., Lew, M., Huang, T.S. (eds) Computer Vision in Human-Computer Interaction. CVHCI 2004. Lecture Notes in Computer Science, vol 3058. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24837-8_16