
Collaborative capturing, interpreting, and sharing of experiences

  • Original Article
  • Personal and Ubiquitous Computing

Abstract

This paper proposes the notion of an interaction corpus: a captured collection of human behaviors and of interactions among humans and artifacts. Digital multimedia and ubiquitous sensor technologies make it possible to capture and store interactions that are annotated automatically. A very large-scale accumulated corpus would provide an important infrastructure for a future digital society, enabling both humans and computers to understand the verbal and non-verbal mechanisms of human interaction. The interaction corpus can also serve as a well-structured store of experience that can be shared with other people for communication and for the creation of further experiences. Our approach employs wearable and ubiquitous sensors, such as video cameras, microphones, and tracking tags, to capture all events from multiple viewpoints simultaneously. We demonstrate an application that automatically generates a video-based experience summary reconfigured from the interaction corpus.
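As a minimal sketch of the summary-generation idea described above (not the authors' implementation; the `Clip` record, the label names, and the earliest-start selection rule are illustrative assumptions), generating a video summary can be viewed as selecting clips whose automatic annotations match the requested event types and arranging them into one non-overlapping timeline:

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Clip:
    start: float   # seconds from session start
    end: float
    camera: str    # id of the wearable or ubiquitous sensor
    label: str     # automatic annotation, e.g. "joint attention"

def summarize(corpus: List[Clip], wanted: Set[str]) -> List[Clip]:
    """Pick clips whose annotation is in `wanted` and lay them out as a
    chronological, non-overlapping summary timeline.  When clips overlap,
    the one that starts first (and, on ties, lasts longest) wins -- a
    deliberately simple selection rule."""
    candidates = sorted(
        (c for c in corpus if c.label in wanted),
        key=lambda c: (c.start, -(c.end - c.start)),
    )
    timeline: List[Clip] = []
    for clip in candidates:
        if timeline and clip.start < timeline[-1].end:
            continue  # overlaps the clip already chosen; skip it
        timeline.append(clip)
    return timeline
```

Fed clips from a room camera and several wearables, such a routine produces a single cut that switches viewpoint whenever a new annotated event begins after the previous clip ends.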


[Figures 1–8 appear in the full article.]


Notes

  1. Throughout this paper, we use the term “ubiquitous” to describe sensors set up around the room and “wearable” to specify sensors carried by the users.

  2. A set of tools that can do cut-and-paste editing and MPEG compression of audio and video under Linux. http://www.mjpeg.sourceforge.net

  3. We used ProComp+ as an A/D converter for transmitting sensed signals to the carried PC.


Acknowledgments

We thank our colleagues at ATR for their valuable discussion and help with the experiments described in this paper. Valuable contributions to the systems described here were made by Tetsushi Yamamoto and Atsushi Nakahara. We would also like to thank Yasuhiro Katagiri for his continuing support of our research. This research was supported in part by the National Institute of Information and Communications Technology.

Author information

Correspondence to Yasuyuki Sumi.


Cite this article

Sumi, Y., Ito, S., Matsuguchi, T. et al. Collaborative capturing, interpreting, and sharing of experiences. Pers Ubiquit Comput 11, 265–271 (2007). https://doi.org/10.1007/s00779-006-0088-1
