MMRPet: Modular Mixed Reality Pet System Based on Passive Props

  • Conference paper
Image and Graphics Technologies and Applications (IGTA 2019)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1043)

Abstract

We present MMRPet, a modular mixed reality pet system based on passive props. MMRPet superimposes virtual pets onto physical pet entities, combining the physical interaction afforded by the entities with the personalized appearance and rich expressive capabilities of virtual pets; its key idea, however, is the modular design of the pet entities. The user can reconfigure a limited set of modules to construct pet entities of various forms and structures. These modular entities provide flexible haptic feedback and allow the system to render virtual pets with personalized form and structure. By integrating tracking information from the user's head and hands, as well as from each module of the pet entity, MMRPet can infer rich interaction intents and support rich human-pet interactions when the user touches, moves, rotates, or gazes at a module. We explore the design space for constructing modular pet entities and the design space of the human-pet interactions that MMRPet enables. A series of prototypes demonstrates the advantages of using modular entities in a mixed reality pet system.
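
The intent-inference step described in the abstract can be pictured concretely. The sketch below is a minimal illustration, not the authors' implementation (the paper publishes no code): it assumes hypothetical per-frame 6-DoF poses for the head, one hand, and one module, with illustrative distance and angle thresholds, and classifies the four interactions the abstract names: touch, move, rotate, and gaze.

```python
# Hypothetical sketch of MMRPet-style intent inference. All names, thresholds,
# and data structures are illustrative assumptions, not the authors' code.
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    position: np.ndarray  # (3,) world position in meters
    rotation: np.ndarray  # (4,) unit quaternion (x, y, z, w)

def gazes_at(head: Pose, module: Pose, max_angle_deg: float = 10.0) -> bool:
    """True if the module lies inside a narrow cone around the head's view ray."""
    x, y, z, w = head.rotation
    # Head forward axis: the -Z basis vector rotated by the head quaternion.
    forward = -np.array([2 * (x * z + w * y),
                         2 * (y * z - w * x),
                         1 - 2 * (x * x + y * y)])
    to_module = module.position - head.position
    to_module /= np.linalg.norm(to_module) + 1e-9
    return float(forward @ to_module) > np.cos(np.radians(max_angle_deg))

def classify_intent(head: Pose, hand: Pose, prev: Pose, now: Pose,
                    touch_radius=0.06, move_eps=0.01, rot_eps_deg=5.0):
    """Classify one module's interaction for the current frame.

    `prev` and `now` are the module's poses on consecutive frames.
    Returns 'rotate', 'move', 'touch', 'gaze', or None.
    """
    touching = np.linalg.norm(hand.position - now.position) < touch_radius
    moved = np.linalg.norm(now.position - prev.position) > move_eps
    # Angle of the inter-frame rotation, recovered from the quaternion dot product.
    dot = min(abs(float(prev.rotation @ now.rotation)), 1.0)
    rotated = np.degrees(2.0 * np.arccos(dot)) > rot_eps_deg
    if touching and rotated:
        return "rotate"
    if touching and moved:
        return "move"
    if touching:
        return "touch"
    if gazes_at(head, now):
        return "gaze"
    return None
```

In the actual system this decision would presumably run per module per frame, fed by the trackers on the head, hands, and each entity module; the single-hand simplification and the specific thresholds here are assumptions for the sake of the example.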

Author information

Corresponding author

Correspondence to Dongdong Weng.

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Xue, Y., Weng, D., Jiang, H., Gao, Q. (2019). MMRPet: Modular Mixed Reality Pet System Based on Passive Props. In: Wang, Y., Huang, Q., Peng, Y. (eds) Image and Graphics Technologies and Applications. IGTA 2019. Communications in Computer and Information Science, vol 1043. Springer, Singapore. https://doi.org/10.1007/978-981-13-9917-6_61

Download citation

  • DOI: https://doi.org/10.1007/978-981-13-9917-6_61

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-9916-9

  • Online ISBN: 978-981-13-9917-6

  • eBook Packages: Computer Science, Computer Science (R0)
