Natural embedding of live actors and entities into 360° virtual reality scenes

The Journal of Supercomputing

Abstract

This paper is concerned with techniques for directly embedding real-world moving objects into 360° virtual reality scenes captured by devices such as 360° cameras, DSLR cameras, and smartphones. For more natural embedding, we present a cylindrical mapping methodology based on a proposed international standard model for mixed and augmented reality, in which living physical objects are called live actors and entities. Our experiments illustrate the realistic movements and interactions of live actors and entities embedded into 360° virtual reality scenes.
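
The cylindrical mapping methodology itself is described in the full article, which is not included in this preview. As a rough illustration only, the sketch below shows one way a segmented live-actor frame could be composited onto an equirectangular 360° panorama by treating the actor as a billboard on a cylinder centred on the viewer. Every name and parameter (embed_actor_cylindrical, yaw_deg, h_fov_deg, v_offset) is hypothetical and not taken from the paper.

    import numpy as np

    def embed_actor_cylindrical(pano, actor_rgba, yaw_deg, h_fov_deg, v_offset=0.0):
        """Alpha-blend a segmented live-actor frame onto an equirectangular
        360-degree panorama, treating the actor as a billboard on a cylinder
        centred on the viewer (illustrative sketch, not the paper's method).

        pano       : (H, W, 3) uint8 equirectangular background, modified in place
        actor_rgba : (h, w, 4) uint8 actor frame; alpha is the segmentation mask
        yaw_deg    : horizontal placement of the actor on the cylinder (degrees)
        h_fov_deg  : horizontal angle the actor should span (degrees)
        v_offset   : vertical offset as a fraction of the panorama height
        """
        H, W, _ = pano.shape
        h, w, _ = actor_rgba.shape

        # Horizontal angle maps linearly to panorama columns.
        out_w = max(1, int(W * h_fov_deg / 360.0))
        # Keep the actor's aspect ratio; for small vertical extents the
        # cylinder-to-equirectangular pitch mapping is close to linear.
        out_h = max(1, int(out_w * h / w))

        # Nearest-neighbour resample of the actor patch (sketch only).
        ys = np.arange(out_h) * h // out_h
        xs = np.arange(out_w) * w // out_w
        patch = actor_rgba[ys[:, None], xs[None, :]]

        # Top-left corner of the patch on the panorama.
        x0 = int(W * (yaw_deg % 360.0) / 360.0)
        y0 = int(H // 2 - out_h // 2 + v_offset * H)
        y0 = int(np.clip(y0, 0, max(0, H - out_h)))

        alpha = patch[..., 3:4].astype(np.float32) / 255.0
        for dx in range(out_w):
            x = (x0 + dx) % W              # wrap across the 0/360-degree seam
            col = pano[y0:y0 + out_h, x].astype(np.float32)
            blended = alpha[:, dx] * patch[:, dx, :3] + (1.0 - alpha[:, dx]) * col
            pano[y0:y0 + out_h, x] = blended.astype(np.uint8)
        return pano

The column-by-column blend lets the actor patch wrap across the panorama's 0°/360° seam; for actors that span a large vertical angle, the linear vertical placement would need to be replaced by the exact cylinder-to-equirectangular pitch mapping.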

Acknowledgements

This research was financially supported by the ICT R&D program of MSIP/IITP [20160001210021001, Standard Development of HMD Based VR Service Framework] and by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2017-2013-0-00881) supervised by the IITP (Institute for Information & Communications Technology Promotion).

Author information

Corresponding author

Correspondence to Kwan-Hee Yoo.

About this article

Cite this article

Chheang, V., Jeong, S., Lee, G. et al. Natural embedding of live actors and entities into 360° virtual reality scenes. J Supercomput 76, 5655–5677 (2020). https://doi.org/10.1007/s11227-018-2615-z

Keywords

Navigation