
Walk Through a Virtual Museum with Binocular Stereo Effect and Spherical Panorama Views Based on Image Rendering Carried by Tracked Robot

  • Conference paper
Augmented Reality, Virtual Reality, and Computer Graphics (AVR 2020)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 12243)


Abstract

To provide users with a virtual tour that offers both free walkthrough and binocular stereoscopic viewing, the authors propose using a tracked robot carrying a single panoramic camera. The robot captures panoramic photos continuously as it moves, so that the audience can wander freely in the virtual museum. The capture spacing is calculated from the requirements of stereoscopic visual comfort, and the recorded photos are assembled into a binocular panoramic video. Two adjacent panoramic images serve as a stereoscopic pair, yielding comfortable stereo vision. Because only one camera is used, the amount of data is reduced and the occlusion issues of multi-camera rigs are avoided. Videos can also be shot at different spacings according to the rules of visual comfort, so that users can enlarge the picture and still choose an appropriate baseline.
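The pairing idea in the abstract can be sketched in a few lines. The snippet below is a hedged illustration, not the authors' implementation: it assumes panoramas are captured at a known, uniform spacing along the robot's track, and pairs each frame with a later frame whose separation best approximates a target stereo baseline (e.g. an average interpupillary distance). The constant `IPD_M` and the function `stereo_pair_indices` are hypothetical names introduced here for clarity.

```python
# Sketch: selecting stereoscopic pairs from a sequence of panoramas
# captured at uniform spacing along the robot's path.

IPD_M = 0.063  # assumed average human interpupillary distance, in metres


def stereo_pair_indices(capture_spacing_m, frame_count, baseline_m=IPD_M):
    """Pair frame i with frame i + k, where k capture steps best
    approximate the desired stereo baseline."""
    if capture_spacing_m <= 0:
        raise ValueError("capture spacing must be positive")
    k = max(1, round(baseline_m / capture_spacing_m))
    # Each tuple is (left-eye frame index, right-eye frame index).
    return [(i, i + k) for i in range(frame_count - k)]


# With 21 mm between captures, three capture steps (~63 mm) approximate
# the target baseline, so frame i pairs with frame i + 3.
pairs = stereo_pair_indices(capture_spacing_m=0.021, frame_count=10)
```

Enlarging the picture (as the abstract describes) effectively calls for a wider baseline, which in this scheme simply means pairing frames that are further apart in the same sequence.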



Acknowledgments

This work was supported by the Ministry of Education (China) Humanities and Social Sciences Research Foundation under Grant No. 19A10358002.

Author information


Correspondence to YanXiang Zhang.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Wang, G. (2020). Walk Through a Virtual Museum with Binocular Stereo Effect and Spherical Panorama Views Based on Image Rendering Carried by Tracked Robot. In: De Paolis, L., Bourdot, P. (eds.) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2020. Lecture Notes in Computer Science, vol. 12243. Springer, Cham. https://doi.org/10.1007/978-3-030-58468-9_6

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-58468-9_6


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58467-2

  • Online ISBN: 978-3-030-58468-9

  • eBook Packages: Computer Science; Computer Science (R0)
