
Capturing Industrial Machinery into Virtual Reality

  • Conference paper
Articulated Motion and Deformable Objects (AMDO 2018)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 10945)

Abstract

In this paper we set out to find a new technical and commercial solution for easily acquiring a virtual model of existing machinery for visualisation in a VR environment. To this end we introduce an image-based scanning approach with an initial focus on a monocular (handheld) capturing device such as a portable camera. Camera poses are estimated with a Simultaneous Localisation and Mapping (SLAM) technique. Depending on the required quality, offline calibration is incorporated by means of ArUco markers placed within the captured scene. Once the images are captured, they are compressed in a format that allows rapid low-latency streaming and decoding on the GPU. Finally, upon viewing the model in a VR environment, an optical flow method is used to interpolate between the triangulation of the captured viewpoints to deliver a smooth VR experience. We believe our tool will facilitate the capture of machinery into VR, providing a wide range of benefits such as marketing, off-site assistance and remote maintenance.
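The rendering step described above interpolates between a triangulation of the captured viewpoints. As a rough illustration of the viewpoint-blending idea only (this is not the paper's actual optical-flow pipeline), the following numpy sketch selects the captured views whose directions are closest in angle to a novel query direction and computes normalised inverse-angle blend weights; the function name and the weighting scheme are our own assumptions.

```python
import numpy as np

def blend_weights(captured_dirs, query_dir, k=3):
    """Pick the k captured viewpoints closest in angle to the query
    direction and return (indices, normalised blend weights)."""
    captured_dirs = captured_dirs / np.linalg.norm(captured_dirs, axis=1, keepdims=True)
    query_dir = query_dir / np.linalg.norm(query_dir)
    # Angular distance between the query view and each captured view.
    angles = np.arccos(np.clip(captured_dirs @ query_dir, -1.0, 1.0))
    nearest = np.argsort(angles)[:k]
    # Inverse-distance weighting; a small epsilon avoids division by zero
    # when the query coincides exactly with a captured viewpoint.
    w = 1.0 / (angles[nearest] + 1e-8)
    return nearest, w / w.sum()

# Four captured viewpoints on the unit sphere around the machine.
views = np.array([[ 1.0, 0.0, 0.0],
                  [ 0.0, 1.0, 0.0],
                  [ 0.0, 0.0, 1.0],
                  [-1.0, 0.0, 0.0]])
# A novel view between the +x and +y capture positions:
idx, w = blend_weights(views, np.array([0.7, 0.7, 0.1]))
print(idx, w)  # the views along +x and +y receive the largest weights
```

In the paper's setting, these weights would drive the per-view contribution of the interpolated (flow-warped) images rather than a plain blend.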



Acknowledgements

This research was partially supported by Flanders Make, the strategic research centre for the manufacturing industry, in view of the Flanders Make FLEXAS_VR project.

We also gratefully acknowledge the European Fund for Regional Development (ERDF) and the Flemish Government, which kindly fund part of the research at the Expertise Centre for Digital Media.

Author information

Correspondence to Fabian Di Fiore.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Put, J., Michiels, N., Di Fiore, F., Van Reeth, F. (2018). Capturing Industrial Machinery into Virtual Reality. In: Perales, F., Kittler, J. (eds) Articulated Motion and Deformable Objects. AMDO 2018. Lecture Notes in Computer Science, vol. 10945. Springer, Cham. https://doi.org/10.1007/978-3-319-94544-6_5

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-94544-6_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94543-9

  • Online ISBN: 978-3-319-94544-6

  • eBook Packages: Computer Science (R0)
