Abstract
Virtual reality offers unique affordances that can benefit the scientific discovery process. However, virtual reality applications must maintain very high frame rates to sustain immersion and prevent adverse effects such as visual fatigue and motion sickness. Maintaining high frame rates is challenging when visualizing large-scale scientific data. One successful technique for enabling interactive exploration of large-scale datasets is to create a large image collection from a structured sampling of camera positions, time steps, and visualization operators. This paper highlights our work to adapt this technique to virtual reality, demonstrated on two authentic scientific datasets: (a) a large-scale simulation of cancer cell transport and capture in a microfluidic device and (b) a large-scale molecular dynamics simulation of graphene that exhibits extremely low-friction interactions. We create a collection of omnidirectional stereoscopic images (three-dimensional surround-view panoramas), each of which captures all possible view angles from a given location. As a result, virtual reality devices can always render local movements at full frame rate without loading a new image from the collection.
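The core idea above — pre-rendering panoramas at a structured sampling of camera positions so that head rotation never requires new data, and a new image loads only when the viewer translates closer to a different sample — can be sketched as a nearest-sample lookup. This is a minimal illustration, not the paper's implementation; the function and variable names are hypothetical.

```python
import math

def nearest_panorama(position, sample_positions):
    """Return the index of the pre-rendered omnidirectional stereo image
    whose capture position is closest to the viewer's current position.
    Rotation at a fixed position never changes the result, so local head
    movement renders at full frame rate from the already-loaded image."""
    return min(range(len(sample_positions)),
               key=lambda i: math.dist(position, sample_positions[i]))

# Hypothetical structured sampling: a 2x2 grid of capture positions (x, y, z).
samples = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
           (0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]

# A viewer standing near the second sample stays on that panorama; a new
# image is fetched only after translating past the midpoint to a neighbor.
idx = nearest_panorama((0.9, 0.1, 0.2), samples)
```

In a real application the same lookup would extend across the other sampled dimensions (time step and visualization operator), with the selected panorama streamed to the headset while view rotation is handled entirely on-device.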
Acknowledgements
This research was supported in part by the Argonne Leadership Computing Facility, which is a U.S. Department of Energy Office of Science User Facility operated under contract DE-AC02-06CH11357. This work also used the Extreme Science and Engineering Discovery Environment (XSEDE) Bridges-2 at the Pittsburgh Supercomputing Center through allocation CIS210066, which is supported by National Science Foundation grant number ACI-1548562. We would also like to acknowledge the Center for Research Computing and Data at Northern Illinois University, where computations were performed on their Gaea high-performance computing cluster. Finally, we would like to thank Michael Hood and Subramanian Sankaranarayanan for their contributions towards generating the data from the science driver simulations.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Marrinan, T., Tan, J., Insley, J.A., Kanayinkal, A., Papka, M.E. (2022). Interactive Virtual Reality Exploration of Large-Scale Datasets Using Omnidirectional Stereo Images. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2022. Lecture Notes in Computer Science, vol 13598. Springer, Cham. https://doi.org/10.1007/978-3-031-20713-6_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20712-9
Online ISBN: 978-3-031-20713-6