A Review of 3D Reconstruction Techniques for Deformable Tissues in Robotic Surgery

  • Conference paper
Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 Workshops (MICCAI 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15274)

Abstract

As a crucial and intricate task in robotic minimally invasive surgery, reconstructing surgical scenes from stereo or monocular endoscopic video holds immense potential for clinical applications. NeRF-based techniques have recently garnered attention for their ability to reconstruct scenes implicitly. In contrast, 3D Gaussian Splatting (3D-GS) represents scenes explicitly with 3D Gaussians and projects them onto the 2D image plane, replacing the costly volume rendering used in NeRF. However, these methods face challenges in surgical scene reconstruction, such as slow inference, dynamic scenes, and occlusion by surgical tools. This work explores and reviews state-of-the-art (SOTA) approaches, discussing their innovations and implementation principles. Furthermore, we replicate the models and conduct testing and evaluation on two datasets. The results demonstrate that, with advances in these techniques, real-time, high-quality reconstruction is becoming feasible. The code is available at: https://github.com/Epsilon404/surgicalnerf.
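
For context, the two rendering formulations contrasted above can be written in their standard forms, following the original NeRF and 3D Gaussian splatting papers; the notation below is a generic sketch and is not taken from this work. NeRF renders a pixel by integrating radiance along a camera ray r(t) = o + t d with learned density \sigma and view-dependent colour c:

    C(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\,\sigma(\mathbf{r}(t))\,\mathbf{c}(\mathbf{r}(t),\mathbf{d})\,dt,
    \qquad T(t) = \exp\!\Big(-\int_{t_n}^{t}\sigma(\mathbf{r}(s))\,ds\Big).

3D-GS instead projects a set of anisotropic 3D Gaussians onto the image plane and alpha-composites the resulting 2D Gaussians front to back:

    C = \sum_{i\in\mathcal{N}} \mathbf{c}_i\,\alpha_i \prod_{j=1}^{i-1}\big(1-\alpha_j\big),

where \alpha_i is the opacity of the i-th projected Gaussian evaluated at the pixel. Swapping per-ray network queries for this differentiable rasterization step is the main source of the rendering-speed advantage mentioned in the abstract.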

M. Xu and Z. Guo contributed equally to this work.

Acknowledgments

This work was supported by Hong Kong RGC CRF C4026-21G, RIF R4020-22, GRF 14211420 & 14203323, Shenzhen-Hong Kong-Macau Technology Research Programme (Type C) STIC Grant SGDX20210823103535014 (202108233000303) and the Key Project 2021B1515120035 (B.02.21.00101) of the Regional Joint Fund Project of the Basic and Applied Research Fund of Guangdong Province.

Author information

Corresponding author

Correspondence to Hongliang Ren.

Ethics declarations

Disclosure of Interests

The authors have no competing interests to declare.

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Xu, M., Guo, Z., Wang, A., Bai, L., Ren, H. (2025). A Review of 3D Reconstruction Techniques for Deformable Tissues in Robotic Surgery. In: Celebi, M.E., Reyes, M., Chen, Z., Li, X. (eds) Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 Workshops. MICCAI 2024. Lecture Notes in Computer Science, vol 15274. Springer, Cham. https://doi.org/10.1007/978-3-031-77610-6_15

  • DOI: https://doi.org/10.1007/978-3-031-77610-6_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-77609-0

  • Online ISBN: 978-3-031-77610-6

  • eBook Packages: Computer Science, Computer Science (R0)
