
EndoGS: Deformable Endoscopic Tissues Reconstruction with Gaussian Splatting

  • Conference paper
  • First Online:
Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 Workshops (MICCAI 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15274)

Abstract

Surgical 3D reconstruction is a critical area of research in robotic surgery, with recent works adopting variants of dynamic radiance fields to reconstruct deformable tissues from single-viewpoint videos. However, these methods often suffer from time-consuming optimization or inferior quality, limiting their adoption in downstream tasks. Inspired by 3D Gaussian Splatting, a recently popular 3D representation, we present EndoGS, which applies Gaussian Splatting to deformable endoscopic tissue reconstruction. Specifically, our approach incorporates deformation fields to handle dynamic scenes, depth-guided supervision with spatial-temporal weight masks to optimize 3D targets under tool occlusion from a single viewpoint, and surface-aligned regularization terms to capture better geometry. As a result, EndoGS reconstructs and renders high-quality deformable endoscopic tissues from a single-viewpoint video, estimated depth maps, and labeled tool masks. Experiments on DaVinci robotic surgery videos demonstrate that EndoGS achieves superior rendering quality. Code is available at https://github.com/HKU-MedAI/EndoGS.
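
To make the supervision described above concrete, the snippet below sketches a masked, depth-guided rendering loss of the kind the abstract outlines. It is a minimal illustration rather than the authors' implementation: it assumes a Gaussian Splatting rasterizer elsewhere produces rendered color and depth images, and the names tool_mask (1 on tissue, 0 on tool pixels), st_weight (a hypothetical spatial-temporal weight map), and lambda_depth are illustrative placeholders.

```python
import torch

def masked_supervision_loss(rendered_rgb, gt_rgb, rendered_depth, gt_depth,
                            tool_mask, st_weight, lambda_depth=0.1):
    """Weighted L1 photometric + depth loss over visible tissue pixels.

    rendered_rgb / gt_rgb: (3, H, W); depths, tool_mask, st_weight: (H, W).
    """
    # Combined per-pixel weight: zero wherever a surgical tool occludes the
    # tissue, scaled by the spatial-temporal weighting elsewhere.
    w = tool_mask * st_weight
    # Color term: L1 difference, counted only on weighted (tissue) pixels.
    color_l1 = (torch.abs(rendered_rgb - gt_rgb) * w).sum() / (3.0 * w.sum() + 1e-8)
    # Depth term: L1 against the estimated depth map, with the same masking.
    depth_l1 = (torch.abs(rendered_depth - gt_depth) * w).sum() / (w.sum() + 1e-8)
    return color_l1 + lambda_depth * depth_l1
```

Calling backward() on the returned value would propagate gradients through the rasterizer to the Gaussian and deformation-field parameters; the surface-aligned regularization terms mentioned in the abstract would be added to this objective separately.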

Acknowledgement

This work was partially supported by the Research Grants Council of Hong Kong (T45-401/22-N and 27206123) and the National Natural Science Foundation of China (No. 62201483). We thank Med-AIR Lab CUHK for DaVinci robotic prostatectomy data.

Author information

Corresponding author

Correspondence to Lequan Yu.

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Zhu, L., Wang, Z., Cui, J., Jin, Z., Lin, G., Yu, L. (2025). EndoGS: Deformable Endoscopic Tissues Reconstruction with Gaussian Splatting. In: Celebi, M.E., Reyes, M., Chen, Z., Li, X. (eds) Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 Workshops. MICCAI 2024. Lecture Notes in Computer Science, vol 15274. Springer, Cham. https://doi.org/10.1007/978-3-031-77610-6_13

  • DOI: https://doi.org/10.1007/978-3-031-77610-6_13

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-77609-0

  • Online ISBN: 978-3-031-77610-6

  • eBook Packages: Computer Science, Computer Science (R0)
