
A Two-Stage Cascaded Deep Neural Network with Multi-decoding Paths for Kidney Tumor Segmentation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13168)

Abstract

Kidney cancer is an aggressive cancer that accounts for a large proportion of adult malignancies. Computed tomography (CT) imaging is an effective tool for kidney cancer diagnosis, and automatic, accurate segmentation of kidneys and kidney tumors in CT scans is crucial for treatment and surgery planning. However, kidney tumors and cysts have varied morphologies, blurred edges, and unpredictable positions, so their precise segmentation remains highly challenging. Considering these difficulties, we propose a cascaded deep neural network that first accurately locates the kidney region with a 2D U-Net, and then segments kidneys, kidney tumors, and renal cysts from the kidney ROI with a Multi-decoding Segmentation Network (MDS-Net). We evaluated our method on the 2021 Kidney and Kidney Tumor Segmentation Challenge (KiTS21) dataset, where it achieved a Dice score, Surface Dice, and Tumor Dice of 69.4%, 56.9%, and 51.9%, respectively, on the test cases. The proposed cascaded network has promising application prospects in kidney cancer diagnosis.
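The cascade described in the abstract reduces at inference time to a simple control flow: a coarse network proposes a kidney region, the CT volume is cropped to that region, and a fine multi-class network labels kidney, tumor, and cyst inside the crop before the result is pasted back into full resolution. The sketch below illustrates only this control flow in Python/NumPy; the `locate_kidney` and `segment_roi` callables, the `bounding_box` helper, and the margin value are hypothetical stand-ins for the paper's trained 2D U-Net and MDS-Net, not the authors' implementation.

```python
import numpy as np

def bounding_box(mask: np.ndarray, margin: int = 8):
    """Axis-aligned bounding box of the foreground voxels, padded by a margin."""
    coords = np.argwhere(mask > 0)
    lo = np.maximum(coords.min(axis=0) - margin, 0)
    hi = np.minimum(coords.max(axis=0) + margin + 1, mask.shape)
    return tuple(slice(int(l), int(h)) for l, h in zip(lo, hi))

def cascaded_segmentation(volume, locate_kidney, segment_roi):
    """Two-stage cascade: coarse kidney localization, then fine multi-class
    segmentation (kidney / tumor / cyst) inside the cropped ROI.

    locate_kidney : callable returning a binary kidney mask for the full volume
                    (stand-in for the paper's slice-wise 2D U-Net).
    segment_roi   : callable returning per-voxel class labels for a cropped ROI
                    (stand-in for the paper's MDS-Net).
    """
    coarse_mask = locate_kidney(volume)          # stage 1: where are the kidneys?
    roi = bounding_box(coarse_mask)              # crop region around the kidneys
    fine_labels = segment_roi(volume[roi])       # stage 2: label tumor/cyst in the ROI
    full_labels = np.zeros(volume.shape, dtype=np.uint8)
    full_labels[roi] = fine_labels               # paste ROI prediction back at full size
    return full_labels

if __name__ == "__main__":
    # Toy usage with dummy predictors standing in for trained networks.
    vol = np.random.rand(64, 128, 128).astype(np.float32)
    dummy_locator = lambda v: (v > 0.9).astype(np.uint8)
    dummy_segmenter = lambda r: (r > 0.95).astype(np.uint8) * 2  # label 2 = "tumor"
    labels = cascaded_segmentation(vol, dummy_locator, dummy_segmenter)
    print(labels.shape, np.unique(labels))
```

Cropping to the stage-1 ROI is what lets the stage-2 network operate on a small, kidney-centred sub-volume instead of the full CT scan, which is the stated motivation for the two-stage design.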



Acknowledgment

This work was financed by the Fujian Provincial Natural Science Foundation (Grant Nos. 2021J02019, 2021J01578, and 2019Y9070) and the Fuzhou Science and Technology Project (2020-GX-17).

Author information


Corresponding author

Correspondence to Liqin Huang.



Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

He, T., Zhang, Z., Pei, C., Huang, L. (2022). A Two-Stage Cascaded Deep Neural Network with Multi-decoding Paths for Kidney Tumor Segmentation. In: Heller, N., Isensee, F., Trofimova, D., Tejpaul, R., Papanikolopoulos, N., Weight, C. (eds) Kidney and Kidney Tumor Segmentation. KiTS 2021. Lecture Notes in Computer Science, vol 13168. Springer, Cham. https://doi.org/10.1007/978-3-030-98385-7_11

  • DOI: https://doi.org/10.1007/978-3-030-98385-7_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-98384-0

  • Online ISBN: 978-3-030-98385-7

  • eBook Packages: Computer Science, Computer Science (R0)
