Cross-Model Attention-Guided Tumor Segmentation for 3D Automated Breast Ultrasound (ABUS) Images

Abstract:

Tumor segmentation in 3D automated breast ultrasound (ABUS) plays an important role in breast disease diagnosis and surgical planning. However, automatic tumor segmentation in 3D ABUS images remains challenging due to large variations in tumor shape and size and uncertain tumor locations across patients. In this paper, we develop a novel cross-model attention-guided tumor segmentation network with a hybrid loss for 3D ABUS images. Specifically, we incorporate tumor location into the segmentation network by integrating an improved 3D Mask R-CNN head into V-Net as an end-to-end architecture. Furthermore, we introduce a cross-model attention mechanism that aggregates the segmentation probability map from the improved 3D Mask R-CNN into each feature extraction level of the V-Net. We then design a hybrid loss to balance the contribution of each part of the proposed cross-model segmentation network. We conduct extensive experiments on 170 3D ABUS volumes from 107 patients. Experimental results show that our method outperforms other state-of-the-art methods, achieving a Dice similarity coefficient (DSC) of 64.57%, Jaccard coefficient (JC) of 53.39%, recall (REC) of 64.43%, precision (PRE) of 74.51%, 95th-percentile Hausdorff distance (95HD) of 11.91 mm, and average surface distance (ASD) of 4.63 mm. Our code will be available online (https://github.com/zhouyuegithub/CMVNet).
Published in: IEEE Journal of Biomedical and Health Informatics (Volume: 26, Issue: 1, January 2022)
Page(s): 301 - 311
Date of Publication: 18 May 2021

PubMed ID: 34003755
