Abstract
Mammogram malignancy classification with only image-level annotations is challenging because lesion annotations are unavailable. If the healthy version of a diseased image could be generated, the lesion features could easily be explored. An intuitive approach to such generation is to use existing Cycle-GAN-based methods, which generate healthy images by treating the healthy class as the reference domain while preserving the original content through a cycle-consistency mechanism. However, healthy mammogram patterns are diverse, which may lead to uncertain generations. Moreover, the back translation from the healthy image to the original remains an ill-posed problem because the lesion information is lost. To address these problems, we propose a novel model called the bilateral residual generating adversarial network (BR-GAN). We adopt Cycle-GAN as the basic framework but, exploiting the bilateral-symmetry prior, use the contralateral breast as the generation reference. To address the ill-posed back translation, we propose a residual-preserved mechanism that retains the lesion features from the original features. The generated features and the original features are then aggregated for further classification. BR-GAN outperforms current state-of-the-art methods on the INbreast and in-house datasets.
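The abstract's residual-preserved idea can be illustrated with a minimal sketch. This is not the paper's implementation: the feature maps are random stand-ins, the subtraction-based residual and concatenation-based fusion are plausible assumptions, and the back translation shown is purely schematic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "feature maps": an original (possibly diseased) image and its generated
# healthy version. In BR-GAN the healthy features would come from a
# Cycle-GAN-style generator guided by the contralateral breast; here they are
# random stand-ins.
f_orig = rng.normal(size=(8, 8))      # features of the original image
f_healthy = rng.normal(size=(8, 8))   # features of the generated healthy image

# Residual-preserved mechanism (schematic): treat the residual between the
# original and the generated healthy features as the preserved lesion cue.
residual = f_orig - f_healthy

# Aggregate the generated and original features for the downstream classifier.
# Concatenation is one plausible fusion choice; the paper's exact operation
# may differ.
aggregated = np.concatenate([f_orig, f_healthy, residual], axis=0)

# Cycle-consistency check (L1): with the residual preserved, the back
# translation healthy -> original is no longer ill-posed, since adding the
# residual back recovers the original features.
f_back = f_healthy + residual
cycle_l1 = np.abs(f_back - f_orig).mean()
print(aggregated.shape, cycle_l1)
```

The near-zero cycle loss shows why preserving the residual resolves the ill-posed back translation: the lesion information removed by the healthy generation is carried alongside it rather than discarded.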
C. Wang—This work was done when Chu-ran Wang was an intern at Deepwise AI Lab.
Acknowledgement
This work was supported by MOST-2018AAA0102004, NSFC-61625201, and the Zhejiang Province Key Research & Development Program (No. 2020C03073).
© 2020 Springer Nature Switzerland AG
Cite this paper
Wang, Cr., Zhang, F., Yu, Y., Wang, Y. (2020). BR-GAN: Bilateral Residual Generating Adversarial Network for Mammogram Classification. In: Martel, A.L., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2020. MICCAI 2020. Lecture Notes in Computer Science(), vol 12262. Springer, Cham. https://doi.org/10.1007/978-3-030-59713-9_63
Print ISBN: 978-3-030-59712-2
Online ISBN: 978-3-030-59713-9