Abstract
Breast cancer remains a major global health issue: it is the most commonly diagnosed cancer among women worldwide and also affects a significant number of men. Despite advances in screening techniques such as mammography and ultrasound, more precise diagnostic tools are needed to improve early detection and treatment. Recent developments in machine learning, particularly deep learning, show promise for improving detection accuracy by analyzing complex patterns in medical imaging. However, developing effective deep learning models tailored to breast cancer data presents substantial challenges, including processing extensive image datasets, training complex models, and selecting parameters that improve detection accuracy without compromising generalizability across scenarios and imaging technologies. This paper proposes a deep learning approach for analyzing the Ki-67 protein index in biopsy samples, a crucial marker of cell proliferation in breast cancer diagnostics. Applying advanced neural architectures such as DeepLabv3+ with MobileNet-v2, Xception, DenseNet-121, U-Net, and the Fully Convolutional Regression Network, our method distinguishes Ki-67 positive from Ki-67 negative tumor cells and detects tumor-infiltrating lymphocytes with high precision. These models were rigorously evaluated on the SHIDC-B-Ki-67 dataset, achieving accuracy of up to 98.8% while reducing processing time to as little as 13 s, which is crucial for timely clinical decision-making. Our results contribute to integrating artificial intelligence with conventional diagnostic methods, establish benchmarks for the accuracy and efficiency of breast cancer detection, and pave the way for future research in automated medical image analysis.
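For readers who want a concrete starting point, the sketch below shows how one of the configurations named above, DeepLabv3+ with a MobileNet-v2 encoder, could be instantiated for this task, together with a small helper that turns per-class cell counts into a Ki-67 labeling index. This is a minimal illustration assuming the third-party segmentation_models_pytorch package; it is not the authors' published implementation, and the class layout and the ki67_index helper are hypothetical choices made for the example.

# Minimal sketch (not the authors' code): DeepLabv3+ with a MobileNet-v2
# encoder via the third-party segmentation_models_pytorch package, plus a
# helper that converts per-class cell counts into a Ki-67 labeling index.
import torch
import segmentation_models_pytorch as smp

# Hypothetical class layout: 0 = background, 1 = Ki-67 positive tumor cell,
# 2 = Ki-67 negative tumor cell, 3 = tumor-infiltrating lymphocyte.
NUM_CLASSES = 4

model = smp.DeepLabV3Plus(
    encoder_name="mobilenet_v2",   # lightweight backbone, as in the paper
    encoder_weights="imagenet",    # ImageNet transfer learning (downloads weights)
    in_channels=3,                 # RGB histopathology patches
    classes=NUM_CLASSES,
)
model.eval()

def ki67_index(n_positive: int, n_negative: int) -> float:
    """Ki-67 labeling index: share of tumor cells staining positive, in %."""
    total = n_positive + n_negative
    return 100.0 * n_positive / total if total else 0.0

# Forward pass on a dummy 256x256 patch; argmax yields a per-pixel class map.
with torch.no_grad():
    logits = model(torch.randn(1, 3, 256, 256))   # (1, NUM_CLASSES, 256, 256)
    pred = logits.argmax(dim=1)                   # (1, 256, 256)

print(ki67_index(n_positive=120, n_negative=480))  # -> 20.0

In practice one would derive the counts by grouping the predicted class map into individual cells (e.g., connected components) rather than passing literal numbers, but the helper makes the arithmetic behind the reported index explicit.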
Funding
The authors declare that no funding was received for this work.
Author information
Contributions
These authors contributed equally to this work.
Ethics declarations
Conflict of Interest
The authors declare no conflict of interest.
Research Involving Human and/or Animals
Not applicable.
Informed Consent
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Phan, TC., Huu, H.N. Enhancing Breast Cancer Detection with Advanced Deep Learning Techniques for Ki-67 Nuclear Protein Analysis. SN COMPUT. SCI. 5, 663 (2024). https://doi.org/10.1007/s42979-024-03004-y