
Deep learning-based classification and segmentation for scalpels

Original Article · International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Scalpels are typical cutting tools in surgery, and the surgical tray is one of the locations where scalpels are present during an operation. However, no method is known for the classification and segmentation of multiple types of scalpels. This paper presents a dataset covering multiple types of scalpels together with a classification and segmentation method. The method serves as a first step toward validating scalpel segmentation, and further applications may include distinguishing scalpels from other tools in different clinical scenarios.

Methods

The proposed scalpel dataset contains 6400 images with labeled information for 10 types of scalpels. A classification and segmentation model for multiple scalpel types is obtained by training Mask R-CNN on this dataset. The article concludes with an analysis and evaluation of the network performance, verifying the feasibility of the work.
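The abstract does not specify the implementation framework, so the following is only a minimal sketch of how a Mask R-CNN can be adapted to 10 scalpel classes, assuming the torchvision implementation with a ResNet-50 FPN backbone; the class count, head sizes, and optimizer settings are illustrative assumptions, not the authors' reported configuration.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 11  # assumption: 10 scalpel types + background

# Start from a Mask R-CNN with a ResNet-50 FPN backbone pretrained on COCO
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box classification head with one sized for the scalpel classes
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Replace the mask prediction head likewise (256 hidden channels is an assumption)
in_channels_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels_mask, 256, NUM_CLASSES)

# Illustrative training step: in train mode the model returns a dict of losses
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad],
    lr=0.005, momentum=0.9, weight_decay=0.0005,
)
model.train()
# images: list of (C, H, W) float tensors; targets: list of dicts holding
# "boxes", "labels", "masks" taken from the labeled scalpel dataset
# loss_dict = model(images, targets)
# total_loss = sum(loss_dict.values())
# total_loss.backward(); optimizer.step(); optimizer.zero_grad()
```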

Results

A multi-type scalpel dataset was established, and classification and segmentation models for multiple scalpel types were obtained by training Mask R-CNN on it. The average accuracy and average recall reached 94.19% and 96.61%, respectively, in the classification task, and 93.30% and 95.14%, respectively, in the segmentation task.
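The abstract does not state how matches between predictions and ground truth are counted (e.g., the IoU threshold used for masks), so the sketch below illustrates only the averaging step: hypothetical per-class true-positive, false-positive, and false-negative counts are turned into per-class precision and recall and then averaged over classes, which is one common way such summary figures are obtained.

```python
import numpy as np

def mean_precision_recall(tp, fp, fn):
    """Average precision and recall over classes from per-class
    true-positive, false-positive, and false-negative counts."""
    tp, fp, fn = (np.asarray(x, dtype=float) for x in (tp, fp, fn))
    precision = tp / np.maximum(tp + fp, 1.0)  # guard against division by zero
    recall = tp / np.maximum(tp + fn, 1.0)
    return precision.mean(), recall.mean()

# Hypothetical counts for three of the ten scalpel classes
p, r = mean_precision_recall(tp=[95, 90, 97], fp=[5, 8, 3], fn=[3, 6, 4])
print(f"mean precision {p:.2%}, mean recall {r:.2%}")
```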

Conclusion

The first scalpel dataset covering multiple types of scalpels was created, and the classification and segmentation of multiple scalpel types were realized for the first time. This study achieves the classification and segmentation of scalpels in a surgical tray scene, providing a potential solution for scalpel recognition, localization, and tracking.


Acknowledgements

We appreciate the financial support of the National Natural Science Foundation of China (Grant Nos. 62273055, 91748103, 61573208) and the Beijing Natural Science Foundation (Grant No. Z170001).

Author information

Corresponding authors

Correspondence to Jie Tang or Li Gao.

Ethics declarations

Conflict of interest

The authors declare that they have no competing or financial interests.

Ethical approval

The study was approved by the Ethics Committee of Xuanwu Hospital.

Informed consent

This study falls under an exception for which obtaining informed consent is not necessary.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Baiquan Su, Qingqian Zhang and Yi Gong are the co-first authors of this paper.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Su, B., Zhang, Q., Gong, Y. et al. Deep learning-based classification and segmentation for scalpels. Int J CARS 18, 855–864 (2023). https://doi.org/10.1007/s11548-022-02825-7
