
Real-time surgical needle detection using region-based convolutional neural networks

Short communication
International Journal of Computer Assisted Radiology and Surgery

Abstract

Objective

Conventional surgical assistance and skill analysis for suturing focus mostly on the motion of the tools. Because the quality of a suture is determined by the motion of the needle relative to the tissue, knowledge of the needle motion would be useful for surgical assistance and skill analysis. As a first step toward demonstrating this usefulness, we developed a needle detection algorithm.

Methods

Because the needle is small, attaching sensors to it is difficult. We therefore developed a real-time, video-based needle detection algorithm built on a region-based convolutional neural network.
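
The implementation details are not given in this abstract. As an illustrative sketch only (not the authors' code), per-frame needle detection with a region-based CNN could be set up as follows, assuming a torchvision Faster R-CNN detector fine-tuned for a single "needle" class; the weight file, video path, and score threshold are hypothetical.

```python
# Sketch: per-frame needle detection with an off-the-shelf Faster R-CNN
# (torchvision); NOT the authors' implementation. Weights, paths, and
# thresholds below are assumptions for illustration.
import cv2
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 2          # background + needle (assumed single foreground class)
SCORE_THRESHOLD = 0.5    # assumed confidence cutoff

# Start from a COCO-pretrained detector and replace the box head for one class.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
model.load_state_dict(torch.load("needle_detector.pth"))  # hypothetical fine-tuned weights
model.eval()

cap = cv2.VideoCapture("anastomosis.mp4")  # hypothetical surgical video
with torch.no_grad():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # BGR uint8 frame -> RGB float tensor in [0, 1], shape (3, H, W)
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
        pred = model([tensor])[0]
        for box, score in zip(pred["boxes"], pred["scores"]):
            if score >= SCORE_THRESHOLD:
                x1, y1, x2, y2 = box.int().tolist()
                cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.imshow("needle detection", frame)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```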

Results

Our method detected the needle with an average precision of 89.2%. Detection remained robust even when the needle was heavily occluded by the tools and/or the blood vessels during microvascular anastomosis. However, some incorrect detections occurred, including partial detections.
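
The abstract does not state the evaluation protocol; average precision (AP) for a detector is commonly computed in the Pascal VOC style. A minimal sketch of single-class AP at an IoU threshold of 0.5 (an assumption, not the paper's evaluation code) is shown below.

```python
# Sketch: VOC-style single-class average precision at IoU 0.5 (an assumption,
# not the paper's evaluation code).
import numpy as np

def iou(a, b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def average_precision(detections, ground_truth, iou_thr=0.5):
    """detections: list of (frame_id, score, box); ground_truth: dict frame_id -> list of boxes."""
    detections = sorted(detections, key=lambda d: d[1], reverse=True)
    matched = {f: [False] * len(boxes) for f, boxes in ground_truth.items()}
    n_gt = sum(len(boxes) for boxes in ground_truth.values())
    tp = np.zeros(len(detections))
    fp = np.zeros(len(detections))
    for i, (frame, _, box) in enumerate(detections):
        gts = ground_truth.get(frame, [])
        ious = [iou(box, g) for g in gts]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_thr and not matched[frame][best]:
            tp[i] = 1                      # correct, previously unmatched ground truth
            matched[frame][best] = True
        else:
            fp[i] = 1                      # duplicate, low-overlap, or spurious detection
    recall = np.cumsum(tp) / max(n_gt, 1)
    precision = np.cumsum(tp) / np.maximum(np.cumsum(tp) + np.cumsum(fp), 1e-9)
    # Interpolate precision (non-increasing from the right), then accumulate
    # the area under the precision-recall curve over recall increments.
    prec_interp = np.maximum.accumulate(precision[::-1])[::-1]
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recall, prec_interp):
        ap += (r - prev_r) * p
        prev_r = r
    return ap
```

With detections and ground-truth boxes collected over a test video, average_precision(dets, gts) returns a value in [0, 1]; under this definition, 0.892 would correspond to the reported 89.2%.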

Conclusion

To the best of our knowledge, this is the first application of deep neural networks to real-time needle detection. In future work, we will develop a needle pose estimation algorithm that uses the predicted needle location, toward computer-aided surgical assistance and surgical skill analysis.



Acknowledgements

This work was funded by the ImPACT Program of the Council for Science, Technology and Innovation (Cabinet Office, Government of Japan), a Grant-in-Aid for JSPS Research Fellows (Number 18J12185), and the Global Leader Program for Social Design and Management of the Ministry of Education, Culture, Sports, Science and Technology of Japan. We thank Prof. Nakatomi of the University of Tokyo Hospital for providing videos of microvascular anastomosis performed in real surgery.

Author information

Corresponding author

Correspondence to Atsushi Nakazawa.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical statement

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

The clinical data were obtained under the regulations of the University of Tokyo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Nakazawa, A., Harada, K., Mitsuishi, M. et al. Real-time surgical needle detection using region-based convolutional neural networks. Int J CARS 15, 41–47 (2020). https://doi.org/10.1007/s11548-019-02050-9

