
Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Accurate needle placement at the target point is critical for ultrasound-guided interventions such as biopsies and epidural injections. However, keeping the needle aligned with the thin imaging plane of the transducer is challenging, and misalignment causes the needle to fade from view to the naked eye. We therefore developed a CNN-based framework that tracks the needle using spatiotemporal features of speckle dynamics.

Methods

Three key techniques were used to optimize the network for this application. First, we used the Gunnar-Farneback (GF) algorithm, a traditional motion-field estimation technique, to augment the model input with spatiotemporal features extracted from a stack of consecutive frames. Second, we designed an efficient network based on the state-of-the-art Yolo framework (nYolo). Lastly, an Assisted Excitation (AE) module was added at the neck of the network to address the imbalance problem.
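To illustrate the first step, the following is a minimal sketch (not the authors' released code) of how dense Gunnar-Farneback optical flow over consecutive B-mode frames can be stacked with the current frame to form a spatiotemporal input for a detector. OpenCV and NumPy are assumed, and the channel layout and flow parameters are illustrative choices rather than those of the paper.

```python
# Minimal sketch, assuming OpenCV (cv2) and NumPy; not the authors' implementation.
import cv2
import numpy as np

def build_spatiotemporal_input(frames):
    """Stack the current B-mode frame with Farneback flow magnitudes.

    frames: list of grayscale uint8 images (H, W), ordered in time.
    Returns an (H, W, C) float32 array: one intensity channel plus one
    motion-magnitude channel per consecutive frame pair.
    """
    channels = [frames[-1].astype(np.float32) / 255.0]
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Dense optical flow between consecutive frames captures the speckle
        # dynamics around the (possibly invisible) moving needle.
        # Positional args: prev, next, flow, pyr_scale, levels, winsize,
        #                  iterations, poly_n, poly_sigma, flags.
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        channels.append(mag)
    return np.dstack(channels)
```

The stacked array could then be resized and fed to the detection network in place of a single grayscale frame.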

Results

Fourteen freehand ultrasound sequences were collected by steeply inserting an injection needle into the Ultrasound Compatible Lumbar Epidural Simulator and Femoral Vascular Access Ezono test phantoms. We divided the dataset into two sub-categories. In the second, more challenging category, in which the needle is totally invisible, the angle and tip localization errors were 2.43 ± 1.14° and 2.3 ± 1.76 mm using Yolov3+GF+AE, and 2.08 ± 1.18° and 2.12 ± 1.43 mm using nYolo+GF+AE.
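For reference, below is a minimal sketch of how the two reported metrics can be computed from a predicted and a ground-truth needle annotation, assuming each needle is described by a tip position and an axis angle; the representation and function names are assumptions for illustration, not the authors' evaluation code.

```python
# Minimal sketch of the reported metrics; assumes each needle is annotated by
# its tip position (pixels) and axis angle (degrees). Illustrative only.
import numpy as np

def angle_error_deg(pred_angle_deg, gt_angle_deg):
    """Absolute angular difference between needle axes, wrapped to [0, 90] deg."""
    d = abs(pred_angle_deg - gt_angle_deg) % 180.0
    return min(d, 180.0 - d)

def tip_error_mm(pred_tip_px, gt_tip_px, mm_per_px):
    """Euclidean distance between predicted and ground-truth tips, in mm."""
    diff = np.asarray(pred_tip_px, dtype=float) - np.asarray(gt_tip_px, dtype=float)
    return float(np.linalg.norm(diff) * mm_per_px)
```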

Conclusion

The proposed method has the potential to track the needle more reliably than other state-of-the-art methods and can accurately localize it in 2D B-mode US images in real time, allowing it to be used in current ultrasound-guided intervention procedures.



Funding

This study was funded by the Faculty of Medicine, Tehran University of Medical Sciences, under Grant No. 26189.

Author information


Corresponding author

Correspondence to Alireza Ahmadian.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors. All of the experiments were performed on US videos captured using epidural and femoral vascular access phantoms.

Informed consent

This article does not contain patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Amiri Tehrani Zade, A., Jalili Aziz, M., Majedi, H. et al. Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study. Int J CARS 18, 1373–1382 (2023). https://doi.org/10.1007/s11548-022-02812-y

