A novel complementation method of an acoustic shadow region utilizing a convolutional neural network for ultrasound-guided therapy

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Noise-free ultrasound images are essential for organ monitoring during regional ultrasound-guided therapy. When the affected area is located under the ribs, however, an acoustic shadow is cast by the reflection of sound from hard tissues such as bone, and the image is output with missing information in this region. In the present study, we therefore attempt to complement the image in the missing area.

Methods

The overall flow of the complementation method to generate a shadow-free composite image is as follows. First, we constructed a binary classification method for the presence or absence of acoustic shadow on a phantom kidney based on a convolutional neural network. Second, we created a composite shadow-free image by searching for a suitable image from a time-series database and superimposing the corresponding area without shadow onto the missing area of the target image. In addition, we constructed and verified an automatic kidney mask generation method utilizing U-Net.
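The search-and-superimpose step described above can be sketched as follows. This is a hypothetical NumPy sketch, not the authors' implementation: the function name, the boolean-mask representation of the shadow and kidney regions, and the exhaustive search over the time-series database are all assumptions. It scores each candidate frame against the target on the shadow-free kidney pixels (using a ZNCC-style similarity, as in the Results) and copies the best frame's pixels into the shadowed area.

```python
import numpy as np

def composite_without_shadow(target, shadow_mask, database, kidney_mask):
    """Fill the shadowed pixels of `target` with the best-matching
    shadow-free frame from a time-series `database` (hypothetical sketch).

    shadow_mask: boolean array, True where acoustic shadow was detected.
    kidney_mask: boolean array restricting comparison to the kidney region.
    """
    valid = kidney_mask & ~shadow_mask           # pixels usable for matching
    best_frame, best_score = None, -np.inf
    for frame in database:                       # search the time-series DB
        a = target[valid].astype(np.float64)
        b = frame[valid].astype(np.float64)
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = (a @ b) / denom if denom else 0.0   # ZNCC on valid pixels
        if score > best_score:
            best_frame, best_score = frame, score
    out = target.copy()
    out[shadow_mask] = best_frame[shadow_mask]   # superimpose matching region
    return out
```

In practice the candidate region would first be localized by template matching rather than compared at a fixed position, but the scoring-and-superimposing logic is the same.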

Results

The complementation accuracy for kidney tracking could be enhanced by template matching. Zero-mean normalized cross-correlation (ZNCC) values after complementation were higher than those before complementation under three different data generation conditions: (i) changing the position of the bed of the robotic ultrasound diagnostic system in the translational direction, (ii) changing the probe position in the translational direction, and (iii) adding rotational motion of the probe to condition (ii). Although there was large variation in the shape of the kidney contour in condition (iii), the proposed method improved the ZNCC value from 0.5437 to 0.5807.
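The ZNCC metric reported above measures similarity between two image patches after removing their mean intensity, so it is invariant to brightness and contrast offsets. A minimal NumPy sketch (the function name and patch handling are illustrative, not from the paper):

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation between two same-size patches.

    Returns a value in [-1, 1]; 1 means identical up to an affine
    brightness/contrast change, -1 means inverted intensities.
    """
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    a -= a.mean()                    # remove mean: brightness invariance
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:                 # a constant patch carries no structure
        return 0.0
    return float(a @ b / denom)      # normalize: contrast invariance
```

Under this metric, a perfectly complemented region would score 1.0 against the ground-truth image, which is why the rise from 0.5437 to 0.5807 indicates improved agreement.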

Conclusions

The effectiveness of the proposed method was demonstrated in phantom experiments. Verification of its effectiveness in real organs is necessary in future studies.

[Figures 1–11 appear in the full article.]


References

  1. Powers J, Kremkau F (2011) Medical ultrasound systems. Interface focus 1(4):477–489

    Article  Google Scholar 

  2. Flatters SJL (2008) Characterization of a model of persistent postoperative pain evoked by skin/muscle incision and retraction (SMIR). PAIN® 135(1–2):119–130

    Article  Google Scholar 

  3. Kennedy JE (2005) High-intensity focused ultrasound in the treatment of solid tumours. Nat Rev Cancer 5(4):321–327

    Article  CAS  Google Scholar 

  4. Hellier P et al (2010) An automatic geometrical and statistical method to detect acoustic shadows in intraoperative ultrasound brain images. Med Image Anal 14(2):195–204

    Article  Google Scholar 

  5. Koizumi N, Seo J, Lee D, Nomiya A, Yoshinaka K, Sugita N, Matsumoto Y, Homma Y, Mitsuishi M (2010) Integration of diagnostics and therapy by ultrasound and robot technology. In: International symposium on micro-nanomechatronics and human science. IEEE, pp 53–58

  6. Gamal E, Ahmed FE, Elmogy M, Atwan A (2016) Current trends in medical image registration and fusion. Egypt Inf J 17(1):99–124

    Google Scholar 

  7. Simonyan K, Zisserman A (2014) Very deep convolutional networks for large-scale image recognition. Arxiv preprint arXiv:1409.1556

  8. Ronneberger O, Fischer P, Brox T (2015) U-net: convolutional networks for biomedical image segmentation. In: International conference on medical image computing and computer-assisted intervention. Springer, Cham

  9. Otsuka A, Koizumi N, Hosoi I, Tsukihara H, Nishiyama Y (2018) Method for extracting acoustic shadows to construct an organ composite model in ultrasound images. In: 15th international conference on ubiquitous robots. IEEE, pp 719–722

  10. Hu R, Singla R, Deeba F, Rohling RN (2019) Acoustic shadow detection: study and statistics of B-mode and radiofrequency data. Ultrasound Med Biol 45(8):2248–2257

    Article  Google Scholar 

  11. Yasutomi S, Arakaki T, Matsuoka R, Sakai A, Komatsu R, Shozu K, Komatsu M (2021) Shadow estimation for ultrasound images using auto-encoding structures and synthetic shadows. Appl Sci 11(3):1127

    Article  CAS  Google Scholar 

  12. Meng Q, Baumgartner C, Sinclair M, Housden J, Rajchl M, Gomez A, Kainz B (2018) Automatic shadow detection in 2d ultrasound images. In: Data driven treatment response assessment and preterm, perinatal, and paediatric image analysis. Springer, Cham, pp 66–75

  13. Kobayashi K, Sasaki Y, Eura F, Kondo R, Tomita K, Kobayashi T, Watanabe Y, Otsuka A, Tsukihara H, Matsumoto N, Numata K, Nagaoka H, Iwai T, Iijima H, Nishiyama Y, Koizumi N (2019) Development of bed-type ultrasound diagnosis and therapeutic robot. In: 2019 IEEE international conference on cyborg and bionic systems. IEEE, pp 171–176

  14. Abolmaesumi P, Salcudean SE, Zhu WH, Sirouspour MR, DiMaio SP (2002) Image-guided control of a robot for medical ultrasound. IEEE Trans Robot Autom 18(1):11–23

    Article  Google Scholar 

  15. Melodelima D, N'Djin WA, Miller NR, Bamber JC, Chapelon JY (2009) Comparative study of the effects of respiratory notion on in-vivo HIFU treatments in the liver. In: IEEE international ultrasonics symposium, pp 1314–1317

  16. Koizumi N et al (2014) Remote ultrasound diagnostic system (RUDS). J Robot Mechatron 26(3):396–397

    Article  Google Scholar 

  17. Hazelaar C, Dahele M, Mostafavi H, van der Weide L, Slotman B, Verbakel W (2018) Markerless positional verification using template matching and triangulation of kV images acquired during irradiation for lung tumors treated in breath-hold. Phys Med Biol 63(11):115005

    Article  Google Scholar 

  18. Hartmann W, et al. (2017) Learned multi-patch similarity. In: Proceedings of the IEEE international conference on computer vision

  19. Kingma DP, Ba J (2014) Adam: a method for stochastic optimization. Arxiv preprint 375. arXiv:1412.6980 376

Download references

Acknowledgements

The authors gratefully acknowledge the support of Hideyo Miyazaki (Center Hospital of the National Center for Global Health and Medicine) and of Hideyuki Iijima, Toshiyuki Iwai, and Hidetoshi Nagaoka (Obayashi Mfg. Co., Ltd.), as well as financial support from JSPS KAKENHI Grant Number JP20H02113 and the Saitama Prefecture New Technology and Product Development Subsidy Project.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Norihiro Koizumi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

This article does not contain patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Matsuyama, M., Koizumi, N., Otsuka, A. et al. A novel complementation method of an acoustic shadow region utilizing a convolutional neural network for ultrasound-guided therapy. Int J CARS 17, 107–119 (2022). https://doi.org/10.1007/s11548-021-02525-8
