An Human-Computer Interactive Augmented Reality System for Coronary Artery Diagnosis Planning and Training

  • Image & Signal Processing
  • Published in: Journal of Medical Systems

Abstract

To enable physicians to carry out coronary artery diagnosis and preoperative planning in a more intuitive and natural way, and to improve the training effect for interns, an augmented reality system for coronary artery diagnosis planning and training (ARS-CADPT) is designed and implemented in this paper. First, a 3D reconstruction algorithm based on computed tomography (CT) images is proposed to model the coronary artery vessels (CAV). Second, algorithms for static gesture recognition and for dynamic gesture spotting and recognition are presented to realize real-time and user-friendly human-computer interaction (HCI), which is the distinguishing feature of ARS-CADPT. Third, a Sort-First parallel rendering and splicing display subsystem is developed, which greatly expands the number of student users the system can support. The experimental results show that, with ARS-CADPT, the reconstruction of the CAV model is accurate, the HCI is natural and fluent, and the visual effect is good. In summary, the system fully meets its application requirements.
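As an illustration of the Sort-First parallel rendering idea mentioned in the abstract, the sketch below partitions a display wall's pixel space into per-node tiles before rendering. It is a minimal sketch with hypothetical parameters (tile grid, resolution, node numbering), not the subsystem described in the paper.

```python
# A minimal sketch of a Sort-First screen-space partition (hypothetical
# grid size, resolution, and node numbering; not the paper's subsystem).
from dataclasses import dataclass
from typing import List


@dataclass
class Tile:
    node_id: int  # render node responsible for this tile
    x0: int       # left pixel column (inclusive)
    y0: int       # top pixel row (inclusive)
    x1: int       # right pixel column (exclusive)
    y1: int       # bottom pixel row (exclusive)


def sort_first_partition(width: int, height: int, cols: int, rows: int) -> List[Tile]:
    """Split the full display into a cols x rows grid of tiles.

    In a Sort-First pipeline each render node draws only the geometry that
    projects into its own tile, and the finished tiles are spliced back
    together on the display wall.
    """
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tiles.append(Tile(
                node_id=r * cols + c,
                x0=c * width // cols,
                y0=r * height // rows,
                x1=(c + 1) * width // cols,
                y1=(r + 1) * height // rows,
            ))
    return tiles


if __name__ == "__main__":
    # Example: a 3840 x 2160 spliced display driven by a 2 x 2 render cluster.
    for tile in sort_first_partition(3840, 2160, cols=2, rows=2):
        print(tile)
```

Partitioning in screen space before rendering is what distinguishes Sort-First from Sort-Last compositing, and it maps naturally onto a tiled splicing display intended to serve many student users at once.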



Acknowledgements

This work was supported by the Natural Science Foundation of China (Grant No. 61472245) and the Shanghai Municipal Natural Science Foundation (Grant Nos. 14ZR1419700 and 13ZR1455600).

Author information


Corresponding author

Correspondence to Qiming Li.

Additional information

This article is part of the Topical Collection on Image & Signal Processing


About this article


Cite this article

Li, Q., Huang, C., Lv, S. et al. An Human-Computer Interactive Augmented Reality System for Coronary Artery Diagnosis Planning and Training. J Med Syst 41, 159 (2017). https://doi.org/10.1007/s10916-017-0805-5

