Augmented visualization with depth perception cues to improve the surgeon’s performance in minimally invasive surgery

  • Original Article
  • Published in Medical & Biological Engineering & Computing

Abstract

Minimally invasive techniques, such as laparoscopy and radiofrequency ablation of tumors, bring important advantages to surgery: by minimizing incisions on the patient’s body, they can shorten hospitalization and reduce the risk of postoperative complications. Unfortunately, they also come with drawbacks for surgeons, who have only a restricted view of the operative field, accessed indirectly through the 2D images provided by a camera inserted into the body. Augmented reality gives surgeons an “X-ray vision” of the patient’s anatomy by visualizing the internal organs directly on the operative scene, freeing them from the task of mentally mapping CT image content onto the patient. We present a navigation system that supports surgeons in the preoperative and intraoperative phases, together with an augmented reality system that superimposes the virtual organs on the patient’s body and enriches them with depth and distance information. We implemented a combination of visual and audio cues that allows the surgeon to improve the precision of the intervention and to avoid the risk of damaging anatomical structures. The test scenarios demonstrated the efficacy and accuracy of the system. Moreover, tests in the operating room suggested some modifications to the tracking system to make it more robust to occlusions.

Graphical abstract: Augmented visualization in minimally invasive surgery.
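The abstract describes combining visual and audio cues to convey the distance between the tracked surgical instrument and the anatomical structures. The following is a minimal sketch, assuming a tracked tip position and a single target point, of how such cues could be driven from tracking data; the thresholds, colour mapping, and beep-rate mapping are illustrative assumptions and do not reproduce the paper’s implementation.

```python
# Sketch (illustrative only): map the distance between the tracked instrument
# tip and a target point to a traffic-light colour for the AR overlay and to
# a beep interval that shortens as the tip approaches the target.

import math

# Illustrative thresholds in millimetres (assumed values).
WARNING_DISTANCE_MM = 30.0   # start warning the surgeon
CRITICAL_DISTANCE_MM = 5.0   # imminent contact with the structure


def distance_mm(tip, target):
    """Euclidean distance between the instrument tip and the target point."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(tip, target)))


def visual_cue(d):
    """Overlay colour: green when far, yellow when close, red when critical."""
    if d > WARNING_DISTANCE_MM:
        return "green"
    if d > CRITICAL_DISTANCE_MM:
        return "yellow"
    return "red"


def audio_cue_interval(d):
    """Seconds between beeps: silent when far, faster beeps when closer."""
    if d > WARNING_DISTANCE_MM:
        return None  # no audio cue
    # Linearly shorten the interval from 1.0 s down to 0.1 s.
    ratio = max(d - CRITICAL_DISTANCE_MM, 0.0) / (WARNING_DISTANCE_MM - CRITICAL_DISTANCE_MM)
    return 0.1 + 0.9 * ratio


if __name__ == "__main__":
    tip = (102.4, -18.7, 55.0)      # hypothetical tracked tip position (mm)
    target = (110.0, -20.0, 60.0)   # hypothetical target lesion position (mm)
    d = distance_mm(tip, target)
    print(f"distance = {d:.1f} mm, colour = {visual_cue(d)}, "
          f"beep interval = {audio_cue_interval(d)}")
```

In a real navigation system the target point would come from the registered preoperative model and the tip position from the optical tracker, with the cues updated at the tracker’s frame rate.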




Author information

Corresponding author: Lucio Tommaso De Paolis.


Cite this article

De Paolis, L.T., De Luca, V. Augmented visualization with depth perception cues to improve the surgeon’s performance in minimally invasive surgery. Med Biol Eng Comput 57, 995–1013 (2019). https://doi.org/10.1007/s11517-018-1929-6


Keywords

Navigation