Abstract
Purpose
Neuronavigation systems making use of augmented reality (AR) have been the focus of much research over the past two decades. In recent years, there has been considerable interest in using mobile devices for AR in the operating room (OR). We propose a complete system that performs real-time AR video augmentation on a mobile device in the context of image-guided neurosurgery.
Methods
MARIN (mobile augmented reality interactive neuronavigation system) improves upon the state of the art in two respects: performance, enabling real-time augmentation, and interactivity, letting users manipulate the displayed data directly on the device. The system was tested in a user study with 17 subjects for qualitative and quantitative evaluation in the context of target localization, and it was brought into the OR for preliminary feasibility tests, where qualitative feedback from surgeons was obtained. A sketch of the core projection step behind such video augmentation is given below.
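The abstract does not detail MARIN's rendering pipeline, but the essence of mobile video augmentation is projecting tracked, patient-registered 3D anatomy into the live camera image. The following minimal sketch illustrates that step, assuming a calibrated camera (intrinsics of the kind obtained with a Zhang-style calibration) and a tracker-supplied camera pose; all names and values are hypothetical, not the authors' implementation.

```python
# Illustrative sketch only: project a tracked 3-D target (patient/world space,
# in mm) into the mobile camera image, given a 4x4 camera pose and 3x3 intrinsics.
import numpy as np

def project_target(target_mm, T_world_to_camera, K):
    """Return the (u, v) pixel coordinates of a 3-D point in world space."""
    p_world = np.append(target_mm, 1.0)       # homogeneous world-space point
    p_cam = T_world_to_camera @ p_world       # rigid transform from tracking/registration
    uvw = K @ p_cam[:3]                       # apply camera intrinsics
    return uvw[:2] / uvw[2]                   # perspective divide -> pixel coordinates

# Example with made-up numbers: a target 80 mm in front of the camera.
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)                                 # camera at the world origin, looking down +z
uv = project_target(np.array([10.0, -5.0, 80.0]), T, K)
print(uv)  # pixel location where the target overlay would be drawn
```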
Results
The results of the user study showed that MARIN performs significantly better in terms of both time (\(p < 0.0004\)) and accuracy (\(p < 0.04\)) for the task of target localization in comparison with a traditional image-guided neurosurgery (IGNS) navigation system. Further, MARIN's AR visualization was found to be more intuitive and allowed users to estimate target depth more easily.
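The abstract reports significance levels but not the specific test used. Assuming paired per-subject measurements (each of the 17 subjects performing the localization task with both MARIN and the conventional IGNS display), such a comparison could be computed as in the sketch below; the data, test choice, and variable names are placeholders, not the authors' analysis.

```python
# Illustrative sketch only: paired comparison of task times under two conditions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
time_ar = rng.normal(20.0, 5.0, 17)               # task time with MARIN (s), hypothetical
time_igns = time_ar + rng.normal(8.0, 4.0, 17)    # task time with IGNS (s), hypothetical

t_stat, p_ttest = stats.ttest_rel(time_igns, time_ar)   # paired t-test
w_stat, p_wilcoxon = stats.wilcoxon(time_igns, time_ar) # non-parametric alternative
print(p_ttest, p_wilcoxon)
```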
Conclusion
MARIN improves upon previously proposed mobile AR neuronavigation systems through its real-time performance, higher accuracy, full integration into the normal surgical workflow, and greater interactivity and customizability of the displayed information. The improvement in efficiency and usability over previous systems will facilitate bringing AR into the OR.







Acknowledgements
This work was supported by the Fonds de recherche du Québec - Nature et technologies and the Natural Sciences and Engineering Research Council of Canada.
Funding
Funding was provided by the Natural Sciences and Engineering Research Council of Canada (Grant Number n01573) and Fonds de Recherche du Québec - Nature et Technologies (Grant Number FE0223).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the University Human Research Ethics Committee (Certification Number: 30007443).
Informed consent
Informed consent was obtained from all participants included in the study.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Léger, É., Reyes, J., Drouin, S. et al. MARIN: an open-source mobile augmented reality interactive neuronavigation system. Int J CARS 15, 1013–1021 (2020). https://doi.org/10.1007/s11548-020-02155-6