Abstract
Purpose
Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To confidently advance surgical tools through complex bony structures during these procedures, a large number of images is acquired. While image guidance is the de facto standard for ensuring acceptable outcomes, presenting these images on monitors far from the surgical site makes it difficult to associate their information content with the 3D patient anatomy.
Methods
In this article, we propose a collaborative augmented reality (AR) surgical ecosystem that co-localizes the C-arm X-ray source and the surgeon's viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping.
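The hand-eye calibration in contribution (1) estimates the fixed transform X between the tracker and the X-ray source from pose pairs satisfying AX = XB, where A and B are relative motions of the two rigidly linked frames. As a rough illustrative sketch (not the authors' implementation; all function names are our own), the rotation of X can be recovered from the rotation axes of the motion pairs via an SVD, and the translation by stacked linear least squares:

```python
import numpy as np

def rot(axis, angle):
    """Rodrigues' formula: rotation matrix from axis and angle."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def rotation_axis(R):
    """Unit rotation axis from the antisymmetric part of R (angle < pi)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def hand_eye(As, Bs):
    """Solve A_i X = X B_i for the fixed 4x4 transform X.

    A_i, B_i are 4x4 relative motions of the two rigidly linked
    frames (e.g., visual tracker and X-ray source) between pose pairs.
    """
    # Rotation: the axes satisfy axis(A_i) = R_X axis(B_i); solve the
    # resulting Wahba/Procrustes problem with an SVD.
    M = sum(np.outer(rotation_axis(A[:3, :3]), rotation_axis(B[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    R_X = U @ np.diag([1, 1, np.linalg.det(U @ Vt)]) @ Vt
    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```

At least two motion pairs with non-parallel rotation axes are required for a unique solution; in practice many noisy pose pairs are averaged, consistent with the convergence behavior reported in the Results.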
Results
We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and \(0.26^\circ \), respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in the real and virtual environments was 10.8 mm.
Conclusions
The proposed AR solution provides a shared augmented experience between the human and X-ray viewers. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons and to intuitively inform C-arm technologists for prospective X-ray viewpoint planning.
Acknowledgements
Research in this work was supported in part by a Graduate Student Fellowship from the Johns Hopkins Applied Physics Laboratory; NIH R01 EB023939, R21 EB020113, and R01 EB016703; Johns Hopkins University internal funding sources; and the NVIDIA Corporation through the donation of the GPUs used for this research. The authors thank Gerhard Kleinzig and Sebastian Vogt from Siemens Healthineers for their support and for making a Siemens ARCADIS Orbic 3D available.
Ethics declarations
Disclaimer
The concepts and information presented in this paper are based on research and are not commercially available.
Conflict of interest
The authors have no conflict of interest to declare.
Informed consent
This article does not contain human subjects research.
Cite this article
Fotouhi, J., Unberath, M., Song, T. et al. Co-localized augmented human and X-ray observers in collaborative surgical ecosystem. Int J CARS 14, 1553–1563 (2019). https://doi.org/10.1007/s11548-019-02035-8