Abstract
Purpose
Surgeons tend to ignore information presented in the operating room (OR) when its volume exceeds human cognitive capacity. We developed an augmented reality (AR) system for dental implant surgery that acts as an automatic information filter, selectively displaying only the information relevant to the current situation. The purpose is to reduce information overload and to offer intuitive image guidance. The system was evaluated in a pig cadaver experiment.
Methods
Information filtering is implemented via rule-based situation interpretation using description logics. The interpretation is based on intraoperative measurement of distances between anatomical structures and the dental drill, obtained with optical tracking. For AR, a head-mounted display is used, calibrated with a novel method based on SPAAM. To accommodate surgeon-specific preferences, we offer two alternative display formats: one using static AR and another using contact-analog AR.
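The distance-based situation interpretation can be illustrated with a minimal sketch. The published system uses description logics for its rules; the version below approximates the idea with plain threshold checks, and all thresholds, phase names, and point coordinates are illustrative assumptions, not values from the paper.

```python
import math

def distance(a, b):
    """Euclidean distance between two tracked 3D points (in mm)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def interpret_phase(drill_tip, implant_site, nerve_canal):
    """Classify the surgical phase from drill-to-structure distances.

    Thresholds and phase names are hypothetical placeholders; the actual
    system encodes such rules in a description-logic knowledge base.
    """
    d_site = distance(drill_tip, implant_site)
    d_nerve = distance(drill_tip, nerve_canal)
    if d_nerve < 2.0:
        return "critical-proximity"   # drill dangerously close to the nerve canal
    if d_site < 1.0:
        return "drilling"             # drill engaged at the planned implant site
    if d_site < 15.0:
        return "approach"             # drill approaching the site
    return "idle"                     # drill away from the surgical field

# Example: drill tip 0.5 mm from the planned site, 30 mm from the nerve
print(interpret_phase((0.0, 0.0, 0.5), (0.0, 0.0, 0.0), (30.0, 0.0, 0.0)))
# → drilling
```

Recognized phases then drive the information filter: only the overlays associated with the current phase are shown on the head-mounted display.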
Results
The system made the surgery easier and showed ergonomic benefits, as assessed by a questionnaire. All relevant phases were recognized reliably. The new calibration method showed significant improvements, and the deviation of the realized implants was below 2.5 mm.
Conclusion
The system allowed the surgeon to concentrate fully on the surgery itself. It offered greater flexibility, since the surgeon received all relevant information while remaining free to deviate from it. The accuracy of the realized implants remains an open issue and is part of future work.
Acknowledgments
The present research is supported by the German Research Foundation (Research Grant DI 330/23-1) and is part of the "SFB TRR 125 Cognition-Guided Surgery" funded by the German Research Foundation. It is furthermore sponsored by the European Social Fund of the State of Baden-Wuerttemberg.
Conflict of interest
Darko Katić, Patrick Spengler, Sebastian Bodenstedt, Gregor Castrillon-Oberndorfer, Robin Seeberger, Juergen Hoffmann, Ruediger Dillmann and Stefanie Speidel declare that they have no conflict of interest.
Cite this article
Katić, D., Spengler, P., Bodenstedt, S. et al. A system for context-aware intraoperative augmented reality in dental implant surgery. Int J CARS 10, 101–108 (2015). https://doi.org/10.1007/s11548-014-1005-0