
A practical marker-less image registration method for augmented reality oral and maxillofacial surgery

  • Original Article
  • Published:
International Journal of Computer Assisted Radiology and Surgery

Abstract

Background

Image registration lies at the core of augmented reality (AR): it aligns the virtual scene with reality. In AR surgical navigation, the performance of image registration is therefore vital to the surgical outcome.

Methods

This paper presents a practical marker-less image registration method for AR-guided oral and maxillofacial surgery, in which a virtual scene is generated and mixed with reality to guide the surgical operation or to visualize the surgical outcome as a video see-through overlay. An intraoral 3D scanner is employed to acquire the patient’s teeth shape model intraoperatively. The shape model is then registered with a custom-made stereo camera system using a novel 3D stereo matching algorithm, and with the patient’s CT-derived 3D model using an iterative closest point (ICP) scheme. By leveraging the intraoral 3D scanner, the CT space and the stereo camera space are linked, so that surrounding anatomical models and virtual implants can be overlaid on the camera’s view to achieve AR surgical navigation.
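The abstract only outlines these registration steps. As a rough illustration rather than the authors' implementation, the Python sketch below uses Open3D's stock point-to-point ICP for both the scan-to-camera and scan-to-CT registrations (standing in for the paper's 3D stereo matching algorithm and its ICP scheme, respectively) and then chains the two rigid transforms so that CT-space models land in the camera frame. The file names and the icp_register helper are hypothetical.

    import numpy as np
    import open3d as o3d

    def icp_register(source_pcd, target_pcd, init=np.eye(4), threshold=1.0):
        """Rigidly register source to target with point-to-point ICP.

        Returns a 4x4 homogeneous transform mapping source coordinates
        into the target coordinate frame.
        """
        result = o3d.pipelines.registration.registration_icp(
            source_pcd, target_pcd, threshold, init,
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation

    # Hypothetical inputs: point clouds of the same teeth surface expressed
    # in three different coordinate frames.
    teeth_scan = o3d.io.read_point_cloud("intraoral_scan.ply")            # intraoral scanner frame
    teeth_ct = o3d.io.read_point_cloud("ct_teeth_model.ply")              # CT frame
    teeth_stereo = o3d.io.read_point_cloud("stereo_reconstruction.ply")   # stereo camera frame

    # Scan -> camera and scan -> CT registrations.
    T_cam_from_scan = icp_register(teeth_scan, teeth_stereo)
    T_ct_from_scan = icp_register(teeth_scan, teeth_ct)

    # Chain the two results: CT-space models (anatomy, virtual implants)
    # can now be expressed in the camera frame and overlaid on the video.
    T_cam_from_ct = T_cam_from_scan @ np.linalg.inv(T_ct_from_scan)

    implant = o3d.io.read_point_cloud("virtual_implant_ct_space.ply")
    implant.transform(T_cam_from_ct)  # ready to project onto the camera view

The key design point the abstract describes is that the intraoral scan acts as the common intermediary: neither the CT model nor the camera images need attached fiducials, because both are registered to the same teeth surface.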

Results

Jaw phantom experiments were performed to evaluate the target registration error of the overlay, yielding an average error of less than 0.50 mm with a registration time of less than 0.5 s. A volunteer trial was also conducted to demonstrate clinical feasibility.
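Target registration error in such phantom studies is commonly computed as the distance between target landmarks mapped through the estimated transform and their measured positions. The minimal sketch below (hypothetical helper and data, not the authors' evaluation code) averages this error over all targets.

    import numpy as np

    def mean_tre(targets_ct, targets_measured, T_cam_from_ct):
        """Mean target registration error in mm.

        targets_ct:       (N, 3) target landmarks in CT space
        targets_measured: (N, 3) the same landmarks measured in the camera frame
        T_cam_from_ct:    (4, 4) estimated CT -> camera transform
        """
        homog = np.hstack([targets_ct, np.ones((targets_ct.shape[0], 1))])
        mapped = (T_cam_from_ct @ homog.T).T[:, :3]
        return np.linalg.norm(mapped - targets_measured, axis=1).mean()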

Conclusions

The proposed registration method does not rely on any external fiducial markers attached to the patient. It runs automatically to keep the AR scene correctly aligned, overcoming the misalignment caused by patient movement. It is therefore noninvasive and practical for oral and maxillofacial surgery.



Funding

This work was partially supported by the National Natural Science Foundation of China (Grant No. 61701014).

Author information

Corresponding author

Correspondence to Shuo Yang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of our institutional review board and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Wang, J., Shen, Y. & Yang, S. A practical marker-less image registration method for augmented reality oral and maxillofacial surgery. Int J CARS 14, 763–773 (2019). https://doi.org/10.1007/s11548-019-01921-5


Keywords

Navigation