
Fast and accurate online calibration of optical see-through head-mounted display for AR-based surgical navigation using Microsoft HoloLens

  • Original Article

International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose:

The use of optical see-through head-mounted displays (OST-HMDs) in augmented reality (AR) has increased significantly in recent years, but aligning the virtual scene with physical reality remains a challenge. A fast and accurate OST-HMD calibration method is essential for AR in the medical field.

Methods:

We propose a fast online calibration procedure for an OST-HMD with the aid of an optical tracking system. Two 3D point sets are collected in this procedure: virtual points rendered in front of the observer's eyes and the corresponding points in optical tracking space. The rigid transformation between these two coordinate systems is solved to connect virtual and real space. Based on this procedure, an AR-based surgical navigation system was developed and used for experimental verification and a clinical trial.
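The abstract does not specify which solver is used for the transformation between the two 3D point sets; a standard choice for this kind of paired-point problem is an SVD-based least-squares rigid registration, sketched below (the function name and data layout are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Returns rotation R (3x3) and translation t (3,) such that
    dst ~= src @ R.T + t.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Centre both point sets on their centroids.
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance matrix of the centred sets.
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With exact correspondences (as in a calibration routine where each rendered point is matched to one tracked point), this recovers the transform in closed form, with no iterative optimization.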

Results:

A phantom experiment based on a 3D-printed skull was performed; the average root-mean-square error of the control points between the rendered object and the skull model was \(1.30 \pm 0.39\) mm, and the calibration procedure took less than 30 s. A clinical trial was also conducted to demonstrate feasibility in a real operating theatre.
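The root-mean-square error reported above can be computed from paired control points as the square root of the mean squared point-to-point distance; a minimal sketch (the paper's exact evaluation pipeline is not described, so the function below is an assumption about the standard metric):

```python
import numpy as np

def rmse(rendered, model):
    """RMS of Euclidean distances between corresponding control points.

    rendered, model: (N, 3) arrays of matched 3D points (e.g. in mm).
    """
    diff = np.asarray(rendered, dtype=float) - np.asarray(model, dtype=float)
    # Squared Euclidean distance per point pair, then root of the mean.
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))
```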

Conclusions:

The proposed calibration method does not rely on the camera of the OST-HMD and is easy to operate. The phantom experiment and the clinical case demonstrated the feasibility of our AR-based surgical navigation system and indicated its potential for clinical application.



Acknowledgements

The authors gratefully acknowledge the support provided by the National Key R&D Program of China (2017YFB1302903; 2017YFB1104100), National Natural Science Foundation of China (81971709; 81828003; M-0019; 82011530141), the Foundation of Ministry of Education of China Science and Technology Development Center (2018C01038), the Foundation of Science and Technology Commission of Shanghai Municipality (19510712200), and Shanghai Jiao Tong University Foundation on Medical and Technological Joint Science Research (ZH20182DA15, YG2019ZDA06, ZH2018QNA23).

Author information


Corresponding author

Correspondence to Xiaojun Chen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Qichang Sun and Yongfeng Mai have contributed equally to this work.


Cite this article

Sun, Q., Mai, Y., Yang, R. et al. Fast and accurate online calibration of optical see-through head-mounted display for AR-based surgical navigation using Microsoft HoloLens. Int J CARS 15, 1907–1919 (2020). https://doi.org/10.1007/s11548-020-02246-4


Keywords

Navigation