Abstract
Purpose
We report on the development and accuracy assessment of a hybrid tracking system that integrates optical spatial tracking into a video pass-through head-mounted display (HMD).
Methods
The hybrid system uses a dual-tracked co-calibration apparatus to register the origin of an optical dynamic reference frame to that of the VIVE Pro controller through a point-based registration. This registration expresses the location of optically tracked tools with respect to the VIVE controller's origin and thus within the VIVE's tracking system.
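To make the point-based registration step concrete, the following is a minimal sketch of a closed-form paired-point rigid registration (an SVD/Kabsch solution; Horn's quaternion method is an equivalent closed-form alternative). The function names, the NumPy implementation, and the choice of frames are illustrative assumptions rather than the paper's implementation; the inputs would be corresponding fiducial positions collected by the co-calibration apparatus in the optical reference-frame coordinates and in the VIVE controller coordinates.

```python
import numpy as np

def paired_point_registration(src, dst):
    """Estimate the rigid transform (R, t) mapping src points onto dst points
    in a least-squares sense (closed-form SVD/Kabsch solution).

    src, dst : (N, 3) arrays of corresponding fiducial positions, e.g. points
    expressed in the optical DRF frame and the matching points in the VIVE
    controller frame (illustrative frames only).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_centroid = src.mean(axis=0)
    dst_centroid = dst.mean(axis=0)
    H = (src - src_centroid).T @ (dst - dst_centroid)    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                    # proper rotation, det(R) = +1
    t = dst_centroid - R @ src_centroid
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """Root-mean-square residual of the registration over the paired points."""
    residuals = (np.asarray(src) @ R.T) + t - np.asarray(dst)
    return np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean())
```

Once such a transform is estimated, a tool pose reported in the optical frame can be composed with it to obtain the corresponding pose in the VIVE controller frame.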
Results
The positional accuracy was assessed using a CNC machine to collect a grid of points, with 25 samples per location. The positional trueness and precision of the hybrid tracking system were 0.48 mm and 0.23 mm, respectively. The rotational accuracy was assessed by inserting a stylus, tracked by all three systems, into a hemispherical phantom with cylindrical openings at known angles and collecting 25 samples per cylinder for each system. The rotational trueness and precision of the hybrid tracking system were 0.64° and 0.05°, respectively. The differences in positional and rotational trueness between the optical tracking system (OTS) and the hybrid tracking system were 0.27 mm and 0.04°, respectively.
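As an illustration of how trueness and precision could be summarized from repeated samples at one grid location, the sketch below computes the distance between the mean measured position and a ground-truth reference (trueness) and the RMS scatter of the samples about their mean (precision), plus an angular error between a measured stylus axis and a known cylinder axis. The function names, the exact statistical definitions, and the synthetic data are illustrative assumptions; the paper's full protocol and aggregation across the grid are described in the main text.

```python
import numpy as np

def positional_trueness_precision(samples, reference):
    """Summarize repeated measurements at one grid location.

    samples   : (N, 3) tracked positions recorded at the location (N = 25 here).
    reference : (3,) ground-truth position of the location, e.g. from the CNC
                machine, expressed in the same frame as the samples.

    Trueness is taken as the distance between the mean measured position and
    the reference; precision as the RMS scatter of the samples about their mean.
    """
    samples = np.asarray(samples, dtype=float)
    mean_pos = samples.mean(axis=0)
    trueness = np.linalg.norm(mean_pos - np.asarray(reference, dtype=float))
    precision = np.sqrt((np.linalg.norm(samples - mean_pos, axis=1) ** 2).mean())
    return trueness, precision

def angular_error_deg(measured_axis, reference_axis):
    """Angle (degrees) between a measured stylus axis and a known cylinder axis."""
    u = measured_axis / np.linalg.norm(measured_axis)
    v = reference_axis / np.linalg.norm(reference_axis)
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

# Illustrative usage with synthetic data: 25 noisy samples around one reference point.
rng = np.random.default_rng(0)
reference = np.array([10.0, 20.0, 30.0])
samples = reference + rng.normal(scale=0.2, size=(25, 3))
print(positional_trueness_precision(samples, reference))
```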
Conclusions
We developed a hybrid tracking system that allows the pose of optically tracked surgical instruments to be known within a first-person HMD visualization system, achieving submillimeter accuracy. This research validated the positional and rotational accuracy of the hybrid tracking system and, in the process, of the underlying optical tracking and VIVE tracking systems. This work provides a method to determine the position of an optically tracked surgical tool with surgically acceptable accuracy within a low-cost, commercial-grade video pass-through HMD. The hybrid tracking system provides the foundation for the continued development of virtual reality and augmented virtuality surgical navigation systems for training and practicing surgical techniques.






Acknowledgements
The authors would like to acknowledge Mr. John Moore of the Robarts Research Institute for his technical assistance. Funding: This study was funded by the Canada Foundation for Innovation (20994), the Ontario Research Fund (IDCD), and the Canadian Institutes of Health Research (FDN 201409).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed consent
This article does not contain patient data. For this type of study, formal consent is not required.
Cite this article
Groves, L.A., Carnahan, P., Allen, D.R. et al. Accuracy assessment for the co-registration between optical and VIVE head-mounted display tracking. Int J CARS 14, 1207–1215 (2019). https://doi.org/10.1007/s11548-019-01992-4