Abstract
Optical tracking is widely used in surgical augmented reality (AR) systems because it provides relatively high accuracy over a large workspace. However, it requires a clear line of sight between the camera and the markers, which can be difficult to maintain. Inertial sensing, in contrast, does not require line of sight but is subject to drift, which causes large cumulative errors, especially in position. This paper proposes a sensor fusion approach for cases where only incomplete optical tracking information, such as the 3D position of a single marker, is available. In this approach, when the optical tracker provides full 6D pose information, it is used to estimate the biases of the inertial sensors. Then, as long as the optical system can track the position of at least one marker, that 3D position can be combined with the orientation estimated from the inertial measurements to recover the full 6D pose. Experiments were performed with a head-mounted display (HMD) that integrates an optical tracker and an inertial measurement unit (IMU). The results show that, with the sensor fusion approach, the 6D pose of the head with respect to the reference frame can still be estimated under partial occlusion. The results generalize to a conventional navigation setup, in which the inertial sensor would be co-located with the optical markers rather than with the camera.
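The core idea described above — substituting IMU-derived orientation for the missing optical orientation while at least one marker position remains visible — can be sketched as follows. This is an illustrative outline only, not the paper's implementation: the function names, the first-order gyro integration, and the stationary-interval bias estimate are assumptions chosen for brevity (the paper's actual bias-estimation and filtering details are not reproduced here).

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def estimate_gyro_bias(gyro_samples):
    """Crude bias estimate: average rate over an interval in which the
    optical tracker (providing full 6D pose) reports the body as stationary."""
    return np.mean(np.asarray(gyro_samples), axis=0)

def integrate_gyro(R, omega, bias, dt):
    """Propagate orientation R by one step of bias-corrected angular rate,
    using the Rodrigues formula for the incremental rotation."""
    w = (np.asarray(omega) - bias) * dt
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return R
    K = skew(w / theta)
    dR = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    return R @ dR

def fuse_pose(optical_position, imu_orientation):
    """Recover a full 6D pose (4x4 homogeneous transform) from the 3D
    position of a single visible marker and the IMU-estimated orientation."""
    T = np.eye(4)
    T[:3, :3] = imu_orientation
    T[:3, 3] = np.asarray(optical_position)
    return T
```

For example, during an occlusion the orientation would be propagated with `integrate_gyro` at each IMU sample and combined with the last visible marker position via `fuse_pose`; whenever the full optical pose returns, the bias (and the drifted orientation) would be re-initialized.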
© 2014 Springer International Publishing Switzerland
He, C., Şen, H.T., Kim, S., Sadda, P., Kazanzides, P. (2014). Fusion of Inertial Sensing to Compensate for Partial Occlusions in Optical Tracking Systems. In: Linte, C.A., Yaniv, Z., Fallavollita, P., Abolmaesumi, P., Holmes, D.R. (eds) Augmented Environments for Computer-Assisted Interventions. AE-CAI 2014. Lecture Notes in Computer Science, vol 8678. Springer, Cham. https://doi.org/10.1007/978-3-319-10437-9_7
Print ISBN: 978-3-319-10436-2
Online ISBN: 978-3-319-10437-9