
Medical Image Analysis

Volume 16, Issue 6, August 2012, Pages 1317-1328

Refraction-compensated motion tracking of unrestrained small animals in positron emission tomography

https://doi.org/10.1016/j.media.2012.04.005

Abstract

Motion-compensated radiotracer imaging of fully conscious rodents represents an important paradigm shift for preclinical investigations. In such studies, if motion tracking is performed through a transparent enclosure containing the awake animal, light refraction at the interface will introduce errors in stereo pose estimation. We have performed a thorough investigation of how this impacts the accuracy of pose estimates and the resulting motion correction, and developed an efficient method to predict and correct for refraction-based error. The refraction model underlying this study was validated using a state-of-the-art motion tracking system. Refraction-based error was shown to be dependent on tracking marker size, working distance, and interface thickness and tilt. Correcting for refraction error improved the spatial resolution and quantitative accuracy of motion-corrected positron emission tomography images. Since the methods are general, they may also be useful in other contexts where data are corrupted by refraction effects.

Highlights

  • We model refraction-induced position and pose error in stereo motion tracking.
  • We examine the dependencies of this error and describe a method to correct for it.
  • The correction improves image-based quantification in motion-compensated PET.
  • The methods are applicable to commercial and custom-designed stereo tracking systems.

Introduction

Molecular imaging techniques such as positron emission tomography (PET) are used to study the early stages of neuropsychiatric disorders and other chronic diseases, typically in animal models. Due to the negative impact of uncompensated motion on the image reconstruction, animals undergoing PET imaging are nearly always anaesthetised. However, anaesthesia not only disturbs the underlying biology, it precludes study of an animal’s response to external stimuli or the functional changes occurring as it interacts with its environment. This limits the potential of PET in preclinical investigations. The use of physical restraint (e.g. Javors et al., 2005) is a known stressor for species such as rats and mice (Ohata et al., 1981), therefore allowing animals freedom to move is preferable. This has led to the development of motion compensation techniques, relying on accurate motion measurements, to enable PET imaging of unrestrained awake animals (Kyme et al., 2009, Weisenberger et al., 2008).

Stereo-optical motion tracking is a feasible and accurate way to measure the head pose of awake rodents having limited or full movement (Kyme et al., 2010, Kyme et al., 2009, Weisenberger et al., 2008). Awake rodents imaged in tubes are able to move their heads freely and respond to external stimuli, thus enabling a more diverse range of experiments than is possible with anaesthetised subjects. Open-ended tubes (Kyme et al., 2009) allow the head to be tracked directly, but animals must be trained to remain near the end of the tube. Enclosed tubes (Weisenberger et al., 2008) have the advantage of confining the animal to the scanner field of view (FOV); however, the tube must be transparent to allow tracking of the head by externally placed cameras. In contrast to tubes, a chamber environment is even more appealing for research because of the greater range of animal behaviours possible (Zhou et al., 2010). Of particular research interest is the ability to detect correlations between function and behaviour for a freely moving animal in such an enclosure (Schulz et al., 2011). Similarly to enclosed tubes, a chamber requires transparent walls through which motion tracking can occur.

For both enclosed tubes and chambers, light rays will refract at the surfaces of the transparent walls. This violates a basic assumption of the linear pinhole camera model – that the object point, camera centre and image point are collinear. Failure to account for this is expected to cause errors in pose estimation. However, investigations into the nature, extent and correction of refraction-based error in stereo pose measurements are scant in the literature. Therefore, our goals in this work were to (i) develop and validate a model to accurately predict stereo motion tracking errors caused by refraction at plane-parallel interfaces; (ii) explore the severity of refraction errors as a function of system parameters (e.g. interface thickness and tilt); (iii) derive the refraction error for position measurements of individual points and pose measurements of rigid ensembles of points; (iv) specify the conditions for which correction of refraction error may be necessary in small animal imaging; and (v) formulate and validate a method to accurately correct tracker measurements for refraction-based errors. Although the motivation for this work is awake animal PET, we anticipate that the methods may be applicable in other contexts where objects are tracked through a refractive medium.

Section snippets

Related work

In a linear projection model, three-dimensional (3D) object points map to the image plane along straight lines. However, when a scene contains reflective and/or refractive objects, the light path may be treated as piece-wise linear and the contiguous segments are related by well-known physical relationships such as the law of reflection and Snell’s Law of refraction (Kutulakos and Steger, 2005). In this case the linear projection model is not valid (Treibitz et al., 2008). Three areas where
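The piece-wise linear light path described above can be illustrated with a minimal sketch of Snell's Law in vector form (a standard textbook formulation, not the authors' implementation; the Perspex-like index of 1.49 is an assumed illustrative value):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n
    (pointing toward the incident side), using the vector form of
    Snell's Law. Returns the refracted unit direction, or None if
    total internal reflection occurs."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    eta = n1 / n2
    cos_i = -np.dot(d, n)               # cosine of the incidence angle
    sin2_t = eta**2 * (1.0 - cos_i**2)  # sin^2 of the transmitted angle
    if sin2_t > 1.0:                    # total internal reflection
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# Example: ray entering a Perspex-like medium (n ~ 1.49) from air
# at 30 degrees incidence; the refracted ray bends toward the normal.
d = np.array([np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))])
n = np.array([0.0, 0.0, 1.0])
t = refract(d, n, 1.0, 1.49)
```

Each refractive surface in the scene applies one such direction change, so the full object-to-camera path is a chain of straight segments joined by calls of this kind.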

Refraction model

We consider a tracking system comprising two calibrated cameras, each modelled as a pinhole system with infinite CCD resolution and no sources of noise. A smooth, flat transparent interface with uniform thickness (i.e. plane-parallel) and uniform refractive index exists between the tracking system and target point. Therefore, two changes in ray direction occur along the light path from object to tracker, namely, at the front and back surfaces of the interface. We further assume that the
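Because the two surfaces of a plane-parallel interface are parallel, the two refractions cancel in direction and the net effect is a lateral displacement of the ray. A sketch of the standard slab-displacement formula follows (the 5 mm thickness and index 1.49 are illustrative assumptions, not the paper's calibrated values):

```python
import math

def lateral_shift(thickness, n_slab, theta_i_deg, n_medium=1.0):
    """Lateral displacement (same units as thickness) of a ray crossing
    a plane-parallel slab: d = t * sin(ti - tt) / cos(tt), where the
    transmitted angle tt follows from Snell's Law."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(n_medium * math.sin(ti) / n_slab)
    return thickness * math.sin(ti - tt) / math.cos(tt)

# e.g. a 5 mm slab with an assumed index of 1.49, viewed at 30 degrees
# incidence, shifts the apparent ray by roughly a millimetre.
shift = lateral_shift(5.0, 1.49, 30.0)
```

At normal incidence the shift vanishes, which is consistent with the intuition that refraction error grows with interface tilt relative to the camera's line of sight.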

Model validation

Rotational components of the motion applied to the 5 mm thick interface are shown in Fig. 5. The motion was oscillatory in nature, ranging in amplitude from −60° to +40° about x and −40° to +50° about y. Motion of the 1.1 mm thick interface (not shown) was similar. Fig. 6 and Table 1 indicate excellent agreement (D < 0.1 mm for x and y) between measured and simulated target point locations for these moving interfaces. Agreement was slightly worse in z due to increased jitter of the raw tracker

Discussion

A method to characterise and correct for the refraction-induced error in pose estimates derived from stereo vision systems has been developed. To our knowledge, this error has not been characterised previously nor has a correction method been described that does not rely on detailed photogrammetric knowledge of the cameras. We address the particular case of refraction at a plane-parallel interface between tracker and object. The underlying model assumes pinhole camera geometry which is a close

Conclusion

Pose measurements obtained using stereo vision systems can be in error when the target is observed through a refractive medium. The potential impact of this error should not be overlooked in applications such as motion-compensated PET of small animals where motion needs to be known with sub-millimeter and sub-degree accuracy. Characterising and correcting for the error using the methods we describe led to improved quantitative performance in microPET imaging studies. Since the methods are

Acknowledgements

We would like to thank Ahmad Kolahi from Claron Technology Inc. for useful discussions regarding the MicronTracker optics and calibration, Dr Sergio Leon-Saval for helping us to isolate the problem with our Perspex™ interfaces and providing optical quality test interfaces, and Professor Ruibin Zhang for discussions regarding the refraction model. The work was supported by Australian Research Council Discovery Grants DP0663519 and DP0988166.

References (42)

  • T. Yanai et al.

    Three-dimensional videography of swimming with panning periscopes

    J. Biomech.

    (1996)
  • P.M. Bloomfield et al.

    The design and implementation of a motion correction scheme for neurological PET

    Phys. Med. Biol.

    (2003)
  • C. Cao et al.

    Interactive rendering of non-constant, refractive media using the ray equations of gradient-index optics

    Eurograph. Symp. Render.

    (2010)
  • Chari, V., Sturm, P., 2009. Multi-view geometry of the refractive plane. In: Proc. 20th Brit. Mach. Vis. Conf., pp....
  • M. Defrise

    A factorisation method for the 3-D X-ray transform

    Inverse Probl.

    (1995)
  • Fulton, R., Nickel, I., Tellmann, L., Meikle, S., Pietrzyk, U., Herzog, H., 2003. Event-by-event motion compensation in...
  • Hata, S., Saitoh, Y., Kumamura, S., Kaida, K., 1996. Shape extraction of transparent object using genetic algorithm....
  • J. Hohle

    Reconstruction of the underwater object

    Photogramm. Eng.

    (1971)
  • B.K.P. Horn

    Closed-form solution of absolute orientation using unit quaternions

    J. Opt. Soc. Am. A

    (1987)
  • H.M. Hudson et al.

    Accelerated image reconstruction using ordered subsets of projection data

    IEEE Trans. Med. Imag.

    (1994)
  • I. Ihrke et al.

    Eikonal rendering: efficient light transport in refractive objects

    ACM Trans. Graph.

    (2007)
  • Ihrke, I., Kutulakos, K., Lensch, H., Magnor, M., Heidrich, W., 2008. State of the art in transparent and specular...
  • M.A. Javors et al.

    Rat breathalyzer

    Alcohol: Clin. Exp. Res.

    (2005)
  • Kutulakos, K., Steger, E., 2005. A theory of refractive and specular 3D shape by light-path triangulation. Proc. 10th...
  • Y.-H. Kwon

    Object plane deformation due to refraction in two-dimensional underwater motion analysis

    J. Appl. Biomech.

    (1999)
  • Kwon, Y.-H., 1999b. A camera calibration algorithm for underwater motion analysis. In: Proc. XVII Int. Symp. Biomech....
  • Y.-H. Kwon et al.

    Effects of light refraction on the accuracy of camera calibration and reconstruction in underwater motion analysis

    Sports Biomech.

    (2006)
  • A. Kyme et al.

    Real-time 3D motion tracking for small animal brain PET

    Phys. Med. Biol.

    (2008)
  • Kyme, A., Zhou, V., Meikle, S., Popovic, K., Man, J.-P., Akhtar, M., Karlsson, I., Fulton, R., 2009. Motion tracking...
  • Kyme, A., Meikle, S., Eisenhuth, J., Baldock, C., Fulton, R., 2010. An investigation of motion tracking for freely...
  • A. Kyme et al.

    Optimised motion tracking for positron emission tomography studies of brain function in awake rats

    PLoS ONE

    (2011)