Refraction-compensated motion tracking of unrestrained small animals in positron emission tomography
Highlights
- We model refraction-induced position and pose error in stereo motion tracking.
- We examine the dependencies of this error and describe a method to correct for it.
- The correction improves image-based quantification in motion-compensated PET.
- The methods are applicable to commercial and custom-designed stereo tracking systems.
Introduction
Molecular imaging techniques such as positron emission tomography (PET) are used to study the early stages of neuropsychiatric disorders and other chronic diseases, typically in animal models. Because uncompensated motion degrades the reconstructed image, animals undergoing PET imaging are nearly always anaesthetised. However, anaesthesia not only disturbs the underlying biology, it also precludes study of an animal’s response to external stimuli or of the functional changes occurring as it interacts with its environment. This limits the potential of PET in preclinical investigations. Physical restraint (e.g. Javors et al., 2005) is a known stressor for species such as rats and mice (Ohata et al., 1981), so allowing animals freedom to move is preferable. This has led to the development of motion compensation techniques, which rely on accurate motion measurements, to enable PET imaging of unrestrained awake animals (Kyme et al., 2009, Weisenberger et al., 2008).
Stereo-optical motion tracking is a feasible and accurate way to measure the head pose of awake rodents having limited or full movement (Kyme et al., 2010, Kyme et al., 2009, Weisenberger et al., 2008). Awake rodents imaged in tubes are able to move their heads freely and respond to external stimuli, thus enabling a more diverse range of experiments compared to that possible using anaesthetised subjects. Open-ended tubes (Kyme et al., 2009) allow the head to be tracked directly but animals must be trained to remain near the end of the tube. Enclosed tubes (Weisenberger et al., 2008) have the advantage of confining the animal to the scanner field of view (FOV), however the tube must be transparent to allow tracking of the head by externally placed cameras. In contrast to tubes, a chamber environment is even more appealing for research because of the greater range of animal behaviours possible (Zhou et al., 2010). Of particular research interest is the ability to detect correlations between function and behaviour for a freely moving animal in such an enclosure (Shultz et al., 2011). Similarly to enclosed tubes, a chamber requires transparent walls through which motion tracking can occur.
For both enclosed tubes and chambers, light rays will refract at the surfaces of the transparent walls. This violates a basic assumption of the linear pinhole camera model – that the object point, camera centre and image point are collinear. Failure to account for this is expected to cause errors in pose estimation. However, investigations into the nature, extent and correction of refraction-based error in stereo pose measurements are scant in the literature. Therefore, our goals in this work were to (i) develop and validate a model to accurately predict stereo motion tracking errors caused by refraction at plane-parallel interfaces; (ii) explore the severity of refraction errors as a function of system parameters (e.g. interface thickness and tilt); (iii) derive the refraction error for position measurements of individual points and pose measurements of rigid ensembles of points; (iv) specify the conditions for which correction of refraction error may be necessary in small animal imaging; and (v) formulate and validate a method to accurately correct tracker measurements for refraction-based errors. Although the motivation for this work is awake animal PET, we anticipate that the methods may be applicable in other contexts where objects are tracked through a refractive medium.
Section snippets
Related work
In a linear projection model, three-dimensional (3D) object points map to the image plane along straight lines. However, when a scene contains reflective and/or refractive objects, the light path may be treated as piece-wise linear and the contiguous segments are related by well-known physical relationships such as the law of reflection and Snell’s Law of refraction (Kutulakos and Steger, 2005). In this case the linear projection model is not valid (Treibitz et al., 2008). Three areas where
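The piecewise-linear light path described above can be traced segment by segment, with each change of direction given by Snell’s Law. A minimal sketch of the vector form of Snell’s Law (illustrative only, not the authors’ implementation; the air-to-acrylic example assumes a refractive index of about 1.49 for Perspex):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n, passing
    from a medium of refractive index n1 into index n2 (vector form of
    Snell's Law). Returns the refracted unit direction, or None if total
    internal reflection occurs."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    if cos_i < 0:                      # normal faces away; flip it
        n, cos_i = -n, -cos_i
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0:
        return None                    # total internal reflection
    return r * d + (r * cos_i - np.sqrt(k)) * n

# Air -> acrylic (n ~ 1.49) at 30 degrees incidence, in the x-z plane
theta_i = np.deg2rad(30.0)
d_in = np.array([np.sin(theta_i), 0.0, -np.cos(theta_i)])
d_out = refract(d_in, np.array([0.0, 0.0, 1.0]), 1.0, 1.49)
theta_t = np.arcsin(np.sin(theta_i) / 1.49)   # Snell: sin(theta_t) = (n1/n2) sin(theta_i)
```

Each segment of the light path between camera and object can be propagated this way, one interface at a time.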
Refraction model
We consider a tracking system comprising two calibrated cameras, each modelled as a pinhole system with infinite CCD resolution and no sources of noise. A smooth, flat transparent interface with uniform thickness (i.e. plane-parallel) and uniform refractive index exists between the tracking system and target point. Therefore, two changes in ray direction occur along the light path from object to tracker, namely, at the front and back surfaces of the interface. We further assume that the
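For a plane-parallel interface, the two refractions leave the ray direction unchanged but displace it laterally; the standard thin-slab result is d = t·sin(θᵢ − θₜ)/cos θₜ for thickness t, incidence angle θᵢ and refraction angle θₜ. A small sketch of this relationship (the 5 mm thickness and n ≈ 1.49 are taken, respectively, from the validation experiment and a nominal acrylic index; not the authors’ code):

```python
import numpy as np

def lateral_shift(theta_i_deg, t_mm, n_slab, n_ambient=1.0):
    """Lateral displacement (mm) of a ray crossing a plane-parallel slab
    of thickness t_mm: d = t * sin(theta_i - theta_t) / cos(theta_t)."""
    ti = np.deg2rad(theta_i_deg)
    tt = np.arcsin(np.sin(ti) * n_ambient / n_slab)   # Snell's Law
    return t_mm * np.sin(ti - tt) / np.cos(tt)

# e.g. a 5 mm acrylic interface viewed at 40 degrees incidence
shift = lateral_shift(40.0, 5.0, 1.49)   # on the order of 1 mm
```

The displacement grows with both slab thickness and incidence angle (i.e. interface tilt), which is why these are the key parameters examined in the error analysis.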
Model validation
Rotational components of the motion applied to the 5 mm thick interface are shown in Fig. 5. The motion was oscillatory in nature, ranging in amplitude from −60° to +40° about x and −40° to +50° about y. Motion of the 1.1 mm thick interface (not shown) was similar. Fig. 6 and Table 1 indicate excellent agreement (D < 0.1 mm for x and y) between measured and simulated target point locations for these moving interfaces. Agreement was slightly worse in z due to increased jitter of the raw tracker
Discussion
A method to characterise and correct for the refraction-induced error in pose estimates derived from stereo vision systems has been developed. To our knowledge, this error has not been characterised previously nor has a correction method been described that does not rely on detailed photogrammetric knowledge of the cameras. We address the particular case of refraction at a plane-parallel interface between tracker and object. The underlying model assumes pinhole camera geometry which is a close
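Pose estimates of this kind are typically obtained as the least-squares rigid transform aligning a marker ensemble across frames; Horn (1987, cited in the references) solves this absolute-orientation problem in closed form with unit quaternions. The equivalent SVD-based variant is sketched below (illustrative, with hypothetical marker coordinates; refraction-displaced point measurements fed into such a solver are what produce the biased poses discussed above):

```python
import numpy as np

def rigid_pose(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P (N x 3 rows)
    onto Q. SVD variant of the closed-form absolute-orientation solution."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

# Rigid ensemble of 4 markers; apply a known 10-degree rotation about z
P = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
a = np.deg2rad(10.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0,          0,         1]])
Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])
R_est, t_est = rigid_pose(P, Q)               # recovers the applied pose
```

With noise-free, refraction-free points the applied pose is recovered exactly; perturbing the individual point positions (as refraction does) biases both R and t, which is the error the correction method targets.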
Conclusion
Pose measurements obtained using stereo vision systems can be in error when the target is observed through a refractive medium. The potential impact of this error should not be overlooked in applications such as motion-compensated PET of small animals where motion needs to be known with sub-millimeter and sub-degree accuracy. Characterising and correcting for the error using the methods we describe led to improved quantitative performance in microPET imaging studies. Since the methods are
Acknowledgements
We would like to thank Ahmad Kolahi from Claron Technology Inc. for useful discussions regarding the MicronTracker optics and calibration, Dr Sergio Leon-Saval for helping us to isolate the problem with our Perspex™ interfaces and providing optical quality test interfaces, and Professor Ruibin Zhang for discussions regarding the refraction model. The work was supported by Australian Research Council Discovery Grants DP0663519 and DP0988166.
References (42)
- et al., Three-dimensional videography of swimming with panning periscopes, J. Biomech. (1996)
- et al., The design and implementation of a motion correction scheme for neurological PET, Phys. Med. Biol. (2003)
- et al., Interactive rendering of non-constant, refractive media using the ray equations of gradient-index optics, Eurograph. Symp. Render. (2010)
- Chari, V., Sturm, P., 2009. Multi-view geometry of the refractive plane. In: Proc. 20th Brit. Mach. Vis. Conf., pp....
- A factorisation method for the 3-D X-ray transform, Inverse Probl. (1995)
- Fulton, R., Nickel, I., Tellmann, L., Meikle, S., Pietrzyk, U., Herzog, H., 2003. Event-by-event motion compensation in...
- Hata, S., Saitoh, Y., Kumamura, S., Kaida, K., 1996. Shape extraction of transparent object using genetic algorithm....
- Reconstruction of the underwater object, Photogramm. Eng. (1971)
- Closed-form solution of absolute orientation using unit quaternions, J. Opt. Soc. Am. A (1987)
- et al., Accelerated image reconstruction using ordered subsets of projection data, IEEE Trans. Med. Imag. (1994)
- Eikonal rendering: efficient light transport in refractive objects, ACM Trans. Graph.
- Rat breathalyzer, Alcohol: Clin. Exp. Res.
- Object plane deformation due to refraction in two-dimensional underwater motion analysis, J. Appl. Biomech.
- Effects of light refraction on the accuracy of camera calibration and reconstruction in underwater motion analysis, Sports Biomech.
- Real-time 3D motion tracking for small animal brain PET, Phys. Med. Biol.
- Optimised motion tracking for positron emission tomography studies of brain function in awake rats, PLoS ONE
Cited by (2)
- Markerless motion estimation for motion-compensated clinical brain imaging, 2018, Physics in Medicine and Biology
- Stereo Image Based Motion Measurements in Fluids: Experimental Validation and Application in Friction Extrusion, 2015, Experimental Mechanics