
1 Introduction

In rheology, the deformation and flow properties of matter are studied using a rheometer, a device that applies forces to the medium. However, fluids with low viscosity and/or elasticity cannot be analyzed with this conventional technique and are assessed instead by microrheology, which consists of evaluating the behavior of tracer particles in the fluid via microscopic images. Microrheology can be passive, when the tracer particles move due to inherent thermal energy, or active, if external forces are applied to induce flow of the tracers. In this work, we consider only passive microrheology, which involves analyzing the Brownian motion of the tracer particles through tracking techniques applied to microscopy image sequences.

Particle tracking in microrheology deals with specific motion conditions and particle types, but with a broad range of image properties. Even though the tracer beads move randomly around a certain point, with low movement amplitude and, ideally, no drift, and have the same size and appearance, they can be immersed in fluids that create inhomogeneous backgrounds, be temporarily occluded, form unwanted agglomerates with other particles, or even be affected by light artifacts that change their apparent shape and color.

Microrheology studies do not require that all the particles in a given frame sequence are correctly identified and tracked, but their statistical relevance is enhanced if the mean squared displacements of many tracked particles are taken into account. Moreover, agglomerated particles should not be considered in such studies, since the forces involved in their interaction additionally influence their movement, which is then not purely Brownian [1].

There are many available tools for particle tracking in microrheology problems, such as the various adaptations and improvements of Crocker and Grier’s techniques [2], in programming languages and environments such as IDL (Exelis Visual Information Solutions, Boulder, Colorado, United States), Matlab (The MathWorks, Inc., Natick, Massachusetts, United States) and Python (Python Software Foundation, https://www.python.org/), or in tools such as the ImageJ plugin ParticleTracker [3]. These have a low level of automation, demanding the definition of multiple parameters about the dataset in use. Because researchers who work in microrheology studies are not necessarily proficient programmers, the task of particle tracking should be facilitated as much as possible, adding efficiency to the pipeline while also avoiding errors in the complex modulus computation, which is highly dependent on the accuracy of the tracking methods.

The main contributions of this work are the following: the design of more robust tracking methods for microscopic images; the creation of a pipeline that automatically deals with different microrheology settings; the extension of an existing synthetic database [4] by including more conditions that may appear in real images of microrheology, such as drift and large wavelength noise.

2 Particle Tracking

Particle tracking involves finding the trajectories of particles along a sequence of images, and is often divided into two steps - particle detection and linking - which we also address separately. In this work, we aim to automate and standardize the process of particle tracking in microrheology image sequences coming from different settings, in order to facilitate the studies that experts of the field conduct.

To achieve this goal, we start by performing pre-processing to ensure that images have proper contrast and to attenuate background noise. Afterwards, calibration is performed on one of the frames, in order to infer whether particles are darker or lighter than the background and to estimate their radius, a parameter that is crucial in the following steps. We also find a relatively small but robust set of search regions, which are important when the method described next overlooks certain areas due to occlusion. To find particles in each image of the sequence, we start by applying Crocker and Grier’s approach [2] to detect plausible search regions. As this traditional algorithm tends to produce an overwhelming number of false positives with such limited knowledge of the data (as can be seen in Fig. 3), more robust approaches have to follow. In this work, we investigate how adequate and robust Gaussian fitting and Radial Symmetry are for producing the final detection of particles. In the end, we filter out particle aggregates, leaving out particles separated by a Euclidean distance smaller than 20 pixels, and the coordinates of the remaining particles are linked into trajectories. An illustration of the described pipeline can be seen in Fig. 1.

Fig. 1.
figure 1

Particle detection pipeline.

2.1 Pre-processing and Initial Calibration

The contrast of the images is enhanced via gamma adjustment [5]. This is followed by an estimation of the image noise variance [6], assuming that the noise is modeled as additive zero-mean Gaussian. A Gaussian filter [5] with the estimated variance is then applied to reduce noise. To revert the enlarging effect that this filter has on foreground objects, the morphological erosion operation is applied [7], using as kernel a circle whose radius is given by the calculated noise variance.
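
This pre-processing stage can be sketched as follows, assuming SciPy; here Immerkær's fast noise estimator stands in for the estimator of [6], and the mapping of the estimated noise level to the filter width and erosion radius is our reading of the text, not the authors' exact parameterization:

```python
import numpy as np
from scipy import ndimage
from scipy.signal import convolve2d

def estimate_noise_sigma(img):
    # Immerkaer-style noise estimation: convolve with a difference-of-
    # Laplacians mask; for pure Gaussian noise the mean absolute response,
    # scaled as below, converges to the noise standard deviation.
    mask = np.array([[1, -2, 1], [-2, 4, -2], [1, -2, 1]], dtype=float)
    h, w = img.shape
    resp = convolve2d(img, mask, mode="valid")
    return np.sqrt(np.pi / 2.0) * np.abs(resp).sum() / (6.0 * (w - 2) * (h - 2))

def preprocess(img, gamma=1.5):
    # 1) contrast enhancement by gamma adjustment (img assumed in [0, 1])
    img = np.clip(img, 0.0, 1.0) ** gamma
    # 2) estimate the additive zero-mean Gaussian noise and smooth accordingly
    sigma = estimate_noise_sigma(img)
    smoothed = ndimage.gaussian_filter(img, sigma=max(sigma, 0.5))
    # 3) grey-scale erosion with a circular kernel to counter the enlarging
    #    effect of the blur on foreground objects
    r = max(int(round(sigma)), 1)
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    disc = (xx ** 2 + yy ** 2) <= r ** 2
    return ndimage.grey_erosion(smoothed, footprint=disc)
```

The gamma value and the lower bound on the filter width are illustrative choices, not values taken from the paper.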

Concerning calibration, an image of the sequence is used to estimate the radius of the tracer particles, required later in Crocker and Grier’s detection framework, and also to set some of the regions in which the particles are going to be searched for in every frame. To do so, we start by computing the local image gradient as the difference between the local maximum and minimum in a 4-neighborhood. The gradient is computed with a rank filter [8], meaning that the local histogram is used for its calculation. The gradient image is denoised through a rolling-ball background subtraction [9]. A histogram of gradient values over the whole image is then computed, since one wants to keep the edges of the particles, which are ideally represented by similar gradient values. The number of bins used in the histogram is given by the Stone rule, which minimizes a loss function defined by the bin probabilities and the number of bins [13]. Using Otsu’s method [10] to establish a gradient threshold, one can separate the image gradient values into two classes, where the class encompassing the higher gradient values is the most likely to represent the particles and their steep intensity transitions. This method is invariant to the foreground being lighter or darker than the background, but it is highly sensitive to high-frequency noise, which is why denoising is required both before and after this step. Accordingly, applying morphological opening and closing to the thresholded (binary) gradient image, with a kernel that takes the 4-neighborhood of each pixel, removes salt-and-pepper noise. The particles’ contours are then filled in, creating solid particle regions. Through the analysis of a histogram of the region area values, the most common area value is defined as representative of true particles if the eccentricities of the regions associated with these areas are characteristic of circular regions. Regions are considered far from circular, and therefore rejected, if their eccentricity exceeds 0.6 (on a scale from 0 to 1, where 0 is the eccentricity of a perfect circle). An illustration of this pre-processing and initial segmentation pipeline can be seen in Fig. 2.
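
A simplified sketch of this calibration step (local max-min gradient, Otsu thresholding, morphological cleaning, and radius estimation from region areas); the rank-filter implementation, the rolling-ball subtraction, the Stone-rule binning, and the eccentricity filtering are omitted here, and the fixed bin count and the use of the median area are hypothetical stand-ins:

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(values, bins=64):
    # classic Otsu: pick the bin center maximising between-class variance
    hist, edges = np.histogram(values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist.astype(float) / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability
    mu = np.cumsum(p * centers)            # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return centers[np.nanargmax(between)]

def estimate_particle_radius(img):
    # local gradient = local max - local min over the 4-neighborhood
    cross = ndimage.generate_binary_structure(2, 1)
    grad = (ndimage.maximum_filter(img, footprint=cross)
            - ndimage.minimum_filter(img, footprint=cross))
    binary = grad > otsu_threshold(grad.ravel())
    # opening + closing with the 4-neighborhood removes salt-and-pepper noise
    binary = ndimage.binary_closing(ndimage.binary_opening(binary, cross), cross)
    # fill the particle contours into solid regions and measure their areas
    filled = ndimage.binary_fill_holes(binary)
    labels, n = ndimage.label(filled)
    if n == 0:
        return None
    areas = ndimage.sum(filled, labels, index=np.arange(1, n + 1))
    # a representative area (median here) converted to an equivalent radius
    return float(np.sqrt(np.median(areas) / np.pi))
```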

Fig. 2.
figure 2

Illustration of the pre-processing and initial segmentation pipeline. (a) Example image from blood sample video used in [15]. (b) Smoothed image according to the estimated noise variance. (c) Local image gradients. (d) Clustering through histogram analysis. (e) Identified particles.

The found particles allow us to estimate an average radius and to tell whether they are brighter or darker than the background, a condition to which Gaussian fitting is not invariant. Also, as previously stated, the regions where the particles are found can be used as search regions in the following frames, helping in scenarios where Crocker and Grier’s approach does not retrieve adequate results. Even though it could be used for particle detection if followed by sub-pixel refinement, this method is only used for calibration, since it is not expected to detect all the particles in a given frame, but only those highlighted by the gradient operation. Moreover, its computation time of around 7 s per image prohibits its use on whole frame sequences.

2.2 Frame-by-Frame Detection Pipeline

In order to detect tracer particles in each image of a given sequence, we start by using Crocker and Grier’s algorithm. This method allows tuning several parameters in order to successfully detect particles in different microrheology settings on its own. However, the process of finding good parameters is very tedious, especially for practitioners, who are typically not deeply aware of how the algorithm works. Thus, in order to automate the procedure as much as possible for different types of data, we only give as input the previously estimated particle radius and whether or not the background is darker than the particles. In such conditions, Crocker and Grier’s method tends to produce many false positives (as can be seen in Fig. 3), which is not a problem, as we just use its outputs to obtain search areas that will be further processed using either Gaussian fitting or a Radial Symmetry-based approach, together with features of those search areas, such as the integrated brightness, size, and eccentricity of the particle region. These features are used in combination with method-specific ones, either coming from the Gaussian profile that is fitted to the particle or from the radial symmetry of the region, to generate data clusters that separate particles from non-particles.

Gaussian Fitting Through Least-Squares Estimation. The Gaussian fitting method for particle center estimation assumes that the intensity profile of particles can be modeled as a Gaussian distribution. The least-squares estimation minimizes the sum of squares of an error function defined for the fitting of Gaussian parameters to a given search region. This error function is the difference between a Gaussian with a given set of parameters and the image region it is fitted to. The initial parameter estimates of the fitted Gaussian are given by the region’s moments, namely the mean in each dimension (first moment) and the variance (second moment). The fitting process is aware of the particles being darker or lighter than the background, as inferred in the calibration step. To filter false-positive search areas coming from Crocker and Grier’s algorithm, we start by discarding all search areas where the fitting does not converge within 1000 iterations. Next, we cluster the fitted Gaussians through Fuzzy C-means [12], using as features their standard deviation and region features: the integrated brightness, size and eccentricity of the circular region in which the Gaussian is being fitted. A membership degree is assigned to each feature point, being higher for points that are closer to the center of the cluster. The “fuzzifier” coefficient, which defines how fuzzy the clusters will be (how small the membership values can become), is set to 2, a common value when there is little information about the features, and the stopping criteria are an error of 0.005 or a maximum of 1000 iterations. Two data clusters are defined with this method.
Cluster analysis then proceeds in a simple manner: the integrated brightness values in each cluster are compared and, if it was previously seen that the background is darker than the foreground/particle candidates, the cluster that is kept is the one with the higher integrated brightness values, and vice-versa. It must be highlighted that data clustering is only done if the number of detected points is more than twice the number of points in the calibration phase, or if the coefficient of variation of the features (given by the mean of the quotients between the standard deviation and the mean of each feature) exceeds 0.5. An illustration of the Gaussian fitting method as a particle center estimation layer is presented in Fig. 4.
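
The fitting step can be illustrated with SciPy's `least_squares`; the isotropic five-parameter model (amplitude, center, width, background) and the convergence handling are our assumptions about the exact parameterization, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_gaussian_center(region, dark_particles=False):
    """Fit an isotropic 2-D Gaussian (amplitude, center, width, background)
    to a search region; returns (row, col, sigma) or None on no convergence."""
    img = region.astype(float)
    if dark_particles:
        img = img.max() - img          # always fit a bright blob
    rows, cols = np.indices(img.shape)
    total = img.sum()
    # initial estimates from the region's moments (mean and variance)
    r0 = (rows * img).sum() / total
    c0 = (cols * img).sum() / total
    var = (((rows - r0) ** 2 + (cols - c0) ** 2) * img).sum() / total
    s0 = max(np.sqrt(var / 2.0), 0.5)

    def residuals(p):
        a, r, c, s, b = p
        model = a * np.exp(-((rows - r) ** 2 + (cols - c) ** 2)
                           / (2.0 * s ** 2)) + b
        return (model - img).ravel()

    x0 = [img.max() - img.min(), r0, c0, s0, img.min()]
    res = least_squares(residuals, x0, max_nfev=1000)
    if not res.success:
        return None                    # discard non-converging search areas
    _, r, c, s, _ = res.x
    return r, c, abs(s)
```

The fitted standard deviation returned here is the method-specific feature that later enters the Fuzzy C-means clustering.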

Fig. 3.
figure 3

Circled in red, the detections made by Crocker and Grier’s method [2], when only the particle radius and whether the particles are brighter or darker than the background are specified. These detections are made in a frame of a microrheology setup for a blood analogue described in [15]. (Color figure online)

Radial Symmetry Method. The Radial Symmetry-based approach, as described in [4], tries to identify spherical particles and accurately compute their centers by assuming that the intensity representation of a particle is radially symmetric with respect to the particle center. This algorithm estimates the particle center as the point with the maximum value of radial symmetry, working for differently shaped particles, such as concentric rings and Gaussian intensity distributions. To do so, it assumes that, in an image region with perfect radial symmetry, the lines parallel to the intensity gradient point, at any point, towards the origin, which is the center one wants to find. Therefore, the origin that minimizes the weighted sum of squared distances to the lines parallel to the intensity gradient is the center of the particle representation. The mean squared distance between these lines and the center, weighted by intensity, is a measure of goodness of fit. If the coefficient of variation of the set of mean squared distance values of the detected particles exceeds 0.2, clustering is done through the method presented in the previous subsection, but adding the mean squared distance to the feature space instead of the Gaussian’s standard deviation, which is not evaluated in this case. The cluster that is kept is the one containing the lower mean squared distance values, denoting greater radial symmetry of the regions. An illustration of the Radial Symmetry method as a particle center estimation layer is presented in Fig. 4.
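
A minimal sketch of the radial-symmetry center estimate described above: each pixel contributes a line through itself parallel to the local gradient, and the weighted least-squares intersection of those lines reduces to a 2x2 linear system. Weighting by gradient magnitude is a simplification of the weighting used in [4]:

```python
import numpy as np

def radial_symmetry_center(region):
    """Estimate the sub-pixel particle center as the point minimising the
    weighted squared distance to lines drawn through every pixel parallel
    to the local intensity gradient."""
    img = region.astype(float)
    gy, gx = np.gradient(img)              # gradients along rows, columns
    mag = np.hypot(gx, gy)
    keep = mag > 0
    rows, cols = np.indices(img.shape)
    x, y = cols[keep].astype(float), rows[keep].astype(float)
    n = np.hypot(gx[keep], gy[keep])
    a, b = gy[keep] / n, -gx[keep] / n     # unit normal of each gradient line
    w = mag[keep]                          # weight lines by gradient magnitude
    # normal equations of min_{xc,yc} sum w * (a*(xc - x) + b*(yc - y))**2
    Saa = (w * a * a).sum()
    Sab = (w * a * b).sum()
    Sbb = (w * b * b).sum()
    rhs = [(w * a * (a * x + b * y)).sum(),
           (w * b * (a * x + b * y)).sum()]
    xc, yc = np.linalg.solve(np.array([[Saa, Sab], [Sab, Sbb]]), rhs)
    return yc, xc                          # (row, col)
```

Because the system is linear, no iterative optimization is needed, which is what makes this method markedly faster than Gaussian fitting.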

Fig. 4.
figure 4

Illustration of the particle center detection through both approaches. (a) Synthetic 2-particle image generated with Poisson noise of SNR = 10 and fractal noise merged with a weight of 0.1. (b) Centers of all the defined search areas. (c) Detail of the XY-plane projection of the Gaussian that is fitted to the particle on the left. (e) Detail of a search region in which some of the lines parallel to the image gradient are drawn and their maximum convergence point identified. (d), (f) Refined positions of the centers through the Radial Symmetry method and Gaussian fitting, respectively, with the true centers in blue. (Color figure online)

2.3 Linking

After detection, the coordinates of the particles are linked into trajectories, also following an approach by Crocker and Grier [2], which, being adapted to particles undergoing Brownian diffusion, looks for the particles near their most recent locations. The ideal case of particle motion is the one where the particle’s velocity is uncorrelated from one frame to the next, as is characteristic of Brownian diffusion. However, particles can drift together in a directed movement, getting carried away from their initial positions. Calculating and correcting the drift is possible through a simple post-processing routine, in which we average the particle velocities per frame and subtract the displacement caused by this average velocity from the final tracks. A simulation of drifting particles through the synthetic database (detailed in the Experiments and Results section) and further drift correction is presented in Fig. 5.
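
The drift-correction routine described above can be sketched as follows: the per-frame drift is taken as the mean displacement over all particles, and its cumulative sum is subtracted from every trajectory (the array layout is an assumption for illustration):

```python
import numpy as np

def correct_drift(tracks):
    """Remove common drift from trajectories.

    tracks: array of shape (n_particles, n_frames, 2) with (row, col)
    positions. The per-frame drift is the particle-averaged displacement;
    its cumulative sum is subtracted from every trajectory."""
    disp = np.diff(tracks, axis=1)              # frame-to-frame displacements
    drift = disp.mean(axis=0)                   # average over particles
    cum = np.vstack([np.zeros((1, 2)), np.cumsum(drift, axis=0)])
    return tracks - cum[None, :, :]
```

After correction, the particle-averaged trajectory is constant over time, so only the (ideally Brownian) motion about the common path remains.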

Fig. 5.
figure 5

Illustration of drifting particles with Brownian motion, following a [2, 2] pixel drift vector (pointing down and right from the upper left corner) for 7 frames (a), and their trajectories after drift correction (b).

3 Experiments and Results

To validate the proposed methodology, we conducted experiments on an extension of the synthetic database of [4], and also on sequences from the microrheology setups used in [15]. In this section, we start by describing how we extended the database; we then evaluate our detection steps on this synthetic data. Finally, we report the accuracy in terms of the number of detected particles in the real datasets.

3.1 Synthetic Database Extension

Our simulated multi-particle images are generated from an extension of the generator described in [4] (Matlab code available at https://pages.uoregon.edu/raghu/particle_tracking.html), which models single fluorescent particles as point sources at random positions of the image, convolved with a point-spread-function (PSF):

$$\begin{aligned} PSF(r)=\left( \frac{2J_1(v)}{v}\right) ^2 \end{aligned}$$
(1)

that is characteristic of the representation of a particle in fluorescence microscopy, where \(J_1\) is the Bessel function of the first kind, of order 1, \(v=(2\pi NA r)/(\lambda n_w)\), \(\lambda \) is the wavelength of light, NA is the numerical aperture of the objective lens, \(n_w\) is the index of refraction of water and r is the radial coordinate [4]. The simulated particles were created by applying this function with \(\lambda =550\) nm and NA = 0.55. The multi-particle images are complemented with Poisson-distributed noise. For this type of noise, the signal-to-noise ratio is the square root of the number of photons detected at the brightest pixel. Therefore, the pixel intensities are scaled so that the peak intensity equals the square of the signal-to-noise ratio, and a constant background is replaced by pixels with a random intensity drawn from a Poisson distribution whose mean is the expected background intensity [4]. Because this image transformation only creates short-wavelength noise, an additional source of large-wavelength noise was added, to simulate the background effects that exist in inhomogeneous fluids, such as blood (see Fig. 8(a)). This noise layer has a cloud-like 2D texture generated by the superposition of increasingly upsampled portions of a white noise image, which creates a fractal noise pattern [14]. Aside from adding noise to the images, an optional gamma correction [5] is made. Besides mimicking Brownian motion, we also allow a drifting motion that is equally applied to all the particles, through the definition of a motion vector that is added to the previously defined random positions of the particles. The pipeline for image generation is visually detailed in Fig. 6.
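
The particle model of Eq. (1) and the Poisson-noise scaling can be sketched as follows; the physical pixel size and the refractive index of water (1.33) are assumed values not fixed by the text:

```python
import numpy as np
from scipy.special import j1

def airy_psf(shape, center, wavelength_nm=550.0, na=0.55, n_w=1.33,
             pixel_nm=100.0):
    """PSF(r) = (2*J1(v)/v)**2 with v = 2*pi*NA*r / (lambda * n_w).
    pixel_nm (the physical pixel size) is an assumed value."""
    rows, cols = np.indices(shape)
    r = np.hypot(rows - center[0], cols - center[1]) * pixel_nm
    v = 2.0 * np.pi * na * r / (wavelength_nm * n_w)
    out = np.ones(shape, dtype=float)      # limit of 2*J1(v)/v at v=0 is 1
    nz = v > 0
    out[nz] = (2.0 * j1(v[nz]) / v[nz]) ** 2
    return out

def add_poisson_noise(img, snr, background=10.0, rng=None):
    """Scale intensities so the brightest pixel expects snr**2 photons,
    add a constant expected background, and draw Poisson photon counts."""
    if rng is None:
        rng = np.random.default_rng()
    photons = img / img.max() * snr ** 2 + background
    return rng.poisson(photons).astype(float)
```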

Fig. 6.
figure 6

Illustration of the image sequence generation pipeline. (a) Detail of particle generated with the previously detailed properties and applied Poisson noise (SNR = 8). (b) Cloud-like structure to be merged with a weight of 0.2, forming the final image, (c).

3.2 Performance Evaluation on Synthetic Data

We evaluated the detection capabilities of our methodology on this synthetic data. The Gaussian fitting method showed better accuracy and fewer missed particles and false positives. However, the execution time of the method for each 50-particle image from a sequence with SNR = 10 and a cloud-like texture overlapped with a weight of 0.1 is around 0.8 s, while the Radial Symmetry method completes the task in around 0.16 s.

Fig. 7.
figure 7

Accuracy metrics of the detection methods on synthetic data. (a) Euclidean distance to the true center as a function of each center’s true position, for Gaussian fitting (in blue) and the Radial Symmetry method (in red). (b) Total error, calculated as the square root of the mean of the sum of the squared errors in each dimension, as a function of SNR. (c) Total error as a function of the long-wavelength noise weight. (Color figure online)

In Fig. 7(a), one can see a three-dimensional representation of the Euclidean distance to the true center in an image sequence of 100 frames with 10 particles, to which no drift is applied, but only Brownian motion. These images have an SNR of 10 and long-wavelength noise with a merging weight of 0.1. It can be seen from this figure that the maximum Euclidean distance to the true center never exceeds 1.5 pixels, for either method. Also, in Fig. 7(b), the total error for both detection methods is plotted when varying the SNR from 5 to 12, and in (c) when varying the overlapping weight of the fractal noise from 0.1 to 0.4, for 100 images with 10 particles. All plots denote the superior accuracy of Gaussian fitting over the Radial Symmetry method. One must, however, notice that the generated images have particle representations whose PSF forms an intensity profile that can be well approximated by a Gaussian function, ensuring a good fit for the Gaussian fitting method.

3.3 Application to Real Data

To test the algorithms on real passive microrheology data, we used the videos of the study detailed in [15]. These include two videos of blood analogues, in which the tracer particles contrast strongly with the background, but also a video of real blood, in which the background has intensity fluctuations, due to the heterogeneity of its components, and the particles are often occluded and aggregated. The latter is challenging for this type of study (see Fig. 8(a)).

Fig. 8.
figure 8

Representation of the particle detections in a blood image, (a), with applied gamma correction (gamma = 2.5), and a blood analogue image, (b), with no pre-processing applied. Gaussian fitting detections are labeled with red circles and Radial Symmetry method detections are labeled with yellow triangles. (Color figure online)

Since there is no manual annotation of the centroids, but only of the particles’ bounding boxes, the detection accuracy cannot be evaluated in terms of total error. Instead, it can be shown whether, for real case scenarios, the number of particles that were segmented by hand matches the number of particles that could be detected by the automated methods. Therefore, 1 in every 100 frames of the three 5000-frame microrheology videos was annotated and, for the blood analogue images (Fig. 8(b)), 92% of the particles could be identified with the Radial Symmetry method, and 93% with Gaussian fitting. 86% of the particles could be tracked continuously throughout the duration of the video, for both methods. As for the true blood images, 53% of the particles could be identified with the Gaussian fitting method, and 56% of the particles were detected through the Radial Symmetry method. 60% of the detected particles could be tracked throughout the duration of the video, for both methods. A representation of the detected particles in both image cases can be seen in Fig. 8.

Fig. 9.
figure 9

Mean squared displacement as a function of lag time for a 5000-frame blood analogue dataset, analyzed with the Radial Symmetry method and through Gaussian fitting. The MSD acquired for the same particles with the method used in [15], Crocker and Grier’s approach [2] for particle detection, is plotted in gray. The relative deviation between the MSD curves for Gaussian fitting and the Radial Symmetry method is 126%.

After particle tracking, drift correction, and removal of aggregates, we could calculate the mean squared displacement of the particles in each video. Evaluating the mean squared displacement results for the particles in one of the blood analogue fluids detailed in [15], one can see that, even though the Radial Symmetry method has an advantage over Gaussian fitting in terms of computational time, its slight inaccuracy at the sub-pixel level leads to an overestimation of the particles’ movement and, therefore, of their mean squared displacement, as can be seen in Fig. 9. Estimating the microrheology parameters with the Radial Symmetry method would lead to highly biased results.

4 Conclusion and Future Work

We have created a way of automating the process of particle tracking in microrheology images. It was shown that the methods in use for this type of problem still have room for improvement, and that their combination and adaptation to different cases is possible. Nevertheless, it was shown that the Gaussian fitting method is adequate for the microrheology setup, since it detects the particle centroids with enough accuracy not to induce erroneous movement in the track calculations. The same cannot be said about the Radial Symmetry technique, which proved less accurate at the sub-pixel level. Increasing the number of tests on real microrheology scenarios is imperative, since the heterogeneity possible for images in these studies is not sufficiently represented in the real data used, and the synthetic database cannot model all the possible conditions.