Abstract
We present the study of a data-driven motion synthesis approach based on a 1D affine image-matching equation. We start by deriving the relevant properties of the exact matching operator, such as the existence of a singular point. Next, we approximate such operator by the Green’s function of a second-order differential equation, finding that it leads to a more compelling motion impression, due to the incorporation of blur. We then proceed to show that, by judicious choice of the matching parameters, the 1D affine Green’s filter allows the simulation of a broad class of effects, such as zoom-in and zoom-out, and of complex nonrigid motions such as that of a pulsating heart.
References
Shinya M, Fournier A (1992) Stochastic motion – motion under the influence of wind. Comput Graph Forum 11(3):119–128
Oziem D, Campbell N, Dalton C, Gibson D, Thomas B (2004) Combining sampling and autoregression for motion synthesis. Proc. of the Computer Graphics International Conference, pp. 510–513
Foster N, Metaxas D (1996) Realistic animation of liquids. CVGIP 58(5):471–483
Freeman W, Adelson E (1991) The design and use of steerable filters. IEEE Trans PAMI 13(9):891–906
Freeman W, Adelson E, Heeger D (1991) Motion without movement. Comput Graph 25(4):27–30
Brostow GJ, Essa I (2001) Image-based motion blur for stop motion animation. Proc. of the 28th annual conference on Computer Graphics and Interactive Techniques, pp. 561–566
Glassner A (1999) An open and shut case computer graphics. IEEE Comput Graph Appl 19:82–92
Potmesil M, Chakravarty I (1983) Modeling motion blur in computer generated images. Comput Graph 17:389–399
Max NL, Lerner DM (1985) A two-and-a-half-D motion blur algorithm. Comput Graph 19:85–93
Horn B, Schunck B (1981) Determining optical flow. Artif Intell 17:185–203
Lucas BD, Kanade T (1981) An iterative image registration technique with an application to stereo vision. In Proc Seventh IJCAI, Vancouver, pp. 674–679
Black M, Anandan P (1996) The robust estimation of multiple motions: parametric and piecewise-smooth flow fields. Comput Vis Image Underst 63(1):75–104
Torreão JRA (2001) A Green’s function approach to shape from shading. Pattern Recognit 34:2367–2382
Torreão JRA (2003) Geometric-photometric approach to monocular shape estimation. Image Vis Comput 21:1045–1061
Rav-Acha A, Peleg S (2000) Restoration of multiple images with motion blur in different directions. Workshop on applications of computer vision, Palm Springs, pp. 22–28
Martinsen T, Quintero FM, Skarda D (1996) Plug-in to the GIMP (Open Source Code version 1.22)
GIMP (2001) The GIMP Team, web site available at http://www.gimp.org
Zill DG, Cullen MR (1993) Differential equations with boundary-value problems. PWS Publishing Company
Evans LC (1997) Partial differential equations. American Mathematical Society
Verri A, Girosi F, Torre V (1989) Mathematical properties of the 2D motion field: from singular points to motion parameters. J Opt Soc Am A 6(5):698–712
Corpetti T, Mémin E, Pérez P (2003) Extraction of singular points from dense motion fields: an analytic approach. J Math Imaging Vis 19(3):175–198
Ford R, Strickland R (1995) Representing and visualizing fluid flow images and velocimetry data by nonlinear dynamical systems. CVGIP: Graph Models Image Process 57(6):462–482
Nogawa H, Nakajima Y, Sato Y (1997) Acquisition of symbolic description from flow fields: a new approach based on a fluid model. IEEE Trans Pattern Anal Mach Intell 19(1):58–63
Wohn K, Waxman A (1990) The analytic structure of image flows: deformation and segmentation. Comput Vis Graph Image Process 2(49):127–151
Maurizot M, Bouthemy P, Delyon B, Juditski A, Odobez J (1995) Determination of singular points in 2D deformable flow fields. Proc 2nd IEEE Int Conf Image Process 3:488–491
Foley J, Van Dam A, Feiner S, Hughes J (1990) Computer graphics: principles and practice in C, 2nd Edition, Addison-Wesley systems programming series
Rekleitis IM (1995) Visual motion estimation based on motion blur Interpretation, M. Sc. Thesis of Computer Science, School of Computer Science, McGill University, Montreal
Rekleitis IM (1996) Steerable filters and cepstral analysis for optical flow calculation from a single blurred image. Vision Interface, pp. 159–166
Rekleitis IM (1996) Optical flow recognition from the power spectrum of a single blurred image. Proc of IEEE International Conference on Image Processing
Ferreira Jr PE, Torreão JRA, Carvalho PCP (2004) Data-based motion simulation through a Green’s function approach. Proc of XVII SIBGRAPI, pp. 193–199
Jähne B, Haußecker H, Geißler P (1999) Handbook of computer vision and applications, Vol 2, Academic, London
Ferreira Jr PE, Torreão JRA, Carvalho PCP, Velho L (2005) Video interpolation through Green’s functions of matching equations. Proc of IEEE International Conference on Image Processing
Beylkin G (1993) Chapter in the book Wavelets: mathematics and applications, CRC Press
Black M (1996) Area-based optical flow: robust affine regression, Software available on-line at http://www.cs.brown.edu/people/black/
Acknowledgments
The research reported here has been partially developed at IMPA’s VISGRAF Laboratory, with the sponsorship of CAPES, and at the Department of Computer Science at UFBA, with the sponsorship of FAPESB. The first author would like to thank Professor Augusto C. P. L. da Costa, and the secretary Dilson Anunciação, for their support of his work at UFBA. J.R.A. Torreão acknowledges a grant from CNPq-Brasil. The authors would also like to thank Professor Michael Black for allowing the use of his affine optical flow code [34].
Appendix: Numerical validation of the experiments
As a means of validating the experiments in Section 4, we have used motion-estimation software—kindly provided to us by Professor Michael Black—which is based on affine regression [34]. In it, the 2D affine model is expressed as

$$\begin{pmatrix}\tilde u \\ \tilde v\end{pmatrix} = \begin{pmatrix}\tilde u_0 \\ \tilde v_0\end{pmatrix} + \begin{pmatrix}\tilde u_1 & \tilde u_2 \\ \tilde v_1 & \tilde v_2\end{pmatrix}\begin{pmatrix}x - c_x \\ y - c_y\end{pmatrix},$$

where \((c_x, c_y)^T\) denotes the coordinates of the central image point. Comparing the above with Eq. (16), we find the corresponding relations.
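As a rough illustration, the 2D affine flow model above can be evaluated over an image grid as sketched below (a minimal sketch in Python with NumPy; the function name, parameter ordering, and default central point are our own choices, not part of the original software):

```python
import numpy as np

def affine_flow(shape, p, c=None):
    """Evaluate the 2D affine motion model
    (u, v) = (u0, v0) + A @ (x - cx, y - cy)
    over an image grid. p = (u0, u1, u2, v0, v1, v2)."""
    h, w = shape
    if c is None:
        c = (w / 2.0, h / 2.0)            # central image point (cx, cy)
    u0, u1, u2, v0, v1, v2 = p
    y, x = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = x - c[0], y - c[1]
    u = u0 + u1 * dx + u2 * dy
    v = v0 + v1 * dx + v2 * dy
    return u, v

# Separable model: u2 = v1 = 0, so u depends only on x and v only on y.
u, v = affine_flow((128, 128), (2.0, -0.031, 0.0, 0.0, 0.0, 0.0))
```

In the separable case, which is the one considered in the experiments, the flow reduces to two independent 1D affine fields along the image axes.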
Taking these relations into account, we have employed Michael Black's program to estimate the affine motion components, in order to compare them with the input parameters of the Green's filter. In each sequence considered, the input image and a synthesized one have been used for this purpose. It should be noted that, since we have restricted ourselves here to a separable 2D affine model, we expect \(\tilde u_2 \approx \tilde v_1 \approx 0\) in all the experiments. Below, we present validation results only for a subset of the more complex simulated sequences, namely those illustrated in Figs. 12–16.
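The consistency checks reported in the tables below amount to a parameter-by-parameter comparison. A minimal sketch of such a check (the helper name and the tolerance are our own; the paper does not state a numerical threshold):

```python
def validate(estimated, inputs, tol=0.1):
    """Compare estimated affine parameters against the Green's-filter
    inputs; a pair is deemed 'consistent' if every absolute
    discrepancy is within tol (a threshold of our choosing)."""
    return all(abs(e - i) <= tol for e, i in zip(estimated, inputs))

# e.g., a separable model where u2 and v1 should come out near zero
ok = validate([1.95, -0.030, 0.01], [2.0, -0.031, 0.0])
```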
Zoom-out. Table 1 presents the optical flow parameters yielded by [34], along with those used as input to the Green’s filter. The third frame in Fig. 13 has been used.
We see that a very good correspondence is obtained in this case, for all the parameters.
Zoom-in. Again, the estimated and input parameters are very consistent, as shown by Table 2. The second frame in Fig. 14 has been used.
Funny eye. The second frame in Fig. 16 has been used. Table 3 shows the estimated and input parameters. Again, the correspondence is fairly good.
Next, we discuss two examples where the validation through Michael Black’s program has not been possible, those of the pulsating heart and the deforming ball simulations.
Pulsating heart. Table 4 shows the input parameters and those estimated from the third frame of Fig. 12. The data are inconsistent.
A similar situation occurs with the deforming ball, as shown below:
Deforming ball. Table 5 shows the input parameters and those estimated from the second frame of Fig. 15. Again, the data are not consistent, although the errors are somewhat smaller than in the pulsating heart experiment.
We conjecture that the problem, in the above simulations, may arise from the fact that, in both cases, the input images consist of a central object superposed over a dark background, which could induce errors in the estimation process. To test this hypothesis, we performed an additional experiment based on an image where such a clear-cut figure/background segmentation is not present. For this purpose, we chose the input image of the zoom experiments, applying to it a Green's filter with parameters \((u_0, u_1, x_U) = (2, -0.031, 64)\), in order to simulate only horizontal motion, as in the deforming ball and pulsating heart examples. The generated pair appears in Fig. 17.
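If we read \(x_U\) as the singular point of the 1D affine flow—i.e., the point where \(u(x) = u_0 + u_1 x\) vanishes (this reading is an assumption on our part, though it is consistent with the values above)—the chosen parameters can be sanity-checked directly:

```python
# Hypothetical reading: x_U marks the singular point of the 1D affine
# flow u(x) = u0 + u1*x, i.e., the point where the flow vanishes.
u0, u1, x_U = 2.0, -0.031, 64.0
x_singular = -u0 / u1          # point where u0 + u1*x = 0 (about 64.5)
residual = u0 + u1 * x_U       # flow at x_U; expected to be near zero
```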
Table 6, below, shows the estimated and input parameters, which, in this case, prove fairly consistent.
From the foregoing discussion, we may conclude that, except in the case of images with the characteristics of Figs. 12 and 15—i.e., with a sharp figure/background separation—the Green’s function simulations can be numerically validated by the motion estimation algorithm of [34].
Cite this article
Ferreira, P.E., Torreão, J.R.A., Carvalho, P.C.P. et al. Motion synthesis through 1D affine matching. Pattern Anal Applic 11, 45–58 (2008). https://doi.org/10.1007/s10044-007-0078-6