
On the Fourier Properties of Discontinuous Motion


Abstract

Retinal image motion, and optical flow as its approximation, are fundamental concepts in vision, both perceptual and computational. However, the computation of optical flow remains a challenging problem, as image motion includes discontinuities and multiple values, mostly due to scene geometry, surface translucency, and various photometric effects such as reflectance. In this contribution, we analyze image motion in frequency space with respect to motion discontinuities and translucency. We derive the frequency structure of motion discontinuities due to occlusion and demonstrate its various geometrical properties. We investigate the aperture problem and show that the information content of an occlusion almost always disambiguates the velocity of an occluding signal suffering from the aperture problem. In addition, the theoretical framework describes the exact frequency structure of Non-Fourier motion, bridging the gap between Non-Fourier visual phenomena and their understanding in the frequency domain.
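
As background for the frequency-space analysis summarized above, the following is the standard spectral identity for a purely translating image signal; it is not the paper's derivation (which concerns occlusion and translucency), and the symbols $I_0$, $\mathbf{v}$, $\mathbf{k}$, $\omega$ are generic notation introduced here. For an image signal translating with constant velocity $\mathbf{v}$,

$$
I(\mathbf{x}, t) = I_0(\mathbf{x} - \mathbf{v}t)
\quad\Longrightarrow\quad
\hat{I}(\mathbf{k}, \omega) \;\propto\; \hat{I}_0(\mathbf{k})\, \delta(\omega + \mathbf{k}\cdot\mathbf{v}),
$$

so the spatiotemporal spectrum is confined to the plane $\omega + \mathbf{k}\cdot\mathbf{v} = 0$ through the origin of frequency space (the constant of proportionality depends on the Fourier transform convention). Occlusion boundaries and translucent overlays violate the single-translation assumption and spread energy off this plane; characterizing that additional structure is the subject of the article.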


Cite this article

Beauchemin, S.S., Barron, J.L. On the Fourier Properties of Discontinuous Motion. Journal of Mathematical Imaging and Vision 13, 155–172 (2000). https://doi.org/10.1023/A:1011220130307
