
A neural paradigm for time-varying motion segmentation

  • Regular Papers
  • Published in: Journal of Computer Science and Technology

Abstract

This paper proposes a new neural algorithm that segments an observed scene into regions corresponding to different moving objects by analyzing a time-varying image sequence. The method consists of a classification step, in which the motion of small patches is characterized through an optimization approach, and a segmentation step, which merges neighboring patches characterized by the same motion. Motion is classified without computing optical flow: only the spatial and temporal image gradients enter an appropriate energy function, which is minimized with a Hopfield-like neural network whose output directly gives the 3D motion parameter estimates. Network convergence is accelerated by integrating the quantitative estimation of motion parameters with a qualitative estimate of the dominant motion obtained from the geometric theory of differential equations.
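
The exact energy function and 3D motion parameterization appear only in the full text; as a rough illustration of the two-step scheme summarized above, the Python sketch below estimates a per-patch motion directly from the spatial gradients Ix, Iy and the temporal gradient It by gradient descent on a brightness-constancy energy (a simple 2D translational model standing in for the paper's 3D parameters, and plain gradient flow standing in for the Hopfield-like dynamics), then merges 4-connected neighboring patches with similar estimates. The function names, the translational model, and the similarity threshold are illustrative assumptions, not the authors' implementation.

import numpy as np

def patch_motion_energy_descent(Ix, Iy, It, lr=0.1, steps=200):
    # Sketch only: minimize E(u, v) = sum((Ix*u + Iy*v + It)**2) over one patch
    # by gradient flow, a stand-in for the paper's Hopfield-like minimization
    # over 3D motion parameters.  Assumes intensities normalized to [0, 1].
    u = v = 0.0
    for _ in range(steps):
        r = Ix * u + Iy * v + It          # residual of the gradient constraint
        u -= lr * 2.0 * np.mean(r * Ix)   # dE/du, averaged over the patch
        v -= lr * 2.0 * np.mean(r * Iy)   # dE/dv
    return u, v

def segment_by_motion(params, grid_shape, tol=0.25):
    # Merge 4-connected patches whose motion estimates differ by less than the
    # (illustrative) threshold `tol`; returns one integer label per patch.
    H, W = grid_shape
    labels = -np.ones(H * W, dtype=int)
    next_label = 0
    for start in range(H * W):
        if labels[start] >= 0:
            continue
        labels[start] = next_label
        stack = [start]
        while stack:                      # flood fill over similar-motion patches
            i = stack.pop()
            r, c = divmod(i, W)
            for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= rr < H and 0 <= cc < W:
                    j = rr * W + cc
                    if labels[j] < 0 and np.linalg.norm(params[i] - params[j]) < tol:
                        labels[j] = next_label
                        stack.append(j)
        next_label += 1
    return labels.reshape(H, W)

Here params would be an (H*W, 2) array holding the (u, v) estimate of each patch on an H-by-W grid of patches; in the paper the merged quantity is instead the estimated 3D motion parameter vector produced by the network.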




Additional information

This paper was supported by the National Natural Science Foundation of China under Grant Nos. 69585002 and 69785003.

YANG Jingan is presently a Professor of Computer Science, supervisor of Ph.D. candidates, and Deputy Director of the School of Computer & Information Sciences. He is a Fellow of the New York Academy of Sciences, a Senior Member of the IEEE and of the IEEE Computer Society, Vice-Director of the Anhui Province Computer Federation, and a National Distinguished Expert of China. He studied at the University of Science & Technology of China for two years after graduating from Hefei University of Technology in 1969. He was awarded the title of 1998 Anhui Province Distinguished Teacher. He is currently engaged in teaching and scientific research on computer vision, artificial intelligence and robotics, knowledge engineering, multimedia computer techniques, and virtual reality. He has authored more than 140 papers and published 2 books.


About this article

Cite this article

Yang, J. A neural paradigm for time-varying motion segmentation. J. Comput. Sci. & Technol. 14, 539–550 (1999). https://doi.org/10.1007/BF02951873
