Abstract
In this work we consider planar passive navigation, an application context in which the visual control of locomotion requires only the direction of translation rather than the full set of motion parameters. When the temporally changing optic array is represented as a vector field of optical velocities, the vectors of a purely translational field form a radial pattern emanating from a single point, the Focus of Expansion (FOE), which marks the heading direction. The image position of the FOE is independent of the distances of world surfaces and requires no assumptions about surface shape or smoothness. We investigate the performance of an artificial neural network that computes the image position of the FOE of an Optical Flow (OF) field induced by observer translation relative to a static environment. The network has a feed-forward architecture and is trained with the standard supervised back-propagation algorithm; its input is the pattern of points obtained by projecting, via the Hough transform, the lines generated by the 2D flow vectors. We present results obtained on a test set of synthetic noisy optical flows and on optical flows computed from real image sequences.
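As background for the geometric idea the abstract relies on: under pure translation, every optical-flow vector lies on a line through the FOE, so the FOE can be recovered by letting each vector vote, Hough-style, for the candidate image positions its line passes near. The sketch below illustrates only that voting step, not the authors' neural network; the grid extent, resolution, and distance tolerance are arbitrary illustrative choices.

```python
import math

def hough_foe(flows, grid_size=41, extent=2.0, tol=0.05):
    """Estimate the FOE by Hough-style voting over a grid of candidates.

    `flows` is a list of (x, y, u, v): a flow vector (u, v) measured at
    image point (x, y). For a purely translational field, the line through
    (x, y) with direction (u, v) passes through the FOE, so the grid cell
    crossed by the most lines is the FOE estimate.
    """
    step = 2 * extent / (grid_size - 1)
    best, best_votes = (0.0, 0.0), -1
    for i in range(grid_size):
        for j in range(grid_size):
            cx = -extent + i * step
            cy = -extent + j * step
            votes = 0
            for x, y, u, v in flows:
                norm = math.hypot(u, v)
                if norm == 0.0:
                    continue
                # Perpendicular distance from the candidate (cx, cy)
                # to the line through (x, y) with direction (u, v).
                if abs((cx - x) * v - (cy - y) * u) / norm < tol:
                    votes += 1
            if votes > best_votes:
                best_votes, best = votes, (cx, cy)
    return best

# Synthetic, noise-free translational flow diverging from a known FOE.
true_foe = (0.5, -0.3)
points = [(math.cos(k), math.sin(k)) for k in range(12)]
flows = [(x, y, x - true_foe[0], y - true_foe[1]) for x, y in points]
fx, fy = hough_foe(flows)
```

On synthetic noise-free input, the winning cell coincides with the grid point nearest the true FOE; with noisy vectors the accumulator peak spreads, which is the situation the paper's network is trained to handle.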
Cite this article
Branca, A., Stella, E., Attolico, G. et al. Focus of Expansion estimation by an error backpropagation neural network. Neural Comput & Applic 6, 142–147 (1997). https://doi.org/10.1007/BF01413825