Abstract.
A robot navigating in an unstructured environment needs to avoid obstacles in its way and determine free spaces through which it can safely pass. We present here a set of optical-flow-based behaviors that allow a robot moving on a ground plane to perform these tasks. The behaviors operate on a purposive representation of the environment called the “virtual corridor” which is computed as follows: the images captured by a forward-facing camera rigidly attached to the robot are first remapped using a space-variant transformation. Then, optical flow is computed from the remapped image stream. Finally, the virtual corridor is extracted from the optical flow by applying simple but robust statistics. The introduction of a space-variant image preprocessing stage is inspired by biological sensory processing, where the projection and remapping of a sensory input field onto higher-level cortical areas represents a central processing mechanism. Such transformations lead to a significant data reduction, making real-time execution possible. Additionally, they serve to “re-present” the sensory data in terms of ecologically relevant features, thereby simplifying the interpretation by subsequent processing stages. In accordance with these biological principles we have designed a space-variant image transformation, called the polar sector map, which is ideally suited to the navigational task. We have validated our design with simulations in synthetic environments and in experiments with real robots.
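To make the three-stage pipeline described above concrete, the following is a minimal sketch in Python. The ring-and-sector cell geometry, the normal-flow-style estimator standing in for a full optical-flow algorithm, the median-based proximity score, and all parameter values are our assumptions for illustration; the paper's actual polar sector map and statistics are specified in the article body, not in this abstract.

```python
# Illustrative sketch of the pipeline: space-variant remapping ->
# optical flow -> robust corridor extraction. All names and parameters
# here are hypothetical stand-ins, not the authors' implementation.
import numpy as np

def polar_sector_map(img, n_rings=16, n_sectors=32):
    """Remap an image onto ring x sector cells around the image center,
    averaging intensities per cell (a space-variant data reduction)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(y - cy, x - cx)
    theta = np.arctan2(y - cy, x - cx)                      # [-pi, pi]
    r_idx = np.minimum((r / r.max() * n_rings).astype(int), n_rings - 1)
    s_idx = ((theta + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    sums = np.zeros((n_rings, n_sectors))
    counts = np.zeros((n_rings, n_sectors))
    np.add.at(sums, (r_idx, s_idx), img)
    np.add.at(counts, (r_idx, s_idx), 1)
    return sums / np.maximum(counts, 1)

def radial_flow(prev_map, curr_map):
    """Crude per-cell radial flow from brightness constancy
    (v = -I_t / I_r, regularized); a stand-in for a real flow method."""
    dt = curr_map - prev_map
    dr = np.gradient(curr_map, axis=0)                      # along rings
    return -dt * dr / (dr**2 + 1e-6)

def virtual_corridor(flow, n_keep=8):
    """Robust per-sector obstacle proximity: median of per-ring flow
    magnitudes. Low-proximity sectors form the free 'corridor'."""
    proximity = np.median(np.abs(flow), axis=0)             # one per sector
    free = np.sort(np.argsort(proximity)[:n_keep])
    return proximity, free
```

A usage example on synthetic frames, where a one-pixel horizontal shift stands in for camera motion:

```python
rng = np.random.default_rng(0)
frame0 = rng.random((120, 160))
frame1 = np.roll(frame0, 1, axis=1)     # simulated ego-motion
m0, m1 = polar_sector_map(frame0), polar_sector_map(frame1)
prox, free_sectors = virtual_corridor(radial_flow(m0, m1))
```

Averaging many pixels into each ring-sector cell is what yields the data reduction the abstract credits with enabling real-time execution; the median over rings illustrates the kind of simple but robust statistic used to tolerate noisy flow estimates.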
Received: 1 July 1999 / Accepted in revised form: 20 March 2000
Baratoff, G., Toepfer, C. & Neumann, H. Combined space-variant maps for optical-flow-based navigation. Biol Cybern 83, 199–209 (2000). https://doi.org/10.1007/s004220000164