
Robust classification of arbitrary object classes based on hierarchical spatial feature-matching

Published in: Machine Vision and Applications

Abstract.

We present a novel approach to the robust classification of arbitrary object classes in complex, natural scenes. Starting from a re-appraisal of Marr's ‘primal sketch’, we develop an algorithm that (1) employs local orientations, rather than the more usual edge locations, as the fundamental picture primitives, (2) retains and exploits the local spatial arrangement of features of different complexity in an image, and (3) is hierarchically arranged so that the level of feature abstraction increases at each processing stage. The resulting technique is simple, based on the accumulation of evidence in binary channels followed by a weighted, non-linear sum of the evidence accumulators. The steps involved in designing a template for recognizing a simple object are explained. The practical application of the algorithm is illustrated with examples taken from a broad range of object classification problems. We discuss the performance of the algorithm and describe a hardware implementation. First successful attempts to train the algorithm automatically are presented. Finally, we compare our algorithm with other object classification algorithms described in the literature.
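To make the evidence-accumulation idea above concrete, here is a minimal sketch in Python. It is not the authors' implementation: the helper names (`local_orientations`, `binary_channels`, `evidence_score`), the choice of eight orientation channels, the gradient threshold, and the logistic combination rule are all assumptions made for illustration.

```python
# Illustrative sketch only: orientation channels, thresholds, and the
# logistic combination below are assumptions, not the paper's method.
import numpy as np

def local_orientations(image, grad_threshold=0.1, n_channels=8):
    """Estimate a quantized local orientation at each pixel.

    Returns an integer channel index in [0, n_channels) where the
    gradient magnitude exceeds grad_threshold, and -1 elsewhere.
    """
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # Orientation of the local gradient, folded into [0, pi).
    theta = np.mod(np.arctan2(gy, gx), np.pi)
    channel = np.floor(theta / np.pi * n_channels).astype(int) % n_channels
    channel[magnitude < grad_threshold] = -1
    return channel, n_channels

def binary_channels(channel_map, n_channels):
    """Split the orientation map into one binary image per channel."""
    return np.stack([channel_map == c for c in range(n_channels)])

def evidence_score(channels, template, weights, steepness=1.0):
    """Accumulate evidence per binary channel, then combine the
    accumulators with a weighted, non-linear (here: logistic) sum.

    `template` is a stack of binary masks, one per orientation channel,
    encoding the expected spatial arrangement of features.
    """
    # Evidence accumulator: fraction of template pixels whose expected
    # orientation is actually present in the image, per channel.
    acc = np.array([
        (ch & tpl).sum() / max(tpl.sum(), 1)
        for ch, tpl in zip(channels, template)
    ])
    # Weighted non-linear sum of the evidence accumulators.
    z = np.dot(weights, acc)
    return 1.0 / (1.0 + np.exp(-steepness * (z - 0.5)))

# Usage: match a small synthetic image against a template derived from
# an example image (here, trivially, the image itself).
rng = np.random.default_rng(0)
img = rng.random((64, 64))
ch_map, n = local_orientations(img)
chans = binary_channels(ch_map, n)
template = chans.copy()
weights = np.ones(n) / n
print(f"match score: {evidence_score(chans, template, weights):.3f}")
```

In this toy version the template is the full image size; the paper's hierarchical arrangement, in which feature abstraction increases at each stage, is not reproduced here.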




Cite this article

Lang, G., Seitz, P. Robust classification of arbitrary object classes based on hierarchical spatial feature-matching. Machine Vision and Applications 10, 123–135 (1997). https://doi.org/10.1007/s001380050065
