
The effective VC dimension of the n-tuple classifier

  • Part III: Learning: Theory and Algorithms
  • Conference paper

In: Artificial Neural Networks — ICANN'97 (ICANN 1997), Lecture Notes in Computer Science, vol. 1327.

Abstract

One family of classifiers which has enjoyed considerable experimental success over the last thirty years is that of the n-tuple classifier and its descendants. However, the theoretical basis for such classifiers remains uncertain, despite periodic attempts to place them in a statistical framework. In particular, the most commonly used training algorithms do not even attempt to minimise recognition error on the training set. In this paper the tools of statistical learning theory are applied to the classifier in an attempt to account for its effectiveness. In particular, the effective VC dimension of the classifier is measured experimentally for various input distributions, and these results are used as the basis for a discussion of the behaviour of the n-tuple classifier. As a side issue, an error-minimising training algorithm for the n-tuple classifier is also proposed and briefly examined.
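To make the object of study concrete, the following is a minimal sketch of a RAM-based n-tuple classifier of the kind the abstract refers to. It illustrates the standard (non-error-minimising) training rule: each class owns a bank of RAM nodes, each node samples a fixed random n-tuple of input bit positions, and training simply records which address patterns occur, with no attempt to minimise training error. The class and parameter names are hypothetical, not taken from the paper.

```python
import random


class NTupleClassifier:
    """Sketch of a RAM-based n-tuple classifier.

    Each class label owns one RAM node per n-tuple; a node stores the set
    of address patterns (bit combinations) seen for its tuple in training.
    """

    def __init__(self, input_bits, n=4, num_tuples=4, seed=0):
        rng = random.Random(seed)
        positions = list(range(input_bits))
        rng.shuffle(positions)
        # Partition randomly ordered bit positions into fixed n-tuples.
        self.tuples = [positions[i:i + n]
                       for i in range(0, num_tuples * n, n)]
        self.memory = {}  # label -> list of sets of seen addresses

    def _addresses(self, x):
        # x is a sequence of 0/1 bits; each tuple reads off an address.
        return [tuple(x[p] for p in t) for t in self.tuples]

    def train(self, x, label):
        # Standard training rule: just record the addresses that occur.
        rams = self.memory.setdefault(label, [set() for _ in self.tuples])
        for ram, addr in zip(rams, self._addresses(x)):
            ram.add(addr)

    def classify(self, x):
        # Score each class by how many of its RAM nodes recognise
        # the input's address; return the highest-scoring class.
        addrs = self._addresses(x)
        scores = {label: sum(addr in ram
                             for ram, addr in zip(rams, addrs))
                  for label, rams in self.memory.items()}
        return max(scores, key=scores.get)
```

Note that `train` never checks whether stored patterns cause misclassification, which is exactly the point the abstract makes: the usual algorithm does not minimise training-set error, so its generalisation behaviour calls for a separate explanation such as the effective VC dimension measured in the paper.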




Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bradshaw, N.P. (1997). The effective VC dimension of the n-tuple classifier. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020206


  • DOI: https://doi.org/10.1007/BFb0020206

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9

