
VC Dimension


Motivation and Background

We define an important combinatorial parameter that measures the complexity of a family of subsets taken from a given universe (learning domain) X. This parameter was originally defined by Vapnik and Chervonenkis (1971) and is thus commonly referred to as the Vapnik–Chervonenkis dimension, abbreviated as VC dimension. Subsequently, Dudley (1978, 1979) generalized the results of Vapnik and Chervonenkis (1971). The reader is also referred to Vapnik’s (2000) book, in which he greatly extends the original ideas, resulting in the theory known as structural risk minimization.
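To make the parameter concrete, here is a minimal sketch (not taken from the entry itself): a family of subsets of X shatters a finite set S ⊆ X if every subset of S arises as C ∩ S for some member C of the family, and the VC dimension of the family is the cardinality of a largest shattered set (infinite if arbitrarily large sets are shattered). The Python snippet below computes this by brute force over a tiny finite universe; the function names and the example class of initial segments are chosen here purely for illustration.

    from itertools import combinations

    def all_subsets(sample):
        # Every subset of the finite collection `sample`, as frozensets.
        items = list(sample)
        return [frozenset(c)
                for r in range(len(items) + 1)
                for c in combinations(items, r)]

    def shatters(concepts, sample):
        # The family `concepts` shatters `sample` iff every subset of
        # `sample` is obtained as C ∩ sample for some concept C.
        traces = {frozenset(c) & frozenset(sample) for c in concepts}
        return all(t in traces for t in all_subsets(sample))

    def vc_dimension(concepts, universe):
        # Brute-force VC dimension over a finite universe: the largest d
        # such that some d-element subset of `universe` is shattered.
        for d in range(len(universe), 0, -1):
            if any(shatters(concepts, s) for s in combinations(universe, d)):
                return d
        return 0  # the empty set is always shattered

    # Example: the initial segments {0, ..., k-1} over the universe {0, 1, 2, 3}
    # shatter every singleton but no two-element set, so their VC dimension is 1.
    universe = [0, 1, 2, 3]
    concepts = [set(range(k)) for k in range(len(universe) + 1)]
    print(vc_dimension(concepts, universe))  # prints 1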

The importance of the VC dimension for PAC Learning was discovered by Blumer, Ehrenfeucht, Haussler, & Warmuth (1989), who introduced the notion to computational learning theory.

As Anthony and Biggs (1992, p. 71) have put it, “The development of this notion is probably the most significant contribution that mathematics has made to Computational Learning Theory.”

Recall that we use |S...


Recommended Reading

  • Anthony, M., & Bartlett, P. L. (1999). Neural network learning: Theoretical foundations. Cambridge: Cambridge University Press.

  • Anthony, M., & Biggs, N. (1992). Computational learning theory. Cambridge tracts in theoretical computer science (No. 30). Cambridge: Cambridge University Press.

  • Blumer, A., Ehrenfeucht, A., Haussler, D., & Warmuth, M. K. (1989). Learnability and the Vapnik–Chervonenkis dimension. Journal of the ACM, 36(4), 929–965.

  • Dudley, R. M. (1978). Central limit theorems for empirical measures. Annals of Probability, 6(6), 899–929.

  • Dudley, R. M. (1979). Corrections to “Central limit theorems for empirical measures”. Annals of Probability, 7(5), 909–911.

  • Goldberg, P. W., & Jerrum, M. R. (1995). Bounding the Vapnik–Chervonenkis dimension of concept classes parameterized by real numbers. Machine Learning, 18(2–3), 131–148.

  • Gurvits, L. (1997). Linear algebraic proofs of VC-dimension based inequalities. In S. Ben-David (Ed.), Computational learning theory, third European conference, EuroCOLT ’97, Jerusalem, Israel, March 1997, Proceedings, Lecture notes in artificial intelligence (Vol. 1208, pp. 238–250). Springer.

  • Haussler, D., & Welzl, E. (1987). Epsilon-nets and simplex range queries. Discrete & Computational Geometry, 2, 127–151.

  • Haussler, D., Littlestone, N., & Warmuth, M. K. (1994). Predicting {0, 1}-functions on randomly drawn points. Information and Computation, 115(2), 248–292.

  • Karpinski, M., & Macintyre, A. (1995). Polynomial bounds for VC dimension of sigmoidal neural networks. In Proceedings of the twenty-seventh annual ACM symposium on theory of computing (pp. 200–208). New York: ACM Press.

  • Karpinski, M., & Werther, T. (1994). VC dimension and sampling complexity of learning sparse polynomials and rational functions. In S. J. Hanson, G. A. Drastal, & R. L. Rivest (Eds.), Computational learning theory and natural learning systems, Vol. I: Constraints and prospects (Chap. 11, pp. 331–354). Cambridge, MA: MIT Press.

  • Kearns, M. J., & Vazirani, U. V. (1994). An introduction to computational learning theory. Cambridge, MA: MIT Press.

  • Littlestone, N. (1988). Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning, 2(4), 285–318.

  • Maass, W., & Turán, G. (1990). On the complexity of learning from counterexamples and membership queries. In Proceedings of the thirty-first annual symposium on Foundations of Computer Science (FOCS 1990), St. Louis, Missouri, October 22–24, 1990 (pp. 203–210). Los Alamitos, CA: IEEE Computer Society Press.

  • Mitchell, A., Scheffer, T., Sharma, A., & Stephan, F. (1999). The VC-dimension of subclasses of pattern languages. In O. Watanabe & T. Yokomori (Eds.), Algorithmic learning theory, tenth international conference, ALT’99, Tokyo, Japan, December 1999, Proceedings, Lecture notes in artificial intelligence (Vol. 1720, pp. 93–105). Springer.

  • Natschläger, T., & Schmitt, M. (1996). Exact VC-dimension of Boolean monomials. Information Processing Letters, 59(1), 19–20.

  • Sakurai, A. (1995). On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions. Theoretical Computer Science, 137(1), 109–127 (Special issue for ALT ’93).

  • Sauer, N. (1972). On the density of families of sets. Journal of Combinatorial Theory (A), 13(1), 145–147.

  • Schaefer, M. (1999). Deciding the Vapnik–Červonenkis dimension is Σ₃ᵖ-complete. Journal of Computer and System Sciences, 58(1), 177–182.

  • Vapnik, V. N. (2000). The nature of statistical learning theory (2nd ed.). Berlin: Springer.

  • Vapnik, V. N., & Chervonenkis, A. Y. (1971). On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and Its Applications, 16(2), 264–280.

  • Vapnik, V. N., & Chervonenkis, A. Y. (1974). Theory of pattern recognition. Moscow: Nauka (in Russian).

  • Wenocur, R. S., & Dudley, R. M. (1981). Some special Vapnik–Chervonenkis classes. Discrete Mathematics, 33, 313–318.

Copyright information

© 2011 Springer Science+Business Media, LLC

Cite this entry

Zeugmann, T. (2011). VC Dimension. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_875
