
On Ordinal VC-Dimension and Some Notions of Complexity

  • Conference paper
Algorithmic Learning Theory (ALT 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2842)

Abstract

We generalize the classical notion of VC-dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set \(\mathcal{W}\) of structures over some vocabulary, together with a set \(\mathcal{D}\) of first-order formulas that represent data. The sets of models of ϕ in \(\mathcal{W}\), where ϕ ranges over \(\mathcal{D}\), generate a natural topology \(\mathbb{W}\) over \(\mathcal{W}\).
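
For concreteness, here is a minimal sketch of one natural reading of this construction (the notation \(\mathrm{Mod}_{\mathcal{W}}(\varphi)\) is introduced here only for illustration; whether these sets serve as a basis or merely a subbasis is fixed in the paper itself): for every \(\varphi \in \mathcal{D}\), let

\[ \mathrm{Mod}_{\mathcal{W}}(\varphi) \;=\; \{\, \mathfrak{A} \in \mathcal{W} \;:\; \mathfrak{A} \models \varphi \,\}, \]

and let \(\mathbb{W}\) be the topology on \(\mathcal{W}\) generated by the family \(\{\mathrm{Mod}_{\mathcal{W}}(\varphi) : \varphi \in \mathcal{D}\}\) of open sets. When \(\mathcal{D}\) is closed under boolean operators, this family is closed under finite intersections and complements (since \(\mathrm{Mod}_{\mathcal{W}}(\varphi \wedge \psi) = \mathrm{Mod}_{\mathcal{W}}(\varphi) \cap \mathrm{Mod}_{\mathcal{W}}(\psi)\) and \(\mathrm{Mod}_{\mathcal{W}}(\neg\varphi) = \mathcal{W} \setminus \mathrm{Mod}_{\mathcal{W}}(\varphi)\)), so its members are clopen and already form a basis of \(\mathbb{W}\).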

We show that if \(\mathcal{D}\) is closed under boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization of the problem of predicting the truth of the members of \(\mathcal{D}\) in a member of \(\mathcal{W}\), with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that \(\mathcal{D}\) is closed under boolean operators and that \(\mathbb{W}\) is compact often play a crucial role in establishing connections between these concepts.
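
For reference, the classical notion being generalized is the following (the ordinal variant itself is defined in the paper and not reproduced here): a family \(\mathcal{C}\) of subsets of a set \(X\) shatters a finite set \(S \subseteq X\) if \(\{C \cap S : C \in \mathcal{C}\} = 2^{S}\), and

\[ \mathrm{VCdim}(\mathcal{C}) \;=\; \sup \{\, |S| \;:\; S \subseteq X \text{ is finite and } \mathcal{C} \text{ shatters } S \,\}. \]

In the logical setting above, the natural family to instantiate \(\mathcal{C}\) with is the collection of model classes \(\{\mathfrak{A} \in \mathcal{W} : \mathfrak{A} \models \varphi\}\), for \(\varphi \in \mathcal{D}\), viewed as subsets of \(\mathcal{W}\).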

Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Martin, E., Sharma, A., Stephan, F. (2003). On Ordinal VC-Dimension and Some Notions of Complexity. In: Gavaldá, R., Jantke, K.P., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2003. Lecture Notes in Computer Science, vol 2842. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39624-6_7

  • DOI: https://doi.org/10.1007/978-3-540-39624-6_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-20291-2

  • Online ISBN: 978-3-540-39624-6
