Abstract
We generalize the classical notion of VC-dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set \(\mathcal{W}\) of structures over some vocabulary, and a set \(\mathcal{D}\) of first-order formulas that represent data. The sets of models of \(\varphi\) in \(\mathcal{W}\), where \(\varphi\) varies over \(\mathcal{D}\), generate a natural topology \(\mathbb{W}\) over \(\mathcal{W}\).
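To fix notation (a sketch; the shorthand \(\mathrm{Mod}_{\mathcal{W}}(\varphi)\) and the structure variable \(\mathfrak{A}\) are introduced here for illustration and are not taken from the paper), the generating sets of this topology can be written as
\[ \mathrm{Mod}_{\mathcal{W}}(\varphi) \;=\; \{\, \mathfrak{A} \in \mathcal{W} \;:\; \mathfrak{A} \models \varphi \,\}, \qquad \varphi \in \mathcal{D}, \]
with \(\mathbb{W}\) the topology on \(\mathcal{W}\) generated by the collection \(\{\mathrm{Mod}_{\mathcal{W}}(\varphi) : \varphi \in \mathcal{D}\}\). When \(\mathcal{D}\) is closed under Boolean operators, each such set is clopen, since its complement \(\mathcal{W} \setminus \mathrm{Mod}_{\mathcal{W}}(\varphi) = \mathrm{Mod}_{\mathcal{W}}(\neg\varphi)\) is also open.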
We show that if \(\mathcal{D}\) is closed under Boolean operators, then the notion of ordinal VC-dimension exactly characterizes the problem of predicting the truth of the members of \(\mathcal{D}\) in a member of \(\mathcal{W}\), with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that \(\mathcal{D}\) is closed under Boolean operators and that \(\mathbb{W}\) is compact often play a crucial role in establishing connections between these concepts.
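For comparison, the classical notion being generalized can be stated as follows (a standard definition, phrased for an arbitrary class \(\mathcal{C}\) of subsets of a set \(X\); in the logical setting one would take the model sets \(\mathrm{Mod}_{\mathcal{W}}(\varphi)\) above): \(\mathcal{C}\) shatters a finite set \(S \subseteq X\) if
\[ \{\, C \cap S \;:\; C \in \mathcal{C} \,\} \;=\; 2^{S}, \qquad \text{and} \qquad \mathrm{VCdim}(\mathcal{C}) \;=\; \sup \{\, |S| \;:\; \mathcal{C} \text{ shatters } S \,\}. \]
The classical dimension is thus a natural number (or infinite); the ordinal variant studied here replaces it with an ordinal-valued measure, matching the ordinal mistake bounds of the prediction problem.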
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Martin, E., Sharma, A., Stephan, F. (2003). On Ordinal VC-Dimension and Some Notions of Complexity. In: Gavaldá, R., Jantke, K.P., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2003. Lecture Notes in Computer Science, vol 2842. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39624-6_7
DOI: https://doi.org/10.1007/978-3-540-39624-6_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-20291-2
Online ISBN: 978-3-540-39624-6