Abstract
We present a Probably Approximately Correct (PAC) learning paradigm for Boolean formulas, which we call PAC meditation, in which the class of formulas to be learnt is not known in advance. Instead, we split the building of the hypothesis into various levels of increasing description complexity, according to additional constraints received at run time. In particular, starting from atomic forms constituted by clauses and monomials learned from the examples at level 0, we provide a procedure for computing hypotheses in the various layers of a polynomial hierarchy, including k-term-DNF formulas at the second level. Assessment of the sample complexity is based on the notion of sentry functions, introduced in a previous paper, which extends naturally to the various levels of the learning procedure. We distinguish between meditations that waste some sample information and those that exploit all information at each description level, and propose a procedure that is free from information waste. The procedure takes only polynomial time if we restrict ourselves to learning an inner and an outer boundary of the target formula in the polynomial hierarchy, while access to an NP-oracle is needed if we want to fix the hypothesis in a proper representation.
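The level-0 step mentioned in the abstract, learning monomials from examples, can be illustrated with the classic PAC algorithm for conjunctions (start from all literals, delete each one falsified by a positive example). This is a minimal sketch of that standard technique, not the paper's own procedure; the function names are illustrative.

```python
def learn_monomial(positive_examples, n):
    """Learn a monomial (conjunction of literals) over n Boolean
    variables that is consistent with the positive examples.
    A literal is encoded as (i, b): (i, True) means x_i,
    (i, False) means NOT x_i."""
    # Start from the conjunction of all 2n literals.
    literals = {(i, b) for i in range(n) for b in (True, False)}
    for x in positive_examples:
        # Delete every literal that this positive example falsifies.
        literals = {(i, b) for (i, b) in literals if x[i] == b}
    return literals

def eval_monomial(literals, x):
    """Evaluate the learned monomial on assignment x (tuple of 0/1)."""
    return all(x[i] == b for (i, b) in literals)

# Usage: positive examples of the target monomial x0 AND NOT x2.
pos = [(1, 0, 0), (1, 1, 0)]
m = learn_monomial(pos, 3)  # -> {(0, True), (2, False)}
```

The hypothesis only shrinks as examples arrive, so it never misclassifies a seen positive example; learning clauses at level 0 is the dual construction on negative examples.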
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Apolloni, B., Baraghini, F., Palmas, G. (2002). PAC Meditation on Boolean Formulas. In: Koenig, S., Holte, R.C. (eds) Abstraction, Reformulation, and Approximation. SARA 2002. Lecture Notes in Computer Science, vol 2371. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45622-8_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43941-7
Online ISBN: 978-3-540-45622-3