Abstract
We describe a stochastic algorithm for learning Boolean functions from positive and negative examples. The Boolean functions are represented by disjunctive normal form (DNF) formulas. Given a target DNF F depending on n variables and a set of uniformly distributed positive and negative examples, our algorithm computes a hypothesis H that rejects a given fraction of the negative examples and has an ε-bounded error on the positive examples. The stochastic algorithm utilises logarithmic cooling schedules for inhomogeneous Markov chains. The paper focuses on experimental results and on comparisons with a previous approach in which all negative examples have to be rejected [4]. The computational experiments provide evidence that a relatively high percentage of correct classifications on additionally presented examples can be achieved, even when misclassifications are allowed on negative examples. A detailed convergence analysis will be presented in a forthcoming paper [3].
Research partially supported by the AIF Research Programme under Grant No. FKV 0352401N7. Part of the research was done while the first author was visiting the CUHK within the SRP under Grant No. 9505.
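The general scheme the abstract describes can be illustrated by a minimal sketch: simulated annealing over DNF hypotheses with a logarithmic cooling schedule, where the energy counts misclassified examples. This is not the authors' implementation; the target formula, neighbourhood moves, the constant gamma, and the uniform penalty on both classes are illustrative assumptions.

```python
import math
import random

random.seed(0)
N = 6  # number of Boolean variables (illustrative choice)

# A monomial is a dict {variable index: required value}; a DNF is a list of monomials.
def dnf_eval(dnf, x):
    return any(all(x[v] == b for v, b in m.items()) for m in dnf)

# Hypothetical target DNF and uniformly drawn labelled examples.
target = [{0: 1, 1: 1}, {2: 1, 3: 0}]
examples = [tuple(random.randint(0, 1) for _ in range(N)) for _ in range(200)]
labels = [dnf_eval(target, x) for x in examples]

def energy(h):
    # Number of misclassified examples; equal weighting of the two classes is an assumption.
    return sum(dnf_eval(h, x) != y for x, y in zip(examples, labels))

def neighbour(h):
    # Elementary transition: drop one literal from a monomial, or (re)set one literal.
    h2 = [dict(m) for m in h]
    m = random.choice(h2)
    v = random.randrange(N)
    if v in m and len(m) > 1 and random.random() < 0.5:
        del m[v]          # remove a literal (never emptying the monomial)
    else:
        m[v] = random.randint(0, 1)  # add or flip a literal
    return h2

def anneal(steps=20000, gamma=4.0):
    # Start from a small random hypothesis of three one-literal monomials.
    h = [{random.randrange(N): random.randint(0, 1)} for _ in range(3)]
    e = energy(h)
    best, best_e = h, e
    for k in range(steps):
        t = gamma / math.log(k + 2)  # logarithmic cooling schedule c(k) = Γ / ln(k + 2)
        h2 = neighbour(h)
        e2 = energy(h2)
        # Metropolis rule: always accept improvements, accept worsenings with prob. e^(-ΔE/t).
        if e2 <= e or random.random() < math.exp((e - e2) / t):
            h, e = h2, e2
            if e < best_e:
                best, best_e = h, e
    return best, best_e
```

In the paper's setting the acceptance criterion is relaxed so that a hypothesis need only reject a given fraction of the negative examples; in this sketch that would amount to stopping once the error on negatives falls below a threshold rather than driving the total energy to zero.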
References
[1] E.H.L. Aarts and J.H.M. Korst. Simulated Annealing and Boltzmann Machines: A Stochastic Approach. Wiley & Sons, New York, 1989.
[2] H. Aizenstein and L. Pitt. On the Learnability of Disjunctive Normal Form Formulas. Machine Learning, 19:183–208, 1995.
[3] A. Albrecht and C.K. Wong. A DNF Approximation Algorithm Based on Inhomogeneous Markov Chains. Submitted for publication, 1999.
[4] A. Albrecht, R. Müller, and M. Patze. A Stochastic Learning Procedure for Boolean Functions. In: Derek Bridge et al., editors, Proc. 10th Annual Irish Conference on Artificial Intelligence & Cognitive Science, pp. 65–71, 1999.
[5] D. Angluin. Queries and Concept Learning. Machine Learning, 2:319–342, 1988.
[6] O. Catoni. Rough Large Deviation Estimates for Simulated Annealing: Applications to Exponential Schedules. The Annals of Probability, 20(3):1109–1146, 1992.
[7] O. Catoni. Metropolis, Simulated Annealing, and Iterated Energy Transformation Algorithms: Theory and Experiments. Journal of Complexity, 12(4):595–623, 1996.
[8] V. Černý. A Thermodynamical Approach to the Travelling Salesman Problem: An Efficient Simulation Algorithm. Preprint, Inst. of Physics and Biophysics, Comenius Univ., Bratislava, 1982 (see also: J. Optim. Theory Appl., 45:41–51, 1985).
[9] P. Clark and T. Niblett. The CN2 Induction Algorithm. Machine Learning, 3:261–283, 1989.
[10] B. Hajek. Cooling Schedules for Optimal Annealing. Mathematics of Operations Research, 13:311–329, 1988.
[11] J. Jackson. An Efficient Membership-Query Algorithm for Learning DNF with Respect to the Uniform Distribution. In Proc. 35th Annual Symposium on Foundations of Computer Science, pp. 42–53, 1994.
[12] M. Kearns, M. Li, L. Pitt, and L.G. Valiant. Recent Results on Boolean Concept Learning. In Proc. 4th Int. Workshop on Machine Learning, pp. 337–352, 1987.
[13] M. Kearns and M. Li. Learning in the Presence of Malicious Errors. In Proc. 20th Annual ACM Symposium on Theory of Computing, pp. 267–279, 1988.
[14] S. Kirkpatrick, C.D. Gelatt, Jr., and M.P. Vecchi. Optimization by Simulated Annealing. Science, 220:671–680, 1983.
[15] E. Kushilevitz and D. Roth. On Learning Visual Concepts and DNF Formulae. Machine Learning, 24:65–85, 1996.
[16] Y. Mansour. An n^{O(log log n)} Learning Algorithm for DNF under the Uniform Distribution. In Proc. 5th Annual Workshop on Computational Learning Theory, pp. 53–61, 1992.
[17] H.D. Mathias. DNF: If You Can't Learn 'em, Teach 'em: An Interactive Model of Teaching. In Proc. 8th Annual Workshop on Computational Learning Theory, pp. 222–229, 1995.
[18] N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller, and E. Teller. Equation of State Calculations by Fast Computing Machines. The Journal of Chemical Physics, 21(6):1087–1092, 1953.
[19] K. Pillaipakkamnatt and V. Raghavan. On the Limits of Proper Learnability of Subclasses of DNF Formulas. In Proc. 7th Annual Workshop on Computational Learning Theory, pp. 118–129, 1994.
[20] R.L. Rivest. Learning Decision Lists. Machine Learning, 2(3):229–246, 1987.
[21] H. Shvaytser. Learnable and Nonlearnable Visual Concepts. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(5):459–466, May 1990.
[22] L.G. Valiant. A Theory of the Learnable. Comm. ACM, 27(11):1134–1142, 1984.
[23] L.G. Valiant. Learning Disjunctions of Conjunctions. In Proc. 9th International Joint Conference on Artificial Intelligence, pp. 560–566, 1985.
[24] K. Verbeurgt. Learning DNF under the Uniform Distribution in Quasi-Polynomial Time. In Proc. 3rd Annual Workshop on Computational Learning Theory, pp. 314–326, 1990.
© 1999 Springer-Verlag Berlin Heidelberg
Albrecht, A., Steinhöfel, K. (1999). A Simulated Annealing-Based Learning Algorithm for Boolean DNF. In: Foo, N. (eds) Advanced Topics in Artificial Intelligence. AI 1999. Lecture Notes in Computer Science(), vol 1747. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46695-9_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66822-0
Online ISBN: 978-3-540-46695-6