Abstract
The PAC framework predicts, as a function of tolerance parameters, the number of samples needed to learn an instance of the representation class kDNF sn of Boolean formulas. When the learning machine is a simple genetic algorithm, the size of the initial population becomes an issue. Using PAC-learning arguments we derive the population size that guarantees, with high probability, at least one individual within a given Hamming distance of the optimum. We then show that the population does not need to be close to the optimum in order to learn the concept.
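The paper's derivation is not reproduced here; the following is a minimal sketch, in Python, of the standard population-sizing argument the abstract alludes to, assuming the initial population is drawn uniformly at random over fixed-length bit strings. The function names prob_within_hamming and population_size and the numeric parameters are illustrative, not taken from the paper.

import math

def prob_within_hamming(n, d):
    # Probability that a uniformly random n-bit string falls within
    # Hamming distance d of a fixed target string.
    return sum(math.comb(n, i) for i in range(d + 1)) / 2 ** n

def population_size(n, d, delta):
    # Smallest N such that a population of N uniformly random n-bit
    # strings contains, with probability at least 1 - delta, at least
    # one individual within Hamming distance d of the target.
    p = prob_within_hamming(n, d)
    # Solve 1 - (1 - p)**N >= 1 - delta  =>  N >= ln(delta) / ln(1 - p)
    return math.ceil(math.log(delta) / math.log(1.0 - p))

# Illustrative numbers (not taken from the paper): 20-bit individuals,
# distance at most 5 from the optimum, 99% confidence.
print(population_size(20, 5, 0.01))

The design point is simply that once the per-individual probability p of landing near the optimum is known, the required population size grows only logarithmically in 1/delta.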
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Aguirre, A.H., Buckles, B.P., Coello, C.C. (2001). GA-Based Learning of kDNF sn Boolean Formulas. In: Liu, Y., Tanaka, K., Iwata, M., Higuchi, T., Yasunaga, M. (eds) Evolvable Systems: From Biology to Hardware. ICES 2001. Lecture Notes in Computer Science, vol 2210. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45443-8_25
DOI: https://doi.org/10.1007/3-540-45443-8_25
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42671-4
Online ISBN: 978-3-540-45443-4