Abstract
Inducing general functions from specific training examples is a central problem in machine learning. Sets of if-then rules are among the most expressive and readable representations for learned hypotheses. Many induction algorithms, such as ID3, AQ, CN2, and their variants, have been proposed to find if-then rules; sequential covering is their core technique. To avoid testing all possible selectors, ID3 uses entropy gain to select the best attribute, AQ constrains the size of the star, and CN2 adopts beam search. These methods speed up induction, but they also filter out many good selectors. In this work, we introduce a new induction algorithm based on enumerating all possible selectors. In contrast to previous work, we use pruning to discard irrelevant selectors while guaranteeing that no good selectors are filtered out. Experimental results demonstrate that, compared with other techniques, the rules produced by our induction algorithm have high consistency and simplicity.
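To make the shared skeleton of these algorithms concrete, the following is a minimal sketch of sequential covering: repeatedly learn one if-then rule (a conjunction of attribute-value selectors), remove the positive examples it covers, and continue until no positives remain. This is an illustrative greedy variant, not the authors' algorithm or the exact ID3/AQ/CN2 procedures; the precision-based selector scoring is an assumption standing in for entropy gain, star constraints, or beam search.

```python
# Illustrative sequential covering (hypothetical simplification, not the
# paper's algorithm). Examples are (attributes_dict, label) pairs; a rule
# is a dict of attribute -> required value, i.e. a conjunction of selectors.

def rule_covers(rule, example):
    """True if the example satisfies every selector in the rule."""
    return all(example[a] == v for a, v in rule.items())

def learn_one_rule(examples, attributes):
    """Greedily add the selector with the highest precision on the
    currently covered examples until the rule covers only positives."""
    rule = {}
    candidates = [(a, v) for a in attributes
                  for v in {e[a] for e, _ in examples}]
    while True:
        covered = [(e, y) for e, y in examples if rule_covers(rule, e)]
        if covered and all(y for _, y in covered):
            return rule  # consistent: covers only positive examples
        best, best_prec = None, -1.0
        for a, v in candidates:
            if a in rule:
                continue
            cov = [(e, y) for e, y in covered if e[a] == v]
            if not cov:
                continue
            prec = sum(y for _, y in cov) / len(cov)
            if prec > best_prec:
                best, best_prec = (a, v), prec
        if best is None:
            return rule  # no selector improves the rule further
        rule[best[0]] = best[1]

def sequential_covering(examples, attributes):
    """Learn rules one at a time, removing covered positives after each."""
    rules, remaining = [], list(examples)
    while any(y for _, y in remaining):
        rule = learn_one_rule(remaining, attributes)
        if not rule:
            break
        rules.append(rule)
        remaining = [(e, y) for e, y in remaining
                     if not (rule_covers(rule, e) and y)]
    return rules
```

The greedy selector choice is exactly where the algorithms named above diverge: a scheme that scores only a few candidates (entropy gain, a bounded star, a beam) is fast but can discard good selectors, which is the trade-off the paper's enumeration-with-pruning approach targets.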
References
Beyer, K.S., Goldstein, J., Ramakrishnan, R., Shaft, U.: When Is "Nearest Neighbor" Meaningful? In: Beeri, C., Buneman, P. (eds.) ICDT 1999. LNCS, vol. 1540, pp. 217–235. Springer, Heidelberg (1998)
Clark, P., Boswell, R.: Rule induction with CN2: some recent improvements. In: Kodratoff, Y. (ed.) EWSL 1991. LNCS, vol. 482, pp. 151–163. Springer, Heidelberg (1991)
Divina, F., Marchiori, E.: Evolutionary Concept Learning. In: GECCO, pp. 343–350 (2002)
Fayyad, U.M., Irani, K.B.: Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning. In: IJCAI 1993, pp. 1022–1027 (1993)
De Jong, K., Spears, W.: Using Genetic Algorithms for Concept Learning. Machine Learning 13, 161–188 (1993)
Michalski, R.S.: On the quasi-minimal solution of the general covering problem. In: Proceedings of ISIP, vol. A3, pp. 125–128 (1969)
Michalski, R.S., Carbonell, J.G., Mitchell, T.M.: Machine Learning: An Artificial Intelligence Approach. Morgan Kaufmann Publishers, Inc., San Francisco (1983)
Mitchell, T.: Machine learning. McGraw-Hill, New York (1997)
Michalski, R.S., Mozetic, I., Hong, J., Lavrac, N.: The AQ15 inductive learning system: An overview and experiments. Technical Report UIUCDCS-R-86-1260, University of Illinois, Urbana-Champaign, IL (1986)
Potter, M.A.: The Coevolution of Antibodies for Concept Learning. In: Proceedings of the 5th International Conference on Parallel Problem Solving from Nature, pp. 530–539 (1998)
Quinlan, J.R.: Induction of Decision Trees. Machine Learning 1(1), 81–106 (1986)
Winston, P.: C4.5 Decision Tree (2002), http://www2.cs.uregina.ca/~hamilton/
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
An, J., Chen, Y.-P.P. (2005). Yet Another Induction Algorithm. In: Khosla, R., Howlett, R.J., Jain, L.C. (eds.) Knowledge-Based Intelligent Information and Engineering Systems. KES 2005. Lecture Notes in Computer Science, vol. 3682. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11552451_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28895-4
Online ISBN: 978-3-540-31986-3