Abstract
In this paper we describe an improved version of ILA, a novel rule induction algorithm. We first outline the basic algorithm and then show how it is enhanced with a new evaluation metric that handles uncertainty in a given data set. Besides inducing rules faster than the original algorithm, our main contribution is this new metric, which lets users express their preferences through a penalty factor. We use the penalty factor to counter the over-fitting bias inherent in a great many inductive algorithms. We compare the improved algorithm, ILA-2, against a variety of induction algorithms, including ID3, OC1, C4.5, CN2, and ILA. Our preliminary experiments indicate that ILA-2 is comparable to well-known algorithms such as CN2 and C4.5 in terms of accuracy and rule-set size.
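The penalty-factor idea can be illustrated with a minimal sketch. The function name and exact formula below are our own illustration, not the paper's definition: we assume a rule is scored by the positive examples it covers minus a user-chosen penalty times the negative examples it covers, so a larger penalty factor biases induction away from noisy, over-fitted rules.

```python
def rule_score(true_positives: int, false_positives: int,
               penalty_factor: float) -> float:
    """Hypothetical penalty-weighted rule evaluation: reward covered
    positives, subtract penalty_factor times covered negatives."""
    return true_positives - penalty_factor * false_positives

# A noisy candidate rule covering 10 positives and 3 negatives:
print(rule_score(10, 3, 1.0))  # 7.0
print(rule_score(10, 3, 2.0))  # 4.0 -- a stricter penalty lowers the score
```

Under this sketch, raising the penalty factor makes the learner prefer rules that cover fewer negatives even at the cost of covering fewer positives, which is one way to trade training-set fit for generality.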
References
Clark, P., & Niblett, T., (1989). “The CN2 Induction Algorithm”, Machine Learning, 3, pp. 261–283.
Deogun, J. S., Raghavan, V. V., Sarkar, A., & Sever, H., (1997). “Data Mining: Trends in Research and Development”, Rough Sets and Data Mining: Analysis for Imprecise Data. (T. Y. Lin and N. Cercone, Eds), Kluwer Academic Publishers.
Fayyad, U. M., (1996). “Data Mining and Knowledge Discovery: Making Sense Out of Data”, IEEE Expert, October, pp. 20–25.
Kohavi, R., Sommerfield, D., & Dougherty, J., (1996). “Data Mining Using MLC++: A Machine Learning Library in C++”, Tools with AI, pp. 234–245.
Langley, P., (1996). Elements of Machine Learning. San Francisco: Morgan Kaufmann Publishers.
Matheus, C. J., Chan, P. K., & Piatetsky-Shapiro, G., (1993). “Systems for Knowledge Discovery in Databases”, IEEE Trans. on Knowledge and Data Engineering, 5(6), pp. 903–912.
Merz, C. J., & Murphy, P. M., (1997). UCI Repository of Machine Learning Databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, Irvine, CA: University of California, Department of Information and Computer Science.
Murthy, S. K., Kasif, S., & Salzberg, S., (1994). “A System for Induction of Oblique Decision Trees”, Journal of Artificial Intelligence Research, 2, pp. 1–32.
Quinlan, J. R., (1986). “Induction of Decision Trees”, Machine Learning, 1, pp. 81–106.
Quinlan, J.R., (1993). C4.5: Programs for Machine Learning. Philadelphia, PA: Morgan Kaufmann.
Quinlan, J.R., (1994). “The Minimum Description Length Principle and Categorical Theories”, Proceedings of the 11th International Conference on Machine Learning, pp. 233–241.
Salzberg, S., (1995). “On Comparing Classifiers: A Critique of Current Research and Methods”, Technical Report JHU-5/06, Department of Computer Science, Johns Hopkins University, May 1995.
Simoudis, E., (1996). “Reality Check for Data Mining”, IEEE Expert, October, pp. 26–33.
Tolun, M.R., & Abu-Soud, S.M., (1998). “ILA: An Inductive Learning Algorithm for Rule Extraction”, to appear in Expert Systems with Applications.
Zadeh, L. A., (1994). “Soft Computing and Fuzzy Logic”, IEEE Software, pp. 48–56.
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Tolun, M.R., Sever, H., Uludağ, M. (1998). Improved rule discovery performance on uncertainty. In: Wu, X., Kotagiri, R., Korb, K.B. (eds) Research and Development in Knowledge Discovery and Data Mining. PAKDD 1998. Lecture Notes in Computer Science, vol 1394. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-64383-4_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-64383-8
Online ISBN: 978-3-540-69768-8