Abstract
This paper proposes a new pruning method for classification trees based on an impurity-complexity measure. Advantages of the proposed approach over the error-complexity pruning method are outlined and illustrated with an example on a real data set.
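The abstract alone does not spell out the algorithm, but the contrast with error-complexity pruning suggests a weakest-link scheme in the style of Breiman et al. (1984), with a node impurity measure (e.g., Gini) taking the place of the misclassification error in the complexity trade-off. The Python sketch below is a minimal illustration under that assumption; the Node structure and the gini, subtree_impurity_and_leaves, and weakest_link helpers are hypothetical and not drawn from the paper.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    counts: List[int]                 # class frequencies of the cases reaching this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

    @property
    def is_leaf(self) -> bool:
        return self.left is None and self.right is None

def gini(counts: List[int]) -> float:
    """Un-normalised Gini impurity: n * (1 - sum of squared class proportions)."""
    n = sum(counts)
    return n * (1.0 - sum((c / n) ** 2 for c in counts))

def subtree_impurity_and_leaves(t: Node):
    """Total impurity over the leaves of the subtree rooted at t, and the leaf count."""
    if t.is_leaf:
        return gini(t.counts), 1
    li, ln = subtree_impurity_and_leaves(t.left)
    ri, rn = subtree_impurity_and_leaves(t.right)
    return li + ri, ln + rn

def weakest_link(t: Node):
    """Internal node minimising g(t) = (I(t) - I(T_t)) / (|leaves(T_t)| - 1),
    i.e. the smallest impurity reduction per extra leaf (assumed criterion)."""
    best = None

    def visit(node: Node) -> None:
        nonlocal best
        if node.is_leaf:
            return
        sub_imp, leaves = subtree_impurity_and_leaves(node)
        g = (gini(node.counts) - sub_imp) / (leaves - 1)
        if best is None or g < best[0]:
            best = (g, node)
        visit(node.left)
        visit(node.right)

    visit(t)
    return best  # (g value, node to collapse), or None if t is already a leaf

def prune_once(root: Node) -> Node:
    """Collapse the weakest link into a leaf; iterating yields a nested pruning sequence."""
    link = weakest_link(root)
    if link is not None:
        _, node = link
        node.left = node.right = None
    return root
```

As in CART's error-complexity pruning, iterating prune_once produces a nested sequence of subtrees from which a final tree size would be selected, for instance by cross-validation.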
References
Breiman, L., Friedman, J. H., Olshen, R. A. & Stone, C. J. (1984). Classification and Regression Trees. Wadsworth, Belmont CA.
Cappelli, C. & Siciliano, R. (1997). On simplification methods for decision trees. In: Proceedings of the Ngus-97 Conference, 104–108.
Esposito, F., Malerba, D. & Semeraro, G. (1993). Decision tree pruning as a search in the state space. In: Machine Learning: ECML-93, Lecture Notes in Artificial Intelligence (ed. P. Brazdil), 165–184. Berlin: Springer-Verlag.
Light, R.J. & Margolin, B.H. (1971). An analysis of variance for categorical data. Journal of the American Statistical Association, 66, 534–544.
Mingers, J. (1989). An empirical comparison of pruning methods for decision tree induction. Machine Learning, 4, 227–243.
Mola, F. & Siciliano, R. (1997). A fast splitting procedure for classification trees. Statistics and Computing, 7, 208–216.
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Cappelli, C., Mola, F., Siciliano, R. (1998). An Alternative Pruning Method Based on the Impurity-Complexity Measure. In: Payne, R., Green, P. (eds) COMPSTAT. Physica, Heidelberg. https://doi.org/10.1007/978-3-662-01131-7_25
Publisher Name: Physica, Heidelberg
Print ISBN: 978-3-7908-1131-5
Online ISBN: 978-3-662-01131-7