Abstract
The decision tree is one of the most popular data mining algorithms and has been applied in many classification domains. In many applications, an attribute may take hundreds of distinct values, which makes the resulting tree difficult to analyze. This paper focuses on the construction of categorical decision trees. A binary splitting decision tree algorithm is proposed to simplify the classification outcomes. It adopts the complement operation to simplify the splits at interior nodes and is well suited to decision trees where the number of outcomes is large. In addition, meta-attributes can be applied in applications where the number of outcomes is large and a meta-attribute is meaningful. The benefit of the meta-attribute representation is that it maps the original attribute values into higher-level concepts, which reduces the number of outcomes.
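To make the two ideas in the abstract concrete, the following is a minimal sketch (not the authors' algorithm; the entropy criterion, function names, and the one-value-versus-complement split are assumptions for illustration). It picks a binary split of a categorical attribute into a single value and its complement by minimizing weighted entropy, and shows how a hypothetical meta-attribute mapping collapses many values into fewer higher-level concepts.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_binary_split(rows):
    """rows: list of (attribute_value, class_label) pairs.
    Try each candidate split {v} vs. its complement and return the
    value whose split gives the lowest weighted entropy."""
    best_v, best_h = None, float("inf")
    for v in {x for x, _ in rows}:
        left = [y for x, y in rows if x == v]
        right = [y for x, y in rows if x != v]
        if not right:  # attribute has a single value; no split possible
            continue
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(rows)
        if h < best_h:
            best_v, best_h = v, h
    return best_v, best_h

# Hypothetical meta-attribute: map many city values to a few regions,
# shrinking the number of split outcomes.
region = {"Taipei": "Asia", "Tokyo": "Asia", "Paris": "Europe", "Berlin": "Europe"}
rows = [("Taipei", 1), ("Tokyo", 1), ("Paris", 0), ("Berlin", 0)]
meta_rows = [(region[x], y) for x, y in rows]  # 4 values reduced to 2
```

With `rows = [("a", 1), ("a", 1), ("b", 0), ("c", 0)]`, `best_binary_split` chooses `{"a"}` versus its complement, since that split separates the classes perfectly (weighted entropy 0).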
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Liu, CL., Lee, CH. (2008). Simplify Multi-valued Decision Trees. In: Kang, L., Cai, Z., Yan, X., Liu, Y. (eds) Advances in Computation and Intelligence. ISICA 2008. Lecture Notes in Computer Science, vol 5370. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-92137-0_64
DOI: https://doi.org/10.1007/978-3-540-92137-0_64
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-92136-3
Online ISBN: 978-3-540-92137-0
eBook Packages: Computer Science (R0)