Abstract
The decision tree is one of the most widely used machine learning methods, and the ordinal decision tree is an important tool for ordinal classification problems. Our study and analysis of rank entropy-based ordinal decision trees shows that, when selecting an expanded attribute, the rank mutual information must be computed for every cut-point of each continuous-valued attribute; these values are then compared, and the maximum determines the expanded attribute. Because this computation is expensive, we propose an improved algorithm that builds a mathematical model of the cut-point selection. We prove theoretically that the improved algorithm needs to traverse only the unstable cut-points and can skip the computation at stable cut-points, which greatly improves the efficiency of constructing decision trees. Experiments confirm that the improved algorithm substantially reduces computation time.
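As a rough illustration of the idea in the abstract (not the authors' code), the cut-point selection step and the unstable cut-point shortcut can be sketched in Python. The helper names `rmi`, `candidate_cuts`, `unstable_cuts`, and `best_cut` are our own; the rank mutual information below follows the dominance-set formulation of Hu et al. for ascending ordinal labels, and an unstable cut-point is taken to be one whose two adjacent samples (in attribute order) carry different labels.

```python
import math

def rmi(binary_attr, labels):
    """Ascending rank mutual information between a binarized attribute and
    ordinal labels, via dominance sets (hypothetical helper after Hu et al.)."""
    n = len(labels)
    total = 0.0
    for i in range(n):
        # Dominance sets: samples ranked at least as high as sample i.
        A = {j for j in range(n) if binary_attr[j] >= binary_attr[i]}
        D = {j for j in range(n) if labels[j] >= labels[i]}
        inter = len(A & D)  # always >= 1, since i belongs to both sets
        total += math.log(len(A) * len(D) / (n * inter))
    return -total / n

def candidate_cuts(values):
    """All candidate cuts: midpoints between consecutive distinct values."""
    s = sorted(set(values))
    return [(s[k] + s[k + 1]) / 2 for k in range(len(s) - 1)]

def unstable_cuts(values, labels):
    """Only the cuts whose neighboring samples (sorted by attribute value)
    have different labels -- the points the improved algorithm traverses."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    cuts = []
    for k in range(len(order) - 1):
        i, j = order[k], order[k + 1]
        if values[i] != values[j] and labels[i] != labels[j]:
            cuts.append((values[i] + values[j]) / 2)
    return cuts

def best_cut(values, labels, cuts):
    """Score each cut by rank mutual information and return the maximum."""
    scored = [(rmi([0 if v <= c else 1 for v in values], labels), c)
              for c in cuts]
    return max(scored)
```

Under the paper's theorem, scanning only `unstable_cuts` yields the same maximizing cut as scanning all of `candidate_cuts`, while evaluating far fewer candidates; the brute-force `rmi` loop here is O(n²) per cut, so skipping stable cut-points is exactly where the savings come from.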
© 2014 Springer-Verlag Berlin Heidelberg
Chen, J., Zhai, J., Wang, X. (2014). Study and Improvement of Ordinal Decision Trees Based on Rank Entropy. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_22
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-45651-4
Online ISBN: 978-3-662-45652-1