Abstract
Traditional decision tree algorithms for interval-valued data can handle only non-ordinal classification problems. In this paper, we present an algorithm for ordinal classification problems in which both the interval-valued condition attributes and the decision attribute satisfy a monotonicity requirement. The algorithm uses rank mutual information to select the extended attributes, which guarantees that the output decision tree is monotonic. The proposed algorithm is illustrated with a numerical example, on which it generates a monotonically consistent decision tree. The design of the algorithm offers useful guidelines for extending real-valued attributes to interval-valued attributes in ordinal decision tree induction.
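The rank mutual information criterion referred to in the abstract can be sketched for the simpler real-valued case (following Hu et al.'s formulation for ordinal data): for each sample, form the set of samples ranked no higher than it under the attribute and under the label, and average the log-ratio of the joint to the marginal set sizes. This is an illustrative sketch only; the paper's contribution is the extension to interval-valued attributes, which additionally requires an order relation on intervals that is not reproduced here.

```python
import math

def rank_mutual_information(a, d):
    """Rank mutual information between a real-valued attribute `a`
    and ordinal labels `d` (sketch of the criterion the paper adapts
    to interval-valued attributes)."""
    n = len(a)
    rmi = 0.0
    for i in range(n):
        # dominated sets: samples ranked no higher than sample i
        sa = {j for j in range(n) if a[j] <= a[i]}
        sd = {j for j in range(n) if d[j] <= d[i]}
        inter = len(sa & sd)
        rmi += -math.log(len(sa) * len(sd) / (n * inter))
    return rmi / n

# An attribute that orders consistently with the labels scores high;
# one whose ranking contradicts the labels scores low.
print(rank_mutual_information([1.0, 2.0, 3.0, 4.0], [1, 1, 2, 2]))
```

In an induction step, the attribute (and cut point) maximizing this quantity would be chosen as the split, which is what keeps the resulting tree monotonic.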
Copyright information
© 2014 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhu, H., Zhai, J., Wang, S., Wang, X. (2014). Monotonic Decision Tree for Interval Valued Data. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_24
Print ISBN: 978-3-662-45651-4
Online ISBN: 978-3-662-45652-1