
Monotonic Decision Tree for Interval Valued Data

  • Conference paper

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 481))

Abstract

Traditional decision tree algorithms for interval-valued data can only deal with non-ordinal classification problems. In this paper, we present an algorithm for ordinal classification problems in which the condition attributes take interval values and, together with the decision attribute, satisfy the monotonicity requirement. The algorithm uses rank mutual information to select expanded attributes, which guarantees that the output decision tree is monotonic. The proposed algorithm is illustrated with a numerical example, on which a monotonically consistent decision tree is generated. The design of the algorithm provides some useful guidelines for extending ordinal decision tree induction from real-valued to interval-valued attributes.
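The rank mutual information criterion mentioned in the abstract can be sketched as follows. This is an illustrative implementation (in the style of Hu et al.'s rank-entropy definition), not the authors' code: it assumes real-valued attributes for simplicity, whereas for interval-valued attributes the comparison `attr[j] >= attr[i]` would be replaced by a dominance relation between intervals (e.g., one based on a possibility degree of interval comparison). The function name `rank_mutual_info` is hypothetical.

```python
import math

def rank_mutual_info(attr, label):
    """Rank mutual information RMI(A; D) between one ordinal attribute
    and the class labels.

    For each sample x_i, the dominance set under A is the set of samples
    whose attribute value is at least attr[i]; likewise under D.  RMI
    averages -log of the ratio between the product of the two dominance
    set sizes and n times their intersection size.
    """
    n = len(attr)
    total = 0.0
    for i in range(n):
        a_set = {j for j in range(n) if attr[j] >= attr[i]}    # dominance set under A
        d_set = {j for j in range(n) if label[j] >= label[i]}  # dominance set under D
        inter = len(a_set & d_set)  # x_i belongs to both sets, so inter >= 1
        total += math.log(len(a_set) * len(d_set) / (n * inter))
    return -total / n

# A perfectly monotonic attribute carries high rank information,
# while a constant attribute carries none.
print(round(rank_mutual_info([1, 2, 3, 4], [1, 2, 3, 4]), 2))  # 0.59
print(abs(rank_mutual_info([5, 5, 5, 5], [1, 2, 3, 4])))       # 0.0
```

At each node, the attribute (and cut) maximizing this quantity would be selected for expansion, which is what enforces monotonicity of the resulting tree.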



Author information

Correspondence to Junhai Zhai.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhu, H., Zhai, J., Wang, S., Wang, X. (2014). Monotonic Decision Tree for Interval Valued Data. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_24


  • DOI: https://doi.org/10.1007/978-3-662-45652-1_24


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-45651-4

  • Online ISBN: 978-3-662-45652-1

  • eBook Packages: Computer Science (R0)
