
A Novel Attributes Partition Method for Decision Tree

Conference paper

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 212)

Abstract

During the construction of a decision tree, one frequently needs to find the optimal partition of the elements with different values of a categorical attribute at a node, i.e., the best grouping of the attribute's n values. This requires searching over all partitions for the one with minimal impurity, which takes time exponential in n. We present a new heuristic search algorithm, SORT_DP, which finds an effective partition in time polynomial in n. The method uses a mapping from the class probability space to its sub-spaces together with the technique of dynamic programming. Experiments comparing its performance against other methods demonstrate the effectiveness of the new approach.
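The abstract does not give the details of SORT_DP, so the sketch below is only a hedged illustration of the underlying partition problem, not the authors' algorithm: it evaluates the Gini impurity of binary partitions of a categorical attribute's values and, for the two-class case, restricts the search to the n - 1 contiguous splits obtained after sorting the values by the empirical probability of class 1 (the classical CART ordering result). All function names and the toy data are hypothetical.

```python
# Sketch of the node-splitting problem addressed by the paper: given a
# categorical attribute with n distinct values, find a binary partition of
# those values that minimises impurity (Gini here). Exhaustive search examines
# 2^(n-1) - 1 splits; for a two-class problem, sorting the values by
# P(class 1 | value) and checking only the n - 1 contiguous splits suffices.
# This is NOT the authors' SORT_DP algorithm, whose details are not given in
# the abstract; it only illustrates the setting and an ordering-based heuristic.
from collections import Counter
from typing import Dict, Hashable, List, Tuple


def gini(counts: Counter) -> float:
    """Gini impurity of a class-count distribution."""
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return 1.0 - sum((c / total) ** 2 for c in counts.values())


def best_two_class_split(
    value_class_counts: Dict[Hashable, Counter]
) -> Tuple[List[Hashable], float]:
    """Return (left-branch values, weighted impurity) of the best binary split.

    Assumes exactly two classes, labelled 0 and 1; attribute values are sorted
    by the empirical probability of class 1 and only contiguous prefixes of
    that ordering are tried.
    """
    # Order attribute values by P(class 1 | value).
    ordered = sorted(
        value_class_counts,
        key=lambda v: value_class_counts[v][1] / sum(value_class_counts[v].values()),
    )
    total_n = sum(sum(c.values()) for c in value_class_counts.values())

    best_split: List[Hashable] = []
    best_impurity = float("inf")
    left: Counter = Counter()
    right: Counter = Counter()
    for c in value_class_counts.values():
        right.update(c)

    # Try the n - 1 contiguous splits along the ordering.
    for i, v in enumerate(ordered[:-1]):
        left.update(value_class_counts[v])
        right.subtract(value_class_counts[v])
        n_left, n_right = sum(left.values()), sum(right.values())
        impurity = (n_left * gini(left) + n_right * gini(right)) / total_n
        if impurity < best_impurity:
            best_split, best_impurity = ordered[: i + 1], impurity
    return best_split, best_impurity


if __name__ == "__main__":
    # Toy data: class counts per categorical value (hypothetical numbers).
    data = {
        "red":   Counter({0: 8, 1: 2}),
        "green": Counter({0: 3, 1: 7}),
        "blue":  Counter({0: 5, 1: 5}),
        "grey":  Counter({0: 1, 1: 9}),
    }
    split, imp = best_two_class_split(data)
    print("left branch:", split, "weighted Gini:", round(imp, 4))
```

For more than two classes this ordering argument no longer guarantees optimality, which is the regime where heuristics such as the paper's SORT_DP (per the abstract, combining a mapping of the class probability space into sub-spaces with dynamic programming) become relevant.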



Author information

Correspondence to Aili Han.

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, Z., Han, A., Han, F. (2013). A Novel Attributes Partition Method for Decision Tree. In: Yin, Z., Pan, L., Fang, X. (eds) Proceedings of The Eighth International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA), 2013. Advances in Intelligent Systems and Computing, vol 212. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37502-6_52

  • DOI: https://doi.org/10.1007/978-3-642-37502-6_52

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-37501-9

  • Online ISBN: 978-3-642-37502-6
