Induction of Decision Trees Using an Internal Control of Induction

  • Conference paper
Computational Intelligence and Bioinspired Systems (IWANN 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3512)

Abstract

In this paper we present CIDIM (Control of Induction by sample DIvision Method), an algorithm developed to induce small and accurate decision trees from a set of examples. It uses an internal control of induction to stop tree growth and avoid overfitting. Other techniques, such as dichotomic division and the grouping of consecutive values, are used to improve the performance of the algorithm. CIDIM has been successfully compared with ID3 and C4.5: in almost every experiment it induces trees that are significantly better than those induced by either algorithm.
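The central idea named in the title and abstract, an internal control of induction based on sample division, can be illustrated with a small sketch: the training examples are divided into a construction subset (used to choose splits) and a control subset (used to decide whether a candidate split actually improves accuracy, so that induction stops before the tree overfits). Everything below is an illustrative assumption, not the authors' CIDIM implementation; a control-set accuracy check simply stands in for the paper's internal control, and the split-selection rule is a deliberately simple majority-label rule.

```python
from collections import Counter

def majority(labels):
    """Most frequent class label in a non-empty list."""
    return Counter(labels).most_common(1)[0][0]

def induce(examples, attributes, control):
    """Recursively induce a tree from (features, label) pairs.

    A split on an attribute is accepted only if it raises accuracy on the
    held-out control subset; otherwise induction stops at a leaf (the
    'internal control of induction' in this hypothetical sketch).
    """
    default = majority([label for _, label in examples])
    # Accuracy of predicting the majority class on the control subset.
    base = sum(label == default for _, label in control) / len(control)
    best_attr, best_acc = None, base
    for attr in attributes:
        # Majority label per attribute value, from the construction subset.
        by_value = {}
        for features, label in examples:
            by_value.setdefault(features[attr], []).append(label)
        rule = {v: majority(ls) for v, ls in by_value.items()}
        acc = sum(rule.get(f[attr], default) == label
                  for f, label in control) / len(control)
        if acc > best_acc:
            best_attr, best_acc = attr, acc
    if best_attr is None:
        return ("leaf", default)   # internal control: no split helps, stop
    branches = {}
    for v in {f[best_attr] for f, _ in examples}:
        sub_ex = [(f, l) for f, l in examples if f[best_attr] == v]
        sub_ct = [(f, l) for f, l in control if f[best_attr] == v]
        if not sub_ct:             # no control examples reach this branch
            branches[v] = ("leaf", majority([l for _, l in sub_ex]))
        else:
            rest = [a for a in attributes if a != best_attr]
            branches[v] = induce(sub_ex, rest, sub_ct)
    return ("node", best_attr, branches, default)

def classify(tree, features):
    """Walk the induced tree; fall back to the node default on unseen values."""
    if tree[0] == "leaf":
        return tree[1]
    _, attr, branches, default = tree
    sub = branches.get(features[attr])
    return classify(sub, features) if sub is not None else default
```

Because a split is kept only when it beats the majority-class baseline on the control subset, an attribute carrying no predictive signal never grows the tree, which is the small-and-accurate behaviour the abstract describes.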

This work has been partially supported by the MOISES project, number TIC2002-04019-C03-02, of the MCyT, Spain.


References

  1. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth (1984)
  2. Quinlan, J.R.: Induction of decision trees. Machine Learning 1, 81–106 (1986)
  3. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)
  4. Utgoff, P.E., Berkman, N.C., Clouse, J.A.: Decision tree induction based on efficient tree restructuring. Machine Learning 29, 5–44 (1997)
  5. Folino, G., Pizzuti, C., Spezzano, G.: A cellular genetic programming approach to classification. In: Proceedings of the Genetic and Evolutionary Computation Conference, vol. 2, pp. 1015–1020. Morgan Kaufmann Publishers Inc., San Francisco (1999)
  6. Hyafil, L., Rivest, R.L.: Constructing optimal binary decision trees is NP-complete. Information Processing Letters 5, 15–17 (1976)
  7. Gehrke, J., Ganti, V., Ramakrishnan, R., Loh, W.: BOAT – optimistic decision tree construction. In: Proceedings of the 1999 ACM SIGMOD International Conference on Management of Data, pp. 169–180. ACM Press, New York (1999)
  8. Ramos-Jiménez, G., Morales-Bueno, R., Villalba-Soria, A.: CIDIM. Control of induction by sample division methods. In: Proceedings of the International Conference on Artificial Intelligence (IC-AI 2000), Las Vegas, pp. 1083–1087 (2000)
  9. Ruiz-Gómez, J., Ramos-Jiménez, G., Villalba-Soria, A.: Modelling based on rule induction learning. In: Computers and Computational Engineering in Control, pp. 158–163. World Scientific and Engineering Society Press, Greece (1999)
  10. Jerez-Aragonés, J.M., Gómez-Ruiz, J.A., Ramos-Jiménez, G., Muñoz-Pérez, J., Alba-Conejo, E.: A combined neural network and decision trees model for prognosis of breast cancer relapse. Artificial Intelligence in Medicine 27, 45–63 (2003)
  11. Blake, C., Merz, C.J.: UCI repository of machine learning databases. University of California, Department of Information and Computer Science (2000)
  12. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools with Java Implementations. Morgan Kaufmann, San Francisco (2000)
  13. Herrera, F., Hervás, C., Otero, J., Sánchez, L.: Un estudio empírico preliminar sobre los tests estadísticos más habituales en el aprendizaje automático [A preliminary empirical study of the most common statistical tests in machine learning]. In: Tendencias de la Minería de Datos en España. Red Española de Minería de Datos, pp. 403–412 (2004)
  14. R Development Core Team: R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria (2004). ISBN 3-900051-07-0, http://www.R-project.org


Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ramos-Jiménez, G., del Campo-Ávila, J., Morales-Bueno, R. (2005). Induction of Decision Trees Using an Internal Control of Induction. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_97

  • DOI: https://doi.org/10.1007/11494669_97

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26208-4

  • Online ISBN: 978-3-540-32106-4
