RBDT-1: A New Rule-Based Decision Tree Generation Technique

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 5858)

Abstract

Most methods that generate decision trees use examples of data instances in the tree generation process. This paper proposes a method called "RBDT-1" (rule-based decision tree) for learning a decision tree from a set of decision rules that cover the data instances, rather than from the data instances themselves. The method's goal is to create, on demand, a short and accurate decision tree from a stable or dynamically changing set of rules. We conduct a comparative study of RBDT-1 with three existing decision tree methods on different problems. The outcome of the study shows that RBDT-1 outperforms AQDT-1 and AQDT-2, which are rule-based decision tree methods, in terms of tree complexity (number of nodes and leaves in the decision tree). It is also shown that RBDT-1 performs equally well in terms of tree complexity compared with C4.5, which generates a decision tree from data examples.
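
For readers unfamiliar with the rule-based (as opposed to instance-based) approach, the Python sketch below illustrates the general idea only: it grows a tree directly from a set of rules, routing rules that do not mention the splitting attribute ("don't care" rules) down every branch. The attribute-selection heuristic used here (pick the most frequently referenced attribute) and the toy RULES set are assumptions made purely for illustration; they are not the RBDT-1 selection criteria described in the paper.

    from collections import Counter

    # A rule is (conditions, decision); conditions map attribute -> value.
    # Toy rule set, hypothetical, for illustration only.
    RULES = [
        ({"outlook": "sunny", "humidity": "normal"}, "play"),
        ({"outlook": "overcast"}, "play"),
        ({"outlook": "rain", "wind": "strong"}, "dont_play"),
        ({"outlook": "sunny", "humidity": "high"}, "dont_play"),
    ]

    def build_tree(rules, attributes):
        decisions = {d for _, d in rules}
        if len(decisions) == 1 or not attributes:
            # All rules agree (or no attributes remain): emit a leaf.
            return next(iter(decisions))
        # Stand-in selection criterion: attribute referenced by the most rules.
        counts = Counter(a for conds, _ in rules for a in conds if a in attributes)
        if not counts:
            # No remaining attribute distinguishes the rules: majority decision.
            return Counter(d for _, d in rules).most_common(1)[0][0]
        best = counts.most_common(1)[0][0]
        values = {conds[best] for conds, _ in rules if best in conds}
        branches = {}
        for v in values:
            # A rule that omits `best` is a "don't care" and follows every branch.
            subset = [(c, d) for c, d in rules if c.get(best, v) == v]
            branches[v] = build_tree(subset, [a for a in attributes if a != best])
        return (best, branches)

    tree = build_tree(RULES, ["outlook", "humidity", "wind"])
    print(tree)

Running the sketch on the toy rules splits first on "outlook" and then on "humidity", producing a tree without ever consulting individual data instances; RBDT-1 applies the same tree-from-rules principle with its own attribute-selection criteria.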

References

  1. Imam, I.F.: An Empirical Comparison between Learning Decision Trees From Examples and From Decision Rules. In: 9th International Symposium on Methodologies for Intelligent Systems, Zakopane (1996)

  2. Quinlan, J.R.: Discovering rules by induction from large collections of examples. In: Michie, D. (ed.) Expert Systems in the Microelectronic Age, pp. 168–201. Edinburgh University Press (1979)

  3. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth Int. Group, Belmont (1984)

  4. Cestnik, B., Karalič, A.: The Estimation of Probabilities in Attribute Selection Measures for Decision Structure Induction. In: Proceedings of the European Summer School on Machine Learning, pp. 22–31. Priory Corsendonk, Belgium (1991)

  5. Mingers, J.: An Empirical Comparison of Selection Measures for Decision-Tree Induction. Machine Learning 3(3), 319–342 (1989)

  6. Imam, I.F., Michalski, R.S.: Learning Decision Trees from Decision Rules: A Method and Initial Results from a Comparative Study. J. Intell. Inf. Syst. 2(3), 279–304 (1993)

  7. Witten, I.H., MacDonald, B.A.: Using Concept Learning for Knowledge Acquisition. Int. J. Man-Mach. Stud., 349–370 (1988)

  8. Michalski, R.S., Imam, I.F.: Learning Problem-Oriented Decision Structures From Decision Rules: the AQDT-2 System. In: Raś, Z.W., Zemankova, M. (eds.) ISMIS 1994. LNCS (LNAI), vol. 869, pp. 416–426. Springer, Heidelberg (1994)

  9. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo (1993)

  10. Akiba, Y., Kaneda, S., Almuallim, H.: Turning Majority Voting Classifiers Into A Single Decision Tree. In: 10th IEEE International Conference on Tools with Artificial Intelligence, pp. 224–230 (1998)

  11. Chen, Y., Hung, L.T.: Using Decision Trees to Summarize Associative Classification Rules. Expert Syst. Appl. 36(2), 2338–2351 (2009)

  12. Michalski, R.S., Kaufman, K.: The AQ19 System for Machine Learning And Pattern Discovery: A General Description And User’s Guide. In: Reports of the Machine Learning and Inference Laboratory, MLI 01-2. George Mason University, Fairfax (2001)

  13. Michalski, R.S., Imam, I.F.: On Learning Decision Structures. Fundamenta Informaticae 31(1), 49–64 (1997)

  14. Michalski, R.S., Mozetic, I., Hong, J., Lavrac, N.: The Multi-Purpose Incremental Learning System AQ15 and its Testing Application to Three Medical Domains. In: Proceedings of AAAI 1986, Philadelphia, PA, pp. 1041–1045 (1986)

  15. Bergadano, F., Matwin, S., Michalski, R.S., Zhang, J.: Learning Two-tiered Descriptions of Flexible Concepts: The POSEIDON System. Machine Learning 8(1), 5–43 (1992)

  16. Colton, S.: Online Document (2004), http://www.doc.ic.ac.uk/~sgc/teaching/v231/lecture11.html

  17. Asuncion, A., Newman, D.J.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine (2007), http://www.ics.uci.edu/~mlearn/MLRepository.html

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abdelhalim, A., Traore, I., Sayed, B. (2009). RBDT-1: A New Rule-Based Decision Tree Generation Technique. In: Governatori, G., Hall, J., Paschke, A. (eds) Rule Interchange and Applications. RuleML 2009. Lecture Notes in Computer Science, vol 5858. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04985-9_12

  • DOI: https://doi.org/10.1007/978-3-642-04985-9_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04984-2

  • Online ISBN: 978-3-642-04985-9

  • eBook Packages: Computer Science (R0)
