A “Top-Down and Prune” Induction Scheme for Constrained Decision Committees

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1642))

Abstract

It was previously argued that Decision Tree learning algorithms such as CART or C4.5 can also be used to build small and accurate Decision Lists. In this paper, we investigate the possibility of using a similar “top-down and prune” scheme to induce formulae from a quite different class: Decision Committees. A decision committee is a set of rules, each of which is a pair (monomial, vector), where the vector’s components are highly constrained with respect to classical polynomials. Each monomial is a condition that, when matched by an instance, returns its vector. Once all monomials have been tested, the sum of the returned vectors is used to make the decision. Decision Trees, Lists, and Committees are complementary formalisms for the user: trees are based on literal ordering, lists are based on monomial ordering, and committees remove any ordering over the tests. Our contribution is a new algorithm, WIDC, which learns with the same “top-down and prune” scheme but builds Decision Committees. Experimental results on twenty-two domains tend to show that WIDC is able to produce small, accurate, and interpretable decision committees.
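
To make the formalism concrete, here is a minimal Python sketch of the classification step described in the abstract. It is not the authors’ WIDC implementation: the data representation, the fall-back default class, and the argmax decision rule are assumptions made purely for illustration.

```python
from typing import Dict, List, Tuple

# A decision committee is a set of rules, each a (monomial, vector) pair.
# Every monomial satisfied by the instance contributes its vector, and the
# componentwise sum of the contributed vectors decides the predicted class.
# (Illustrative sketch only -- not the WIDC induction algorithm itself.)

Instance = Dict[str, bool]           # truth assignment over Boolean attributes
Monomial = List[Tuple[str, bool]]    # conjunction of literals, e.g. [("x1", True), ("x3", False)]
Rule = Tuple[Monomial, List[float]]  # (monomial, class vector)

def matches(monomial: Monomial, instance: Instance) -> bool:
    """True iff the instance satisfies every literal of the monomial."""
    return all(instance.get(attr) == sign for attr, sign in monomial)

def classify(committee: List[Rule], instance: Instance, default_class: int = 0) -> int:
    """Sum the vectors of all matching rules and predict the argmax component."""
    n_classes = len(committee[0][1]) if committee else 1
    totals = [0.0] * n_classes
    for monomial, vector in committee:
        if matches(monomial, instance):
            totals = [t + v for t, v in zip(totals, vector)]
    if all(t == 0.0 for t in totals):  # no rule fired: fall back to a default
        return default_class
    return max(range(n_classes), key=lambda c: totals[c])

# Toy two-class committee with three rules.
committee: List[Rule] = [
    ([("x1", True)],                [ 1.0, -1.0]),
    ([("x2", True), ("x3", False)], [-1.0,  1.0]),
    ([("x3", True)],                [ 0.5,  0.5]),
]
print(classify(committee, {"x1": True, "x2": False, "x3": True}))  # -> 0
```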

References

  1. L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone. Classification and Regression Trees. Wadsworth, 1984.

  2. C. Blake, E. Keogh, and C. J. Merz. UCI repository of machine learning databases. 1998. http://www.ics.uci.edu/~mlearn/MLRepository.html.

  3. W. Buntine and T. Niblett. A further comparison of splitting rules for Decision Tree induction. Machine Learning, pages 75–85, 1992.

  4. P. Clark and T. Niblett. The CN2 induction algorithm. Machine Learning, 3:261–283, 1989.

  5. P. Domingos. A Process-oriented Heuristic for Model selection. In Proc. of the 15th ICML, pages 127–135, 1998.

  6. E. Frank and I. Witten. Using a Permutation Test for Attribute selection in Decision Trees. In Proc. of the 15th ICML, pages 152–160, 1998.

  7. R. C. Holte. Very simple Classification rules perform well on most commonly used datasets. Machine Learning, pages 63–91, 1993.

  8. L. Hyafil and R. Rivest. Constructing optimal binary decision trees is NP-complete. Inform. Process. Letters, pages 15–17, 1976.

  9. G. H. John, R. Kohavi, and K. Pfleger. Irrelevant features and the subset selection problem. In Proc. of the 11th ICML, pages 121–129, 1994.

  10. M. J. Kearns and Y. Mansour. A Fast, Bottom-up Decision Tree Pruning algorithm with Near-Optimal generalization. In Proc. of the 15th ICML, 1998.

  11. R. Kohavi and D. Sommerfield. Targeting Business users with Decision Table Classifiers. In Proc. of the 4th Intl. Conf. on KDD, 1998.

  12. T. M. Mitchell. Machine Learning. McGraw-Hill, 1997.

  13. R. S. Michalski, I. Mozetic, J. Hong, and N. Lavrac. The AQ15 inductive learning system: An overview and experiments. In Proc. of AAAI’86, pages 1041–1045, 1986.

  14. R. Nock and O. Gascuel. On learning decision committees. In Proc. of the 12th ICML, pages 413–420, 1995.

  15. R. Nock and P. Jappy. On the power of decision lists. In Proc. of the 15th ICML, pages 413–420, 1998.

  16. R. Nock and P. Jappy. Decision Tree based induction of Decision Lists. International Journal of Intelligent Data Analysis (accepted), 1999.

  17. J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, 1994.

  18. J. R. Quinlan. Bagging, Boosting and C4.5. In Proc. of AAAI-96, pages 725–730, 1996.

  19. R. L. Rivest. Learning decision lists. Machine Learning, pages 229–246, 1987.

  20. R. E. Schapire. The strength of weak learnability. Machine Learning, pages 197–227, 1990.

  21. R. E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. In Proceedings of COLT’98, pages 80–91, 1998.

Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nock, R., Jappy, P. (1999). A “Top-Down and Prune” Induction Scheme for Constrained Decision Committees. In: Hand, D.J., Kok, J.N., Berthold, M.R. (eds) Advances in Intelligent Data Analysis. IDA 1999. Lecture Notes in Computer Science, vol 1642. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48412-4_3

  • DOI: https://doi.org/10.1007/3-540-48412-4_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-66332-4

  • Online ISBN: 978-3-540-48412-7
