Abstract
It was previously argued that Decision Tree learning algorithms such as CART or C4.5 can also be useful to build small and accurate Decision Lists. In this paper, we investigate the possibility of using a similar “top-down and prune” scheme to induce formulae from a quite different class: Decision Committees. A decision committee contains rules, each of which is a pair (monomial, vector), where the vector’s components are highly constrained with respect to classical polynomials. Each monomial is a condition that, when matched by an instance, returns its vector. Once every monomial has been tested, the sum of the returned vectors is used to make the decision. Decision Trees, Lists, and Committees are complementary formalisms for the user: trees are based on literal ordering, lists on monomial ordering, and committees remove any ordering over the tests. Our contribution is a new algorithm, WIDC, which learns with the same “top-down and prune” scheme but builds Decision Committees. Experimental results on twenty-two domains suggest that WIDC produces small, accurate, and interpretable decision committees.
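The (monomial, vector) representation and the vote-summing decision rule described above can be sketched as follows. This is only an illustrative sketch: the function names, the dictionary encoding of monomials, and the toy committee are assumptions for exposition, not the paper’s actual WIDC code.

```python
# A rule is a pair (monomial, vector): the monomial is a conjunction of
# attribute tests; the vector holds one vote weight per class.

def matches(monomial, instance):
    """A monomial is encoded here as a dict {attribute: required_value};
    it matches an instance when every listed attribute has that value."""
    return all(instance.get(attr) == val for attr, val in monomial.items())

def classify(committee, instance, n_classes):
    """Sum the vote vectors of every matching rule, then return the class
    with the largest total -- no ordering over the rules is needed."""
    totals = [0.0] * n_classes
    for monomial, vector in committee:
        if matches(monomial, instance):
            totals = [t + v for t, v in zip(totals, vector)]
    return max(range(n_classes), key=lambda c: totals[c])

# Toy two-class committee with two rules (hypothetical data).
committee = [
    ({"outlook": "sunny"},               [1.0, 0.0]),
    ({"humidity": "high", "wind": "no"}, [0.0, 2.0]),
]
print(classify(committee, {"outlook": "sunny", "humidity": "high", "wind": "no"}, 2))  # -> 1
```

In the example, both rules match, so the totals are [1.0, 2.0] and class 1 wins; note that, unlike a decision list, swapping the two rules changes nothing.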
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Nock, R., Jappy, P. (1999). A “Top-Down and Prune” Induction Scheme for Constrained Decision Committees. In: Hand, D.J., Kok, J.N., Berthold, M.R. (eds) Advances in Intelligent Data Analysis. IDA 1999. Lecture Notes in Computer Science, vol 1642. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48412-4_3
Print ISBN: 978-3-540-66332-4
Online ISBN: 978-3-540-48412-7