
Combining Different Methods and Numbers of Weak Decision Trees


Abstract

Several ways of manipulating a training set have shown that combining weakened classifiers can improve prediction accuracy. In the present paper, we focus on learning-set sampling (Breiman's Bagging) and random feature-subset selection (Ho's Random Subspaces). We present a combination scheme labelled 'Bagfs', in which new learning sets are generated on the basis of both bootstrap replicates and random subspaces. The performance of the three methods (Bagging, Random Subspaces and Bagfs) is compared to that of the standard AdaBoost algorithm. All four methods are assessed by means of a decision-tree inducer (C4.5). In addition, we study whether the number of classifiers and the way in which they are created have a significant influence on the performance of their combination. To answer these two questions, we applied the McNemar test of significance and the Kappa degree-of-agreement measure. The results, obtained on 23 conventional databases, show that on average, Bagfs exhibits the best agreement between prediction and supervision.
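To make the Bagfs scheme concrete, the sketch below trains each tree on a bootstrap replicate of the examples restricted to a randomly drawn feature subset, then combines the trees by plurality vote. This is a minimal sketch, not the paper's implementation: C4.5 is not available in scikit-learn, so an entropy-based DecisionTreeClassifier stands in for it, and the ensemble size, subspace fraction, dataset and voting rule are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def bagfs_fit(X, y, n_trees=25, subspace_frac=0.5):
    """Train one tree per (bootstrap replicate, random feature subset) pair."""
    n, d = X.shape
    k = max(1, int(subspace_frac * d))
    ensemble = []
    for _ in range(n_trees):
        rows = rng.integers(0, n, size=n)             # Bagging: bootstrap replicate
        feats = rng.choice(d, size=k, replace=False)  # Random Subspaces: feature subset
        tree = DecisionTreeClassifier(criterion="entropy")  # stand-in for C4.5
        tree.fit(X[np.ix_(rows, feats)], y[rows])
        ensemble.append((tree, feats))
    return ensemble

def bagfs_predict(ensemble, X):
    """Combine the trees' predictions by plurality vote."""
    votes = np.stack([tree.predict(X[:, feats]) for tree, feats in ensemble])
    votes = votes.astype(int)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ensemble = bagfs_fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, bagfs_predict(ensemble, X_test)))
```

For the evaluation side, off-the-shelf versions of the two measures used in the paper exist: scikit-learn provides cohen_kappa_score and statsmodels provides a McNemar test (statsmodels.stats.contingency_tables.mcnemar), though the paper's exact testing protocol may differ.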



Additional information

Received: 17 November 2000, Received in revised form: 30 October 2001, Accepted: 13 December 2001



Cite this article

Latinne, P., Debeir, O. & Decaestecker, C. Combining Different Methods and Numbers of Weak Decision Trees. Pattern Anal Appl 5, 201–209 (2002). https://doi.org/10.1007/s100440200018

