Abstract
Ensemble design techniques based on training-set resampling are successfully used to reduce the classification errors of base classifiers. Boosting is one such technique, in which each training set is obtained by drawing samples with replacement from the available training set according to a weighted distribution that is modified for each new classifier to be included in the ensemble. The weighted resampling yields a set of classifiers, each accurate in different parts of the input space, mainly determined by the sample weights. In this study, a dynamic integration of boosting-based ensembles is proposed so as to take the heterogeneity of the input sets into account. For this purpose, an evidence-theoretic framework is developed that takes into account the weights and distances of the neighboring training samples during both the training and the testing of boosting-based ensembles. The effectiveness of the proposed technique is compared to the AdaBoost algorithm using three different base classifiers.
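To make the boosting-with-resampling loop described above concrete, the following Python sketch shows a minimal discrete AdaBoost trainer that redraws each training set with replacement according to the current sample weights, together with a simplified distance-weighted neighborhood vote at test time. It is an illustration of the general idea only, not the paper's evidence-theoretic combination rule: the helper names (resample_with_weights, fit_base, dynamic_vote) and the parameters k and gamma are assumptions introduced here for exposition.

import numpy as np

def resample_with_weights(X, y, w, rng):
    # Draw a bootstrap sample of (X, y) with replacement, proportional to the weights w.
    idx = rng.choice(len(y), size=len(y), replace=True, p=w / w.sum())
    return X[idx], y[idx]

def adaboost_train(X, y, fit_base, n_rounds=10, seed=0):
    # Discrete AdaBoost with weighted resampling; labels are assumed to be in {-1, +1}.
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)               # uniform initial sample weights
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        Xs, ys = resample_with_weights(X, y, w, rng)
        h = fit_base(Xs, ys)              # base learner trained on the resampled set
        pred = h(X)
        err = np.sum(w * (pred != y)) / w.sum()
        if err >= 0.5:                    # stop if the weak learner is no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)    # up-weight misclassified samples
        w /= w.sum()
        classifiers.append(h)
        alphas.append(alpha)
    return classifiers, np.array(alphas)

def dynamic_vote(x, classifiers, alphas, X_train, y_train, k=5, gamma=1.0):
    # Simplified dynamic integration: each classifier's weighted vote is further
    # scaled by how well it performs on the k training samples nearest to x,
    # with closer neighbors contributing more support.
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    score = 0.0
    for h, a in zip(classifiers, alphas):
        support = np.mean(np.exp(-gamma * d[nn]) * (h(X_train[nn]) == y_train[nn]))
        score += a * support * h(x[None, :])[0]
    return np.sign(score)

Here fit_base stands for any user-supplied weak learner that returns a prediction function mapping samples to labels in {-1, +1}; the neighborhood-based scaling in dynamic_vote is only a stand-in for the Dempster-Shafer combination developed in the paper.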
Cite this article
Altınçay, H. A Dempster-Shafer theoretic framework for boosting based ensemble design. Pattern Anal Applic 8, 287–302 (2005). https://doi.org/10.1007/s10044-005-0010-x