
A Dempster-Shafer theoretic framework for boosting based ensemble design

Theoretical Advances · Pattern Analysis and Applications

Abstract

Ensemble design techniques based on training set resampling are successfully used to reduce the classification errors of base classifiers. Boosting is one such technique, where each training set is obtained by drawing samples with replacement from the available training set according to a weighted distribution that is modified for each new classifier to be included in the ensemble. The weighted resampling results in a set of classifiers, each accurate in a different part of the input space, mainly specified by the sample weights. In this study, a dynamic integration of boosting-based ensembles is proposed to take the heterogeneity of the input sets into account. For this purpose, an evidence-theoretic framework is developed that takes the weights and distances of the neighboring training samples into account during both training and testing of boosting-based ensembles. The proposed technique is compared with the AdaBoost algorithm using three different base classifiers.
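The two mechanisms the abstract outlines, weighted resampling during training and an evidence-theoretic combination at test time, can be made concrete with a short sketch. The code below is a minimal illustration under stated assumptions, not the paper's exact formulation: the mass assignment (each member classifier places its distance-discounted accuracy on the k nearest training neighbours of the test point onto its predicted class, and the remainder onto total ignorance Θ), the Gaussian discounting parameter `gamma`, and the helper names `boost_resample`, `dempster_combine`, and `classify` are all illustrative choices. Scikit-learn-style classifiers with a `predict` method and integer class labels are assumed.

```python
import numpy as np

def boost_resample(X, y, fit_base, n_rounds=10):
    """AdaBoost.M1-style training with weighted resampling: each round
    draws a bootstrap sample according to the current weight distribution."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        idx = np.random.choice(n, size=n, replace=True, p=w)  # weighted resampling
        clf = fit_base(X[idx], y[idx])
        wrong = clf.predict(X) != y
        err = w[wrong].sum()
        if err == 0 or err >= 0.5:            # AdaBoost.M1 stopping condition
            break
        beta = err / (1.0 - err)
        w[~wrong] *= beta                     # shrink weights of correct samples
        w /= w.sum()
        # No log(1/beta) voting weights stored: the evidential combination
        # below replaces weighted majority voting at test time.
        ensemble.append(clf)
    return ensemble, w

def dempster_combine(m_a, m_b):
    """Dempster's rule over a frame of K singleton classes plus total
    ignorance Theta. A mass function is (masses_per_class, mass_on_Theta)."""
    bel_a, th_a = m_a
    bel_b, th_b = m_b
    bel = bel_a * bel_b + bel_a * th_b + bel_b * th_a
    theta = th_a * th_b
    norm = bel.sum() + theta                  # equals 1 minus the conflict mass
    return bel / norm, theta / norm

def classify(ensemble, x, X_train, y_train, n_classes, k=5, gamma=1.0):
    """Dynamic integration sketch: each member contributes a mass function
    built from its accuracy on the k nearest training neighbours of x,
    discounted by distance, and the masses are fused by Dempster's rule."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    alpha = np.exp(-gamma * d[nn] ** 2)       # closer neighbours count more
    bel, theta = np.zeros(n_classes), 1.0     # start from total ignorance
    for clf in ensemble:
        correct = clf.predict(X_train[nn]) == y_train[nn]
        acc = (alpha * correct).sum() / (alpha.sum() + 1e-12)
        m = np.zeros(n_classes)
        m[int(clf.predict(x[None, :])[0])] = acc   # support for the member's vote
        bel, theta = dempster_combine((bel, theta), (m, 1.0 - acc))
    return int(np.argmax(bel))

# Usage sketch (hypothetical), with decision stumps as base classifiers:
# from sklearn.tree import DecisionTreeClassifier
# ens, _ = boost_resample(X, y, lambda X, y: DecisionTreeClassifier(max_depth=1).fit(X, y))
# label = classify(ens, x_test, X, y, n_classes=2)
```

Restricting focal elements to the singleton classes plus Θ keeps Dempster's rule linear in the number of classes and makes the normalisation (division by one minus the conflict mass) explicit.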





Author information

Corresponding author

Correspondence to Hakan Altınçay.


About this article

Cite this article

Altınçay, H. A Dempster-Shafer theoretic framework for boosting based ensemble design. Pattern Anal Applic 8, 287–302 (2005). https://doi.org/10.1007/s10044-005-0010-x

