
Zone analysis: a visualization framework for classification problems


Abstract

There have been many attempts to adapt the bias-variance framework from regression problems to classification problems. Recently, however, it has been shown that only non-straightforward extensions exist for classification problems. In this paper, we present an alternative visualization framework for classification problems called zone analysis. Our zone analysis framework partly extends the bias-variance idea; instead of decomposing an error into two parts, i.e. the biased and unbiased components, our framework decomposes the error into K components (zones). While bias-variance information is still contained in our framework, it provides interesting observations that are not obvious in the previous bias-variance framework, e.g. a prejudiced behavior of the bagging algorithm towards various unbiased instances. Our framework is suitable for visualizing the effect of context changes on learning performance. The type of context change which we primarily investigate in this paper is “a change from a base learner to an ensemble learner such as bagging, AdaBoost, arc-x4 and multi-boosting”.
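To make the zone idea concrete, below is a minimal sketch of one way such a per-instance decomposition could be instrumented. It assumes, purely for illustration, that the zone of a test instance is indexed by how many of K bagged base learners misclassify it; this is our own simplification, not necessarily the paper's formal definition, and all names (K, the scikit-learn helpers, etc.) are illustrative.

```python
# Hedged sketch, not the paper's exact procedure: assign each test instance
# to a "zone" equal to the number of bagged base learners that misclassify it
# (0..K), then count how many instances fall into each zone.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

K = 10  # assumed ensemble size

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.RandomState(0)
votes = np.zeros((K, len(y_te)), dtype=int)
for k in range(K):
    idx = rng.randint(0, len(y_tr), len(y_tr))          # bootstrap resample
    clf = DecisionTreeClassifier(random_state=k).fit(X_tr[idx], y_tr[idx])
    votes[k] = clf.predict(X_te)

errors_per_instance = (votes != y_te).sum(axis=0)        # zone index per instance
zone_counts = np.bincount(errors_per_instance, minlength=K + 1)
for z, n in enumerate(zone_counts):
    print(f"zone {z:2d} ({z} of {K} learners wrong): {n} test instances")
```

Repeating the same bookkeeping for a base learner versus an ensemble learner (bagging, AdaBoost, arc-x4, multi-boosting) shows how instances migrate between zones when the learning context changes, which is the kind of effect the framework is meant to visualize.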



Author information


Corresponding author

Correspondence to Ratthachat Chatpatanasiri.


Cite this article

Chatpatanasiri, R., Pungprasertying, P. & Kijsirikul, B. Zone analysis: a visualization framework for classification problems. Artif Intell Rev 31, 17 (2009). https://doi.org/10.1007/s10462-009-9122-9

