Abstract
Boosting increases the recognition accuracy of many types of classifiers. However, studies show that for the Fisher Linear Discriminant (FLD), a simple and widely used classifier, boosting does not yield a significant accuracy gain. In this paper, a new method for adapting the FLD to the boosting framework is proposed. The method, AdaBoost-RandomFeatureSubset-FLD (AB-RFS-FLD), trains each boosting round's FLD on a different, randomly chosen subset of the features. The new method achieves significantly better accuracy than both a single FLD and FLD with standard boosting, with improvements reaching 6% in some cases. We show that this good performance can be attributed to the higher diversity of the individual FLDs as well as to their better generalization ability.
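To make the approach concrete, the following is a minimal sketch of one plausible realization of AB-RFS-FLD: standard AdaBoost in which each round's FLD is fit on a randomly drawn feature subset. The subset size (half the features), the use of weight-proportional resampling to train the FLD, and the helper names ab_rfs_fld_fit and ab_rfs_fld_predict are illustrative assumptions, not the paper's exact specification; labels are assumed to be in {-1, +1}.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def ab_rfs_fld_fit(X, y, n_rounds=50, subset_size=None, seed=None):
        """Sketch of AdaBoost with an FLD trained on a random feature
        subset in each round. Labels y are assumed to be in {-1, +1}."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        k = subset_size or max(1, d // 2)   # assumed subset size: half the features
        w = np.full(n, 1.0 / n)             # AdaBoost sample weights
        ensemble = []                       # (alpha, feature subset, fitted FLD)
        for _ in range(n_rounds):
            feats = rng.choice(d, size=k, replace=False)
            # Resample the training set in proportion to the boosting weights,
            # since a plain FLD fit does not take per-sample weights.
            idx = rng.choice(n, size=n, replace=True, p=w)
            fld = LinearDiscriminantAnalysis().fit(X[idx][:, feats], y[idx])
            pred = fld.predict(X[:, feats])
            err = np.sum(w * (pred != y))   # weighted training error
            if err >= 0.5:                  # discard rounds no better than chance
                continue
            err = max(err, 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)
            w *= np.exp(-alpha * y * pred)  # up-weight misclassified samples
            w /= w.sum()
            ensemble.append((alpha, feats, fld))
        return ensemble

    def ab_rfs_fld_predict(ensemble, X):
        """Weighted-vote prediction over the ensemble, in {-1, +1}."""
        score = sum(a * f.predict(X[:, feats]) for a, feats, f in ensemble)
        return np.sign(score)

The only change from a standard AdaBoost-FLD loop is the per-round draw of feats; restricting each weak learner to a different feature subset is what decorrelates the individual FLDs and produces the diversity the abstract refers to.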
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Arodź, T. (2005). Boosting the Fisher Linear Discriminant with Random Feature Subsets. In: Kurzyński, M., Puchała, E., Woźniak, M., Żołnierek, A. (eds) Computer Recognition Systems. Advances in Soft Computing, vol 30. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-32390-2_7