Abstract
Rotation Forest is a recently proposed method for building classifier ensembles from independently trained decision trees. It was found to be more accurate than bagging, AdaBoost and Random Forest ensembles across a collection of benchmark data sets. This paper carries out a lesion study on Rotation Forest to determine which of its parameters and randomization heuristics are responsible for its good performance. Contrary to common intuition, features extracted through PCA gave better results than features extracted through non-parametric discriminant analysis (NDA) or random projections. The only ensemble method whose accuracy was statistically indistinguishable from that of Rotation Forest was LogitBoost, although it gave slightly inferior results on 20 out of the 32 benchmark data sets. The main factor behind the success of Rotation Forest appears to be that the transformation matrix used to compute the (linear) extracted features is sparse.
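The sparse transformation the abstract refers to arises because Rotation Forest runs PCA on disjoint random subsets of the features and assembles the loadings into a block-diagonal rotation matrix. Below is a minimal, simplified sketch of that construction (it omits the class subsampling of the original method, and the function name and subset count are illustrative, not from the paper):

```python
import numpy as np

def rotation_matrix(X, n_subsets=3, rng=None):
    """Simplified sketch of one Rotation Forest rotation:
    PCA on disjoint random feature subsets, all components kept."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    subsets = np.array_split(rng.permutation(d), n_subsets)
    R = np.zeros((d, d))
    for idx in subsets:
        # PCA is fitted on a bootstrap sample of the rows
        rows = rng.integers(0, n, size=n)
        Xs = X[np.ix_(rows, idx)]
        cov = np.cov(Xs - Xs.mean(axis=0), rowvar=False)
        _, vecs = np.linalg.eigh(cov)   # principal-axis loadings for this subset
        R[np.ix_(idx, idx)] = vecs      # block-diagonal placement -> sparse overall
    return R

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 6))
R = rotation_matrix(X, n_subsets=3, rng=0)
X_rot = X @ R   # the rotated features one tree in the ensemble would be trained on
```

Each tree in the ensemble gets its own random feature split and bootstrap sample, hence its own rotation; sparsity follows directly from the block-diagonal layout (at most the within-block entries are nonzero).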
Copyright information
© 2007 Springer Berlin Heidelberg
Cite this paper
Kuncheva, L.I., Rodríguez, J.J. (2007). An Experimental Study on Rotation Forest Ensembles. In: Haindl, M., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2007. Lecture Notes in Computer Science, vol 4472. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72523-7_46
DOI: https://doi.org/10.1007/978-3-540-72523-7_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72481-0
Online ISBN: 978-3-540-72523-7