An Experimental Study on Rotation Forest Ensembles

  • Conference paper
Multiple Classifier Systems (MCS 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4472)

Abstract

Rotation Forest is a recently proposed method for building classifier ensembles from independently trained decision trees. It was found to be more accurate than bagging, AdaBoost and Random Forest ensembles across a collection of benchmark data sets. This paper carries out a lesion study on Rotation Forest to determine which of its parameters and randomization heuristics are responsible for this good performance. Contrary to common intuition, the features extracted through PCA gave better results than those extracted through non-parametric discriminant analysis (NDA) or random projections. The only ensemble method whose accuracy was statistically indistinguishable from that of Rotation Forest was LogitBoost, although it gave slightly inferior results on 20 out of the 32 benchmark data sets. The main factor behind the success of Rotation Forest appears to be that the transformation matrix used to compute the (linear) extracted features is sparse.
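
To make the abstract's last point concrete, here is a minimal sketch of the Rotation Forest construction in Python. It is a simplified reading of Rodríguez et al. [15], not the authors' implementation: the function and parameter names are illustrative, scikit-learn's PCA and DecisionTreeClassifier are assumed available, class labels are assumed integer-coded, and the class-subsampling heuristic of the full method is omitted. For each tree, the features are partitioned into disjoint subsets, PCA is fitted to a random row subsample per subset, and the per-subset loadings are assembled into a block-structured rotation matrix; that block structure is exactly the sparsity the study identifies as the main factor behind the method's success.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

def rotation_forest_fit(X, y, n_trees=10, n_subsets=3, sample_frac=0.75, seed=None):
    # Simplified Rotation Forest training (illustrative names; see [15] for the full method).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    ensemble = []
    for _ in range(n_trees):
        # Randomly partition the d features into roughly equal disjoint subsets.
        subsets = np.array_split(rng.permutation(d), n_subsets)
        R = np.zeros((d, d))  # block-structured, hence sparse, rotation matrix
        for subset in subsets:
            # Fit PCA on a bootstrap row sample, restricted to this feature subset.
            rows = rng.choice(n, size=max(len(subset) + 1, int(sample_frac * n)), replace=True)
            pca = PCA(n_components=len(subset)).fit(X[np.ix_(rows, subset)])
            # The principal axes become the columns of this diagonal block.
            R[np.ix_(subset, subset)] = pca.components_.T
        # Each tree is trained on the full data set rotated by its own matrix.
        ensemble.append((R, DecisionTreeClassifier().fit(X @ R, y)))
    return ensemble

def rotation_forest_predict(ensemble, X):
    # Majority vote over the trees, each applied to its own rotation of X.
    votes = np.stack([tree.predict(X @ R) for R, tree in ensemble]).astype(int)
    return np.array([np.bincount(column).argmax() for column in votes.T])

With an (n, d) NumPy array X and integer labels y, usage would look like ensemble = rotation_forest_fit(X, y, seed=0) followed by y_hat = rotation_forest_predict(ensemble, X).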


References

  1. Bauer, E., Kohavi, R.: An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning 36, 105–142 (1999)

  2. Blake, C.L., Merz, C.J.: UCI repository of machine learning databases (1998), http://www.ics.uci.edu/~mlearn/MLRepository.html

  3. Breiman, L.: Bagging predictors. Machine Learning 24(2), 123–140 (1996)

  4. Breiman, L.: Arcing classifiers. The Annals of Statistics 26(3), 801–849 (1998)

  5. Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)

  6. Bressan, M., Vitrià, J.: Nonparametric discriminant analysis and nearest neighbor classification. Pattern Recognition Letters 24, 2743–2749 (2003)

  7. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research 7, 1–30 (2006)

  8. Fern, X.Z., Brodley, C.E.: Random projection for high dimensional data clustering: A cluster ensemble approach. In: Proc. 20th International Conference on Machine Learning, ICML, Washington, DC, pp. 186–193 (2003)

  9. Foley, F.H., Sammon, J.W.: An optimal set of discriminant vectors. IEEE Transactions on Computers 24(3), 281–289 (1975)

  10. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press, Boston (1990)

  11. Fukunaga, K., Koontz, W.L.G.: Application of the Karhunen-Loève expansion to feature selection and ordering. IEEE Transactions on Computers 19(4), 311–318 (1970)

  12. Fukunaga, K., Mantock, J.: Nonparametric discriminant analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence 5(6), 671–678 (1983)

  13. Kittler, J.V., Young, P.C.: A new approach to feature selection based on the Karhunen-Loève expansion. Pattern Recognition 5(4), 335–352 (1973)

  14. Nadeau, C., Bengio, Y.: Inference for the generalization error. Machine Learning 52, 239–281 (2003)

  15. Rodríguez, J.J., Kuncheva, L.I., Alonso, C.J.: Rotation forest: A new classifier ensemble method. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(10), 1619–1630 (2006)

  16. Skurichina, M., Duin, R.P.W.: Combining feature subsets in feature selection. In: Oza, N.C., et al. (eds.) MCS 2005. LNCS, vol. 3541, pp. 165–175. Springer, Heidelberg (2005)

  17. Tumer, K., Oza, N.C.: Input decimated ensembles. Pattern Analysis and Applications 6, 65–77 (2003)

  18. van der Heijden, F., et al.: Classification, Parameter Estimation and State Estimation. Wiley, Chichester (2004)

  19. Webb, A.: Statistical Pattern Recognition. Arnold, London (1999)

  20. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, 2nd edn. Morgan Kaufmann, San Francisco (2005)

Author information

Authors

Ludmila I. Kuncheva, Juan J. Rodríguez

Editor information

Michal Haindl, Josef Kittler, Fabio Roli

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Kuncheva, L.I., Rodríguez, J.J. (2007). An Experimental Study on Rotation Forest Ensembles. In: Haindl, M., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2007. Lecture Notes in Computer Science, vol 4472. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72523-7_46

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-72523-7_46

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72481-0

  • Online ISBN: 978-3-540-72523-7

  • eBook Packages: Computer Science, Computer Science (R0)
