
FaSS: Ensembles for Stable Learners

Conference paper in Multiple Classifier Systems (MCS 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5519)


Abstract

This paper introduces a new ensemble approach, Feature-Space Subdivision (FaSS), which builds local models rather than global models. FaSS is a generic ensemble approach that can use either stable or unstable models as its base models; in contrast, existing ensemble approaches that employ randomisation can only use unstable models. Our analysis shows that the time required to generate each model in the ensemble decreases as the level of localisation in FaSS increases. Our empirical evaluation shows that FaSS performs significantly better than boosting in terms of predictive accuracy when SVM, a stable learner, is used as the base learner. The speedup achieved by FaSS makes SVM ensembles feasible for large data sets on which they would otherwise be impractical, and FaSS with SVM performs better than boosting J48 and Random Forests when SVM is the preferred base learner.
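
The page describes FaSS only at the level of the abstract, but the core idea, subdividing the feature space and training a local model per region, can be sketched. Below is a minimal, illustrative Python sketch of a FaSS-like ensemble, not the authors' implementation: each member randomly selects h features (the level of localisation), partitions the feature space by discretising those features, and trains one local SVM per occupied region; at prediction time each member routes a test instance to its region's model and the members vote. The class name FaSSLikeEnsemble, the equal-width binning via scikit-learn's KBinsDiscretizer, the SVC base learner, and the majority-class fallback for regions not seen in training are all assumptions made for illustration.

```python
import numpy as np
from collections import Counter
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.svm import SVC

class FaSSLikeEnsemble:
    """Toy feature-space-subdivision ensemble (illustrative sketch only)."""

    def __init__(self, n_members=10, h=2, n_bins=3, random_state=0):
        self.n_members = n_members  # number of subdivisions in the ensemble
        self.h = h                  # features per subdivision (level of localisation)
        self.n_bins = n_bins        # equal-width bins per chosen feature
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.members_ = []
        for _ in range(self.n_members):
            # Each member localises on a different random subset of h features.
            feats = self.rng.choice(X.shape[1], size=self.h, replace=False)
            disc = KBinsDiscretizer(n_bins=self.n_bins, encode="ordinal",
                                    strategy="uniform").fit(X[:, feats])
            keys = disc.transform(X[:, feats])  # region id of each training instance
            local = {}
            for key in {tuple(k) for k in keys}:
                mask = np.all(keys == np.array(key), axis=1)
                classes = np.unique(y[mask])
                # An SVM needs two classes; single-class regions become constants.
                local[key] = (SVC().fit(X[mask], y[mask])
                              if len(classes) > 1 else classes[0])
            fallback = Counter(y).most_common(1)[0][0]  # for unseen regions
            self.members_.append((feats, disc, local, fallback))
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        member_votes = []
        for feats, disc, local, fallback in self.members_:
            keys = disc.transform(X[:, feats])
            votes = []
            for i, key in enumerate(map(tuple, keys)):
                model = local.get(key, fallback)
                votes.append(model.predict(X[i:i + 1])[0]
                             if hasattr(model, "predict") else model)
            member_votes.append(votes)
        # Majority vote over all members, per test instance.
        return np.array([Counter(col).most_common(1)[0][0]
                         for col in zip(*member_votes)])
```

The sketch also shows where the claimed speedup would come from: each local SVM is trained on only the instances falling in its region, and SVM training time grows super-linearly with the number of training instances, so the cost of building each model drops as the level of localisation increases.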





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ting, K.M., Wells, J.R., Tan, S.C., Teng, S.W., Webb, G.I. (2009). FaSS: Ensembles for Stable Learners. In: Benediktsson, J.A., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2009. Lecture Notes in Computer Science, vol 5519. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02326-2_37


  • DOI: https://doi.org/10.1007/978-3-642-02326-2_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02325-5

  • Online ISBN: 978-3-642-02326-2

  • eBook Packages: Computer Science (R0)
