Pruned Random Subspace Method for One-Class Classifiers

  • Conference paper
Multiple Classifier Systems (MCS 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6713)

Included in the conference series: Multiple Classifier Systems (MCS)

Abstract

The goal of one-class classification is to distinguish the target class from all other classes using only training data from the target class. Because it is difficult for a single one-class classifier to capture all the characteristics of the target class, combining several one-class classifiers may be required. Previous research has shown that the Random Subspace Method (RSM), in which classifiers are trained on different subsets of the feature space, can be effective for one-class classifiers. In this paper we show that the performance of the RSM can be noisy, and that pruning inaccurate classifiers from the ensemble can be more effective than using all available classifiers. We propose to prune the RSM ensemble of one-class classifiers using either a supervised area under the ROC curve (AUC) criterion or an unsupervised consistency criterion. It appears that with the AUC criterion the performance may increase dramatically, while with the consistency criterion the results do not improve but do become more predictable.
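
To make the pruned-RSM idea in the abstract concrete, the sketch below shows one plausible implementation in Python. It is not the authors' code: it assumes scikit-learn's OneClassSVM as the base one-class classifier and implements only the supervised AUC pruning criterion (the unsupervised consistency criterion would replace the AUC line with a score computed from target data alone). The names pruned_rsm, ensemble_score, n_subspaces, subspace_size, and n_keep are illustrative, not taken from the paper.

    import numpy as np
    from sklearn.svm import OneClassSVM
    from sklearn.metrics import roc_auc_score

    def pruned_rsm(X_target, X_val, y_val,
                   n_subspaces=100, subspace_size=5, n_keep=25, seed=0):
        """Train one-class classifiers on random feature subsets, then keep
        only the n_keep members with the highest validation AUC."""
        rng = np.random.default_rng(seed)
        members = []
        for _ in range(n_subspaces):
            # Draw a random feature subset and train a base one-class
            # classifier on the target-class data restricted to it.
            feats = rng.choice(X_target.shape[1], size=subspace_size, replace=False)
            clf = OneClassSVM(gamma="scale", nu=0.1).fit(X_target[:, feats])
            # Supervised AUC pruning criterion: needs a validation set with
            # both target (label 1) and outlier (label 0) examples.
            auc = roc_auc_score(y_val, clf.decision_function(X_val[:, feats]))
            members.append((auc, feats, clf))
        members.sort(key=lambda m: m[0], reverse=True)  # most accurate first
        return members[:n_keep]

    def ensemble_score(members, X):
        # Mean combining rule: average the decision scores of the retained
        # members; higher scores indicate the target class.
        return np.mean([clf.decision_function(X[:, feats])
                        for _, feats, clf in members], axis=0)

Averaging the retained members' decision scores is the simplest combining rule; the point of the pruning step is that the n_keep most accurate subspace classifiers stand in for the full, noisier ensemble.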

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cheplygina, V., Tax, D.M.J. (2011). Pruned Random Subspace Method for One-Class Classifiers. In: Sansone, C., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2011. Lecture Notes in Computer Science, vol 6713. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21557-5_12

  • DOI: https://doi.org/10.1007/978-3-642-21557-5_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21556-8

  • Online ISBN: 978-3-642-21557-5

  • eBook Packages: Computer Science, Computer Science (R0)
