Selected Random Subspace Novelty Detection Filter

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 8226))

Abstract

In this paper we propose a solution to the problem of novelty detection. Given a set of training examples believed to come from the same class, the aim is to learn a model able to distinguish future examples that do not belong to that class. The proposed approach, called the Selected Random Subspace Novelty Detection Filter (SRS-NDF), is based on the bootstrap technique, the ensemble idea, and the model-selection principle. The SRS-NDF method is compared to existing novelty detection methods on publicly available datasets. The results show that for most datasets, this approach significantly improves performance over current techniques used for novelty detection.
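The abstract's three ingredients (bootstrap resampling, random feature subspaces, and an ensemble vote) can be illustrated with a minimal sketch. This is not the authors' SRS-NDF algorithm: the member detector (a centroid with a 95th-percentile radius), the parameter names, and the voting rule are all illustrative assumptions, and the model-selection step that prunes ensemble members is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ensemble(X, n_members=25, subspace_frac=0.5):
    """Fit a bagged ensemble of simple one-class detectors.

    Each member sees a bootstrap sample of the training set restricted
    to a random feature subspace; it stores the subspace indices, the
    centroid, and a radius (the 95th-percentile distance to the centroid).
    """
    n, d = X.shape
    k = max(1, int(subspace_frac * d))
    members = []
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)            # bootstrap resample
        cols = rng.choice(d, size=k, replace=False)  # random subspace
        Xs = X[rows][:, cols]
        centroid = Xs.mean(axis=0)
        dists = np.linalg.norm(Xs - centroid, axis=1)
        radius = np.quantile(dists, 0.95)
        members.append((cols, centroid, radius))
    return members

def novelty_score(x, members):
    """Fraction of ensemble members that place x outside their radius."""
    votes = [np.linalg.norm(x[cols] - c) > r for cols, c, r in members]
    return float(np.mean(votes))

# Toy usage: train on points near the origin, score a near and a far point.
X_train = rng.normal(0.0, 1.0, size=(200, 10))
ens = fit_ensemble(X_train)
print(novelty_score(np.zeros(10), ens))      # low: resembles training data
print(novelty_score(np.full(10, 8.0), ens))  # high: novel point
```

Thresholding the score (e.g. flagging points where more than half the members vote "outside") turns this into a one-class classifier of the kind the paper evaluates.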




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hamdi, F. (2013). Selected Random Subspace Novelty Detection Filter. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8226. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42054-2_43

  • DOI: https://doi.org/10.1007/978-3-642-42054-2_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-42053-5

  • Online ISBN: 978-3-642-42054-2
