
One-Class Classification Criterion Robust to Anomalies in Training Dataset

  • Conference paper
Pattern Recognition. ICPR International Workshops and Challenges (ICPR 2021)

Abstract

A new version of a one-class classification criterion robust to anomalies in the training dataset is proposed, based on support vector data description (SVDD). The original formulation of the problem is not geometrically consistent: in the optimization problem, the penalty for allowing training objects to fall outside the describing hypersphere is incommensurable with the distance to its center, so the presence of outliers can strongly distort the decision boundary. The proposed criterion is intended to eliminate this inconsistency. An equivalent unconstrained form of the criterion allows a kernel-based approach to be applied without a transition to the dual form, yielding a flexible description of the training dataset. Replacing the non-differentiable objective function with a smooth one makes it possible to solve the problem by an algorithm of sequential optimizations. The Jaccard measure is applied for a quantitative assessment of the robustness of the decision rule to the presence of outliers. A comparative experimental study against existing one-class methods shows the superiority of the proposed criterion in anomaly detection.
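The Jaccard-based robustness assessment mentioned in the abstract can be illustrated with a minimal sketch (this is not the authors' criterion, and the data, quantile, and grid are illustrative assumptions): a naive hypersphere data description is fitted once on clean data and once on the same data with injected anomalies, and the Jaccard index of the two acceptance regions on a grid quantifies how much the outliers shift the decision boundary — the closer to 1, the more robust the description.

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=(200, 2))       # nominal training objects
outliers = rng.uniform(-6.0, 6.0, size=(10, 2))   # injected anomalies
contaminated = np.vstack([clean, outliers])

# Evaluation grid on which the two decision regions are compared
xx, yy = np.meshgrid(np.linspace(-6, 6, 100), np.linspace(-6, 6, 100))
grid = np.c_[xx.ravel(), yy.ravel()]

def hypersphere_mask(train, quantile=0.95):
    """Naive data description: a hypersphere centred at the sample mean
    whose radius covers `quantile` of the training objects."""
    centre = train.mean(axis=0)
    radius = np.quantile(np.linalg.norm(train - centre, axis=1), quantile)
    return np.linalg.norm(grid - centre, axis=1) <= radius

a = hypersphere_mask(clean)          # region learned without anomalies
b = hypersphere_mask(contaminated)   # region learned with anomalies
jaccard = (a & b).sum() / (a | b).sum()
print(f"Jaccard similarity of the two decision regions: {jaccard:.3f}")
```

A value well below 1 here reflects exactly the sensitivity the paper targets: the few injected outliers inflate the quantile radius and shift the centre, so the accepted region changes noticeably.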



Acknowledgements

This work is supported by the Russian Foundation for Basic Research, grants no. 18-07-00942 and no. 20-07-00441.

Author information


Correspondence to Oleg S. Seredin.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Larin, A.O., Seredin, O.S., Kopylov, A.V. (2021). One-Class Classification Criterion Robust to Anomalies in Training Dataset. In: Del Bimbo, A., et al. (eds.) Pattern Recognition. ICPR International Workshops and Challenges. ICPR 2021. Lecture Notes in Computer Science, vol. 12665. Springer, Cham. https://doi.org/10.1007/978-3-030-68821-9_15


  • DOI: https://doi.org/10.1007/978-3-030-68821-9_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-68820-2

  • Online ISBN: 978-3-030-68821-9

  • eBook Packages: Computer Science, Computer Science (R0)
