
LGND: a new method for multi-class novelty detection

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Multi-class novelty detection is a crucial yet challenging task for recognition systems. Existing methods either concatenate the known classes into a single large artificial super-class, combine several independent one-class classifiers, one per known class, or rely on the outputs of multi-class classifiers. However, these methods either ignore the correlation within each class or cannot be formulated elegantly as a joint model. To overcome these limitations, we propose a new local and global novelty detection model (LGND). Unlike previous approaches, LGND incorporates local correlation and global regularization in a unified framework. The resulting optimization problem reduces to a convex quadratic program with a guaranteed globally optimal solution. We also provide a comprehensive discussion of the relationship between locality and globality, the parameters of LGND, and the connections to multi-class classification. LGND thus opens up a new way to approach multi-class novelty detection from both local and global perspectives. Experimental results on Corel5k, Caltech-256, and five UCI data sets confirm that LGND outperforms, or is at least comparable to, state-of-the-art methods.
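
For readers unfamiliar with the baseline families mentioned above, the following is a minimal sketch of the second one (independent one-class classifiers, one per known class), using scikit-learn's OneClassSVM on synthetic data. It illustrates that baseline, not LGND itself; the data, decision rule, and hyper-parameters are assumptions chosen only for illustration.

    # A minimal sketch (NOT the LGND method) of the per-class baseline described
    # above: fit one independent one-class classifier per known class and flag a
    # sample as novel when every class model rejects it. The toy data and the
    # OneClassSVM hyper-parameters are illustrative assumptions.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)

    # Two known classes in 2-D (synthetic, for illustration only).
    X_train = {
        0: rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2)),
        1: rng.normal(loc=[3.0, 3.0], scale=0.5, size=(100, 2)),
    }

    # One independent one-class model per known class.
    models = {c: OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X)
              for c, X in X_train.items()}

    def predict_with_novelty(x):
        """Return the best-matching known class, or -1 if every model rejects x."""
        x = np.asarray(x).reshape(1, -1)
        scores = {c: m.decision_function(x)[0] for c, m in models.items()}
        best_class, best_score = max(scores.items(), key=lambda kv: kv[1])
        return best_class if best_score >= 0 else -1  # -1 denotes "novel"

    print(predict_with_novelty([0.1, -0.2]))    # expected: class 0
    print(predict_with_novelty([3.1, 2.8]))     # expected: class 1
    print(predict_with_novelty([10.0, -10.0]))  # expected: -1 (novel)

Because each class model is trained in isolation, this baseline captures no cross-class or local structure, which is exactly the limitation the abstract attributes to such approaches.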


Notes

  1. Available at http://cvxr.com/cvx/ (see the convex-QP sketch after these notes).

  2. Available at http://prlab.tudelft.nl/david-tax/dd_tools.html.

  3. Available at http://lear.inrialpes.fr/people/guillaumin/data.php.

  4. Available at http://www.vision.caltech.edu/Image_Datasets/Caltech256/.

  5. Available at http://archive.ics.uci.edu/ml/.
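
Note 1 refers to the CVX toolbox, the kind of solver used for the convex quadratic program that, as stated in the abstract, the LGND model reduces to. As a hedged illustration of such a QP, the sketch below sets up and solves a small generic convex quadratic program with cvxpy, a Python analogue of CVX; the matrices are random placeholders and do not reproduce the LGND objective.

    # A generic convex-QP sketch with cvxpy (a Python analogue of the CVX toolbox
    # cited in note 1). The objective and constraints are random placeholders for
    # illustration; they are NOT the LGND formulation.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, m = 5, 3

    A = rng.standard_normal((n, n))      # 0.5 * ||A x||^2 is a convex quadratic term
    q = rng.standard_normal(n)
    G = rng.standard_normal((m, n))
    h = np.ones(m)

    x = cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.sum_squares(A @ x) + q @ x)
    constraints = [G @ x <= h, x >= 0]   # affine constraints keep the problem convex
    prob = cp.Problem(objective, constraints)
    prob.solve()

    print("status:", prob.status)
    print("optimal value:", prob.value)
    print("x*:", x.value)

Because the objective is convex and the constraints are affine, any compliant QP solver returns the same globally optimal value, which is the practical meaning of the abstract's guaranteed-global-optimum claim.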


Acknowledgements

This work has been partially supported by grants from the National Natural Science Foundation of China (Nos. 61472390, 71731009, 71331005, and 91546201) and the Beijing Natural Science Foundation (No. 1162005).

Author information


Corresponding author

Correspondence to Yingjie Tian.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest related to this work.


About this article


Cite this article

Tang, J., Tian, Y. & Liu, X. LGND: a new method for multi-class novelty detection. Neural Comput & Applic 31, 3339–3355 (2019). https://doi.org/10.1007/s00521-017-3270-7


  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s00521-017-3270-7

Keywords