
An adaptive class pairwise dimensionality reduction algorithm

Original Article · Neural Computing and Applications

Abstract

Support vector machines (SVMs) have achieved great success in multi-class classification. However, as the dimension increases, irrelevant or redundant features may degrade the generalization performance of SVM classifiers, which makes dimensionality reduction (DR) indispensable for high-dimensional data. At present, most DR algorithms either reduce all data points of a multi-class dataset to the same dimension or search for the local latent dimension of each class, neglecting the fact that different class pairs also have different local latent dimensions. In this paper, we propose an adaptive class pairwise dimensionality reduction algorithm (ACPDR) to improve the generalization performance of multi-class SVM classifiers. In the proposed algorithm, on the one hand, different class pairs are reduced to different dimensions; on the other hand, a tabu strategy is adopted to adaptively select a suitable embedding dimension for each pair. Five popular DR algorithms are employed in our experiments, and numerical results on benchmark multi-class datasets show that, compared with traditional DR algorithms, ACPDR improves the generalization performance of multi-class SVM classifiers; the results also verify that it is reasonable to assume that different class pairs have different local dimensions.
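To make the scheme concrete, here is a minimal sketch of the class-pairwise idea under stated assumptions: PCA stands in for the five DR algorithms used in the paper, the tabu rule is a generic short-term-memory search over candidate dimensions (the abstract does not specify the authors' exact scheme), cross-validated binary SVM accuracy serves as the selection criterion, and standard one-vs-one max-wins voting combines the pairwise classifiers. This is an illustrative reading, not the authors' implementation.

```python
# Illustrative sketch only: PCA and the simple tabu search below are
# assumptions standing in for the paper's DR algorithms and tabu scheme.
from itertools import combinations

import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def tabu_select_dim(Xp, yp, candidates, n_iter=5, tabu_len=3):
    """Choose an embedding dimension for one class pair by a small tabu
    search over the sorted candidates: always move to the best non-tabu
    neighbour (even a worse one, to escape local optima) and return the
    best dimension seen overall."""
    def cv_score(d):
        Z = PCA(n_components=d).fit_transform(Xp)
        return cross_val_score(SVC(), Z, yp, cv=3).mean()

    dims = sorted(d for d in candidates if d < min(Xp.shape))
    assert dims, "no admissible candidate dimension"
    i = len(dims) // 2                   # start from the middle candidate
    tabu = [i]                           # short-term memory of visited indices
    best_i, best_s = i, cv_score(dims[i])
    for _ in range(n_iter):
        neigh = [j for j in (i - 1, i + 1)
                 if 0 <= j < len(dims) and j not in tabu]
        if not neigh:
            break
        scores = {j: cv_score(dims[j]) for j in neigh}
        i = max(scores, key=scores.get)  # best admissible move
        tabu = (tabu + [i])[-tabu_len:]
        if scores[i] > best_s:
            best_i, best_s = i, scores[i]
    return dims[best_i]


def fit_pairwise(X, y, candidates=(5, 10, 20, 30)):
    """Fit one (DR map, binary SVM) per class pair, each pair reduced to
    its own adaptively chosen dimension."""
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        Xp, yp = X[mask], y[mask]
        d = tabu_select_dim(Xp, yp, candidates)
        dr = PCA(n_components=d).fit(Xp)
        models[(a, b)] = (dr, SVC().fit(dr.transform(Xp), yp))
    return models


def predict_pairwise(models, X):
    """Combine the pairwise classifiers by max-wins (majority) voting.
    Assumes non-negative integer class labels."""
    votes = np.stack([clf.predict(dr.transform(X))
                      for dr, clf in models.values()])
    return np.array([np.bincount(col).argmax() for col in votes.T])


if __name__ == "__main__":
    from sklearn.datasets import load_digits

    X, y = load_digits(return_X_y=True)
    models = fit_pairwise(X, y)
    print("training accuracy:", (predict_pairwise(models, X) == y).mean())
```

This trains one binary SVM per class pair, each in its own reduced space; swapping PCA for any other projection-style DR method (e.g. LDA, LPP, NPE) would only change the two PCA lines.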





Acknowledgments

The work presented in this paper was supported by the National Science Foundation of China (61070033), the Guangdong Natural Science Foundation (9251009001000005) and the Open Project of the Key Laboratory of Symbolic Computation and Knowledge Engineering of the Chinese Ministry of Education (93K-17-2009-K04).

Author information

Correspondence to Lifang He.


Cite this article

He, L., Yang, X. & Hao, Z. An adaptive class pairwise dimensionality reduction algorithm. Neural Comput & Applic 23, 299–310 (2013). https://doi.org/10.1007/s00521-012-0897-2
