
RRS + LS-SVM: a new strategy for “a priori” sample selection

  • Original Article, Neural Computing and Applications

Abstract

In this work we present a new sparse hybrid classifier that combines the reduced remaining subset (RRS) with the least squares support vector machine (LS-SVM). RRS is a sample selection technique based on a modified nearest neighbor rule; it is used to choose the samples that best represent each class of a given database. The LS-SVM then uses the samples selected by RRS as support vectors to find the decision surface between the classes by solving a system of linear equations. The hybrid classifier is considered sparse because it is able to detect support vectors, which is not possible when the LS-SVM is used on its own. Experiments are presented that compare the proposed approach with two existing methods that also aim to impose sparseness on LS-SVMs, LS2-SVM and Ada-Pinv.
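To make the pipeline concrete, the sketch below (Python with NumPy, written for this summary rather than taken from the paper) trains a standard LS-SVM on a subset of samples chosen a priori. The selection heuristic `select_border_samples` is only a hypothetical stand-in for RRS: it keeps, for each class, the k samples closest to the opposite class. The routine `lssvm_train` solves the usual LS-SVM linear system of Suykens and Vandewalle restricted to the selected samples; `gamma`, `sigma`, and `k` are illustrative values.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def select_border_samples(X, y, k=20):
    # Hypothetical stand-in for RRS (NOT the authors' rule): for each class,
    # keep the k samples that lie closest to the opposite class.
    keep = []
    for c in (-1, 1):
        idx_c = np.flatnonzero(y == c)
        Xc, Xo = X[y == c], X[y != c]
        d = np.sqrt(((Xc[:, None, :] - Xo[None, :, :]) ** 2).sum(axis=-1)).min(axis=1)
        keep.extend(idx_c[np.argsort(d)[:k]])
    return np.array(keep)

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM dual system (Suykens and Vandewalle, 1999):
    #   [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    # where Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b and multipliers alpha

def lssvm_predict(Xnew, Xsv, ysv, alpha, b, sigma=1.0):
    # Decision function f(x) = sum_i alpha_i * y_i * K(x, x_i) + b
    return np.sign(rbf_kernel(Xnew, Xsv, sigma) @ (alpha * ysv) + b)

if __name__ == "__main__":
    # Toy two-class problem: Gaussian clouds around (-1,-1) and (+1,+1)
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
    y = np.concatenate([-np.ones(100), np.ones(100)])
    idx = select_border_samples(X, y, k=20)   # "a priori" sample selection
    b, alpha = lssvm_train(X[idx], y[idx])    # LS-SVM trained on the reduced set only
    acc = np.mean(lssvm_predict(X, X[idx], y[idx], alpha, b) == y)
    print(f"{len(idx)} support vectors, training accuracy {acc:.2f}")
```

Because training is restricted to the selected subset, only those samples appear in the decision function, which is what makes the resulting classifier sparse.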




Acknowledgments

The authors would like to thank the governmental agencies CAPES and FAPEMIG for their support.

Author information

Corresponding author

Correspondence to Bernardo Penna Resende de Carvalho.


About this article

Cite this article

Penna Resende de Carvalho, B., Soares Lacerda, W. & de Pádua Braga, A. RRS + LS-SVM: a new strategy for “a priori” sample selection. Neural Comput & Applic 16, 227–234 (2007). https://doi.org/10.1007/s00521-007-0085-y
