
Feature Selection Using Sparse Twin Bounded Support Vector Machine

  • Conference paper
Neural Information Processing (ICONIP 2020)

Abstract

Although the twin bounded support vector machine (TBSVM) has a lower time complexity than the support vector machine (SVM), TBSVM has a poor ability to select features. To overcome this shortcoming, we propose a sparse twin bounded support vector machine (STBSVM) inspired by the sparsity of the \(\ell _1\)-norm. The objective function of STBSVM contains the hinge loss and \(\ell _1\)-norm terms, both of which induce sparsity. We find solutions in the primal space instead of the dual space and avoid matrix inversion. Together, these choices ensure the sparsity of STBSVM, that is, its ability to select features. Experiments on synthetic and UCI datasets show that STBSVM selects features effectively while simultaneously improving classification performance.
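To make the sparsity mechanism concrete, the sketch below is a generic \(\ell _1\)-regularized hinge-loss linear classifier trained in the primal by subgradient descent, not the authors' twin-hyperplane STBSVM formulation. It illustrates how the \(\ell _1\) term drives the weights of uninformative features toward zero, so the surviving nonzero weights mark the selected features. The function name, hyperparameters, and toy data are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch (assumed names and hyperparameters): hinge loss + l1 penalty
# minimized in the primal, so small/uninformative weights are pushed toward zero.
import numpy as np

def sparse_hinge_fit(X, y, lam=0.1, lr=0.01, n_iter=500, seed=0):
    """Subgradient descent on  mean(hinge(y * (X @ w + b))) + lam * ||w||_1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)
    b = 0.0
    for _ in range(n_iter):
        margin = y * (X @ w + b)
        active = margin < 1                          # margin-violating samples
        # subgradient of the averaged hinge loss
        grad_w = -(X[active] * y[active, None]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
        # subgradient of the l1 penalty pulls every weight toward zero
        grad_w += lam * np.sign(w)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage: only the first two of ten features carry class information,
# so their weights should dominate while most others shrink toward zero.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
w, b = sparse_hinge_fit(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-2)          # retained feature indices
print("selected features:", selected)
```

In STBSVM the same hinge-plus-\(\ell _1\) combination appears in each of the two class-specific objectives, and the paper solves those problems in the primal rather than the dual.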

This work was supported in part by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grant No. 19KJA550002, by the Six Talent Peak Project of Jiangsu Province of China under Grant No. XYDXX-054, by the Priority Academic Program Development of Jiangsu Higher Education Institutions, and by the Collaborative Innovation Center of Novel Software Technology and Industrialization.



Author information

Corresponding author

Correspondence to Li Zhang.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Zheng, X., Zhang, L., Yan, L. (2020). Feature Selection Using Sparse Twin Bounded Support Vector Machine. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds.) Neural Information Processing. ICONIP 2020. Lecture Notes in Computer Science, vol. 12533. Springer, Cham. https://doi.org/10.1007/978-3-030-63833-7_30


  • DOI: https://doi.org/10.1007/978-3-030-63833-7_30


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-63832-0

  • Online ISBN: 978-3-030-63833-7

  • eBook Packages: Computer Science, Computer Science (R0)
