
Multi-class Feature Selection Based on Softmax with \(L_{2,0}\)-Norm Regularization

  • Conference paper
Cognitive Systems and Information Processing (ICCSIP 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1515)


Abstract

In recent years, sparsity-based feature selection (FS) methods have been extensively investigated due to their high performance. These methods solve the FS problem mainly by introducing sparsity regularization terms. However, most existing FS algorithms combine sparsity regularization with a simple linear loss function, which may limit performance. To this end, we propose a novel and robust feature selection method that combines structured sparsity regularization, i.e., \(\ell _{2,0}\)-norm regularization, with the Softmax model to find a stable row-sparse solution: features are selected in groups according to the rows of the projection matrix, and classification performance is improved by the Softmax model. Extensive experiments on six different datasets indicate that our method obtains better or comparable classification performance using fewer features than other state-of-the-art sparsity-based FS methods.
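As an illustration only (not the authors' algorithm, whose optimization procedure is given in the paper itself), the idea of row-sparse selection under a Softmax loss can be sketched with a simple projected-gradient stand-in: after each gradient step on the softmax loss, the weight matrix is projected onto the \(\ell _{2,0}\) constraint set by keeping only the \(k\) rows with the largest \(\ell _2\) norms. The function name, learning rate, and synthetic data below are all hypothetical choices for the sketch.

```python
import numpy as np

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)   # shift for numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def fit_softmax_l20(X, y, k, n_iter=300, lr=0.5):
    """Toy projected gradient descent on the softmax loss: after each
    step, keep only the k rows of W with the largest l2 norms (the
    Euclidean projection onto the l2,0 constraint set), so features
    are selected in groups via the nonzero rows of W."""
    n, d = X.shape
    c = int(y.max()) + 1
    Y = np.eye(c)[y]                        # one-hot labels, shape (n, c)
    W = np.zeros((d, c))
    for _ in range(n_iter):
        P = softmax(X @ W)                  # predicted class probabilities
        W -= lr * (X.T @ (P - Y)) / n       # gradient step on softmax loss
        norms = np.linalg.norm(W, axis=1)   # per-feature (row) l2 norms
        W[np.argsort(norms)[:-k]] = 0.0     # zero all but the top-k rows
    selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 0)
    return W, selected

# Toy 3-class problem where only features 0 and 1 carry label information.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 6))
y = (X[:, 0] > 0).astype(int) + (X[:, 1] > 0).astype(int)
W, selected = fit_softmax_l20(X, y, k=2)
print(selected)   # indices of the two retained (informative) features
```

On this synthetic data the two informative features dominate the row norms from the first gradient step, so the projection discards the noise features in groups, mirroring the row-sparse selection described in the abstract.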

This work was supported by the National Natural Science Foundation of China (NSFC) under grant #61873067.



Author information

Corresponding author

Correspondence to Yuanlong Yu.


Copyright information

© 2022 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zeng, S., Yu, Y., Sun, Z. (2022). Multi-class Feature Selection Based on Softmax with \(L_{2,0}\)-Norm Regularization. In: Sun, F., Hu, D., Wermter, S., Yang, L., Liu, H., Fang, B. (eds) Cognitive Systems and Information Processing. ICCSIP 2021. Communications in Computer and Information Science, vol 1515. Springer, Singapore. https://doi.org/10.1007/978-981-16-9247-5_3

  • DOI: https://doi.org/10.1007/978-981-16-9247-5_3

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-9246-8

  • Online ISBN: 978-981-16-9247-5

  • eBook Packages: Computer Science, Computer Science (R0)
