
Feature Selection for Multi-class Problems Using Support Vector Machines

  • Conference paper
PRICAI 2004: Trends in Artificial Intelligence (PRICAI 2004)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3157)


Abstract

Because feature selection removes irrelevant features and improves the performance of learning systems, it is a crucial step in machine learning. Feature selection methods based on support vector machines have obtained satisfactory results, but previous work has usually addressed binary classification and requires auxiliary techniques to be extended to multi-class problems. In this paper, we propose a prediction risk based feature selection method that uses multi-class support vector machines directly. The proposed method is compared with the previous optimal brain damage based feature selection methods that use binary support vector machines. Experiments on UCI data sets show that the prediction risk based method obtains better results than the previous SVM-based methods on multi-class classification problems.
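The abstract does not spell out the selection criterion, but the prediction risk measure it refers to is typically computed by replacing one feature's values with their mean and measuring how much the trained classifier's error increases; features whose removal barely changes the error are discarded first. The sketch below illustrates that idea with a multi-class SVM. The use of scikit-learn, the Iris data set, and the helper name prediction_risk_ranking are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of prediction-risk-based feature ranking with a
    # multi-class SVM. Assumption: relevance of feature j is estimated by the
    # increase in error when feature j is replaced by its mean value.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.svm import SVC

    def prediction_risk_ranking(model, X, y):
        """Rank features from least to most relevant by prediction risk."""
        base_error = np.mean(model.predict(X) != y)
        risks = []
        for j in range(X.shape[1]):
            X_perturbed = X.copy()
            X_perturbed[:, j] = X[:, j].mean()   # neutralise feature j
            error_j = np.mean(model.predict(X_perturbed) != y)
            risks.append(error_j - base_error)   # larger increase = more relevant
        return np.argsort(risks)                 # ascending risk: least relevant first

    X, y = load_iris(return_X_y=True)
    svm = SVC(kernel="rbf").fit(X, y)            # SVC handles multi-class internally (one-vs-one)
    ranking = prediction_risk_ranking(svm, X, y)
    print("Features ordered from least to most relevant:", ranking)

In a backward-elimination loop, the least relevant feature would be removed, the multi-class SVM retrained, and the ranking recomputed until the desired number of features remains.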






Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, G.Z., Yang, J., Liu, G.P., Xue, L. (2004). Feature Selection for Multi-class Problems Using Support Vector Machines. In: Zhang, C., Guesgen, H.W., Yeap, W.K. (eds) PRICAI 2004: Trends in Artificial Intelligence. Lecture Notes in Computer Science (LNAI), vol 3157. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28633-2_32


  • DOI: https://doi.org/10.1007/978-3-540-28633-2_32

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22817-2

  • Online ISBN: 978-3-540-28633-2

  • eBook Packages: Springer Book Archive
