Feature Selection in SVM Based on the Hybrid of Enhanced Genetic Algorithm and Mutual Information

  • Conference paper
Modeling Decisions for Artificial Intelligence (MDAI 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3885)

Abstract

Feature selection is a well-researched problem that can improve network performance and speed up training. In this paper, we propose an effective feature selection scheme for SVMs based on a hybrid of an enhanced genetic algorithm and mutual information, in which the mutual information between each input and each output of the data set is used during mutation to purposefully guide the search direction according to a set of criteria. To avoid noisy fitness evaluations, the SVM should adaptively adjust its parameters to reach its best performance whenever the fitness of an input subset is evaluated; an enhanced GA is therefore used to evolve the input features and the SVM parameters simultaneously. Experiments on two real financial time series, comparing three feature selection methods, show that the hybrid of GA and MI reduces the input dimensionality, speeds up training, and yields better performance.
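The core mechanism the abstract describes, a genetic algorithm over feature subsets whose mutation operator is biased by the mutual information between each input and the output, can be sketched in a few lines. The snippet below is a minimal stdlib-only illustration, not the authors' algorithm: it uses a toy discrete dataset, a placeholder fitness function in place of the SVM validation performance the paper evaluates, and it omits the co-evolution of the SVM hyperparameters. All function names (`mutual_info`, `mi_guided_mutation`, `select_features`) are invented for this sketch.

```python
import math
import random

def mutual_info(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    pxy, px, py = {}, {}, {}
    for a, b in zip(x, y):
        pxy[(a, b)] = pxy.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    # MI = sum p(a,b) * log( p(a,b) / (p(a) p(b)) ), with counts c/n as estimates.
    return sum((c / n) * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def fitness(mask, mi_scores):
    # Placeholder for the paper's fitness (SVM validation performance with
    # adaptively tuned parameters): reward informative features, penalise size.
    if not any(mask):
        return -1.0
    gain = sum(m for bit, m in zip(mask, mi_scores) if bit)
    return gain - 0.05 * sum(mask)

def mi_guided_mutation(mask, mi_scores, rate=0.2):
    # The MI-guided step: when a gene mutates, high-MI features tend to be
    # switched on and low-MI features tend to be switched off.
    hi = max(mi_scores) or 1.0
    child = list(mask)
    for i, m in enumerate(mi_scores):
        if random.random() < rate:
            child[i] = 1 if random.random() < m / hi else 0
    return child

def select_features(X, y, pop_size=20, generations=30, seed=0):
    random.seed(seed)
    d = len(X[0])
    cols = [[row[i] for row in X] for i in range(d)]
    mi_scores = [mutual_info(c, y) for c in cols]
    pop = [[random.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, mi_scores), reverse=True)
        elite = pop[: pop_size // 2]          # keep the best half
        pop = elite + [mi_guided_mutation(random.choice(elite), mi_scores)
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda m: fitness(m, mi_scores)), mi_scores

# Toy data: feature 0 equals the label; features 1 and 2 are pure noise.
random.seed(1)
y = [random.randint(0, 1) for _ in range(200)]
X = [[t, random.randint(0, 1), random.randint(0, 1)] for t in y]
best, scores = select_features(X, y)
```

On this toy data the informative feature has MI close to log 2 nats while the noise features score near zero, so the MI-biased mutation quickly concentrates the population on subsets that include it.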





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, C., Hu, H. (2006). Feature Selection in SVM Based on the Hybrid of Enhanced Genetic Algorithm and Mutual Information. In: Torra, V., Narukawa, Y., Valls, A., Domingo-Ferrer, J. (eds) Modeling Decisions for Artificial Intelligence. MDAI 2006. Lecture Notes in Computer Science, vol 3885. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11681960_30

  • DOI: https://doi.org/10.1007/11681960_30

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-32780-6

  • Online ISBN: 978-3-540-32781-3
