Abstract
Feature selection is a well-researched problem that can improve network performance and speed up network training. In this paper, we propose an effective feature selection scheme for SVMs based on a hybrid of an enhanced genetic algorithm (GA) and mutual information (MI), in which the mutual information between each input and each output of the data set is used during mutation to purposefully guide the search direction according to a set of criteria. To avoid noisy fitness evaluation, the SVM should adaptively adjust its parameters whenever the fitness of an input subset is evaluated, so the enhanced GA is used to evolve the input features and the SVM parameters simultaneously. Simulations of three different feature selection methods on two real financial time series show that the hybrid GA/MI approach reduces input dimensionality, speeds up network training, and achieves better performance.
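To make the scheme concrete, the following is a minimal Python sketch of the idea the abstract describes: a GA whose chromosome encodes both a feature mask and the SVM parameters (C, gamma), with a mutation operator biased by the mutual information between each input and the output. It relies on scikit-learn's SVR, mutual_info_regression and cross_val_score; the population setup, mutation rates, and parameter ranges are illustrative assumptions, not the paper's exact enhanced-GA operators.

import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def fitness(X, y, mask, C, gamma):
    # Cross-validated score of an SVR trained on the selected features.
    # Evaluating each subset with its own evolved (C, gamma) is what keeps
    # the fitness from being distorted by a badly tuned, fixed SVM.
    if not mask.any():
        return -np.inf
    return cross_val_score(SVR(C=C, gamma=gamma), X[:, mask], y, cv=3).mean()

def mi_guided_mutation(mask, w, p_on=0.2, p_off=0.2):
    # MI-biased bit flips: inputs with high normalised MI weight w[j] are
    # more likely to be switched on, low-MI inputs more likely to be
    # switched off. (The paper's exact mutation criteria are not given in
    # the abstract; this bias is one plausible reading.)
    new = mask.copy()
    for j in range(mask.size):
        if mask[j] and rng.random() < p_off * (1.0 - w[j]):
            new[j] = False
        elif not mask[j] and rng.random() < p_on * w[j]:
            new[j] = True
    return new

def evolve(X, y, pop_size=20, generations=30):
    d = X.shape[1]
    mi = mutual_info_regression(X, y)   # MI between each input and the output
    w = mi / (mi.max() + 1e-12)         # normalised to [0, 1]
    # A chromosome holds the feature mask plus log2(C) and log2(gamma), so
    # the GA evolves the input subset and the SVM parameters together.
    pop = [(rng.random(d) < 0.5, rng.uniform(-2, 6), rng.uniform(-6, 2))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(X, y, c[0], 2**c[1], 2**c[2]),
                 reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [(mi_guided_mutation(m, w),
                        lc + rng.normal(0, 0.5),
                        lg + rng.normal(0, 0.5)) for m, lc, lg in elite]
    return max(pop, key=lambda c: fitness(X, y, c[0], 2**c[1], 2**c[2]))

Truncation selection and Gaussian mutation of the log-scaled parameters are likewise simplifying choices; the point of the sketch is the interplay between the MI-guided mutation and the co-evolved SVM parameters.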
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Zhang, C., Hu, H. (2006). Feature Selection in SVM Based on the Hybrid of Enhanced Genetic Algorithm and Mutual Information. In: Torra, V., Narukawa, Y., Valls, A., Domingo-Ferrer, J. (eds) Modeling Decisions for Artificial Intelligence. MDAI 2006. Lecture Notes in Computer Science, vol. 3885. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11681960_30
DOI: https://doi.org/10.1007/11681960_30
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-32780-6
Online ISBN: 978-3-540-32781-3
eBook Packages: Computer Science (R0)