Abstract
This paper presents a new approach to efficiently identify the activity of input attributes in the wrapper model of feature selection. Relevant features are selected according to the diversity among the inputs of the neural network, and the entire process is governed by several criteria. While most existing feature selection methods examine network performance using all input attributes, we use only those attributes with a relatively high likelihood of contributing to network performance under the stated assumptions. The proposed diversity-based feature selection method (DFSM) can therefore significantly reduce the size of the hidden layer prior to the feature selection process without degrading network performance. We tested DFSM on several real-world benchmark problems, and the experimental results confirmed that it selects a small number of relevant features with good classification accuracy.
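The paper's exact diversity criterion and selection procedure are not reproduced on this page, but the general idea of a diversity-driven wrapper loop can be illustrated. The sketch below is a minimal, hypothetical example, not the authors' DFSM: it assumes a correlation-based diversity measure (features least redundant with the rest rank as most "diverse") and a caller-supplied `evaluate` function standing in for the trained network's performance. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def diversity_rank(X):
    """Rank candidate features by diversity, here approximated as low
    mean absolute correlation with the other features (hypothetical
    criterion, not the paper's)."""
    C = np.abs(np.corrcoef(X, rowvar=False))  # feature-feature correlations
    np.fill_diagonal(C, 0.0)                  # ignore self-correlation
    mean_corr = C.mean(axis=1)                # redundancy score per feature
    return np.argsort(mean_corr)              # most diverse (least redundant) first

def wrapper_select(X, y, evaluate, max_features=5):
    """Greedy forward wrapper: try features in diversity order, keeping
    only those that improve the evaluation score. Restricting candidates
    to the high-diversity front mirrors the abstract's point of testing
    only attributes likely to contribute, rather than all of them."""
    order = diversity_rank(X)
    selected, best = [], -np.inf
    for f in order[: max_features * 2]:       # only high-diversity candidates
        trial = selected + [int(f)]
        score = evaluate(X[:, trial], y)      # e.g. validation accuracy of a net
        if score > best:
            selected, best = trial, score
        if len(selected) >= max_features:
            break
    return selected, best
```

In a real wrapper setting, `evaluate` would train and score a (compact) neural network on the trial subset; any classifier's validation accuracy can be substituted.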
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Kabir, M.M., Shahjahan, M., Murase, K. (2008). Diversity-Based Feature Selection from Neural Network with Low Computational Cost. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4985. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69162-4_106
DOI: https://doi.org/10.1007/978-3-540-69162-4_106
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-69159-4
Online ISBN: 978-3-540-69162-4
eBook Packages: Computer Science (R0)