
Diversity-Based Feature Selection from Neural Network with Low Computational Cost

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 4985))

Abstract

This paper presents a new approach to efficiently identifying the activity of input attributes in the wrapper model of feature selection. Relevant features are selected according to the diversity among the inputs of a neural network, and the entire process is governed by several criteria. While most existing feature selection methods examine network performance using all input attributes, we use only those attributes with a relatively high potential to contribute to network performance, given prior assumptions. The proposed diversity-based feature selection method (DFSM) can therefore significantly reduce the size of the hidden layer prior to the feature selection process without degrading network performance. We tested DFSM on several real-world benchmark problems, and the experimental results confirmed that it selects a small number of relevant features with good classification accuracy.
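The full algorithm is behind the paywall, so the following is only a minimal sketch of the general idea the abstract describes: a wrapper-model selector that ranks input attributes by a diversity score and evaluates candidate subsets against predictive performance. The correlation-based diversity score, the nearest-centroid evaluator, and the names `diversity_scores` and `wrapper_select` are illustrative assumptions, not the paper's actual DFSM.

```python
# Illustrative wrapper-style feature selection guided by an input-diversity
# ranking. NOT the paper's DFSM: the diversity score (low mean absolute
# correlation with the other inputs) and the nearest-centroid evaluator
# are stand-ins chosen for brevity.
import numpy as np

def diversity_scores(X):
    """Score each feature: higher = less redundant with the other inputs."""
    corr = np.corrcoef(X, rowvar=False)
    np.fill_diagonal(corr, 0.0)
    return 1.0 - np.abs(corr).mean(axis=0)

def accuracy(X, y, feats, X_val, y_val):
    """Nearest-centroid classifier restricted to the chosen features."""
    cents = {c: X[y == c][:, feats].mean(axis=0) for c in np.unique(y)}
    preds = [min(cents, key=lambda c: np.linalg.norm(row[feats] - cents[c]))
             for row in X_val]
    return float(np.mean(np.array(preds) == y_val))

def wrapper_select(X, y, X_val, y_val):
    """Greedily add features in diversity order while accuracy improves."""
    order = np.argsort(diversity_scores(X))[::-1]  # most diverse first
    chosen, best = [], 0.0
    for f in order:
        acc = accuracy(X, y, chosen + [f], X_val, y_val)
        if acc > best:
            chosen.append(f)
            best = acc
    return chosen, best
```

Ranking by diversity first is what makes this cheaper than an exhaustive wrapper: only subsets along one ranked order are evaluated, so the number of network (here, classifier) evaluations grows linearly rather than combinatorially in the number of attributes.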




Editor information

Masumi Ishikawa, Kenji Doya, Hiroyuki Miyamoto, Takeshi Yamakawa


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kabir, M.M., Shahjahan, M., Murase, K. (2008). Diversity-Based Feature Selection from Neural Network with Low Computational Cost. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4985. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69162-4_106


  • DOI: https://doi.org/10.1007/978-3-540-69162-4_106

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69159-4

  • Online ISBN: 978-3-540-69162-4

  • eBook Packages: Computer Science (R0)
