
Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3697)


Abstract

Determining the optimal architecture of a supervised neural network is an important and difficult task. Classical neural network topology optimization methods remove weights or units from the architecture while trying to preserve the performance of the learning algorithm. However, no existing topology optimization method guarantees an optimal solution. In this work, we propose a hybrid approach that combines a variable selection method with a classical topology optimization method in order to improve the resulting topology. The proposed approach first identifies the relevant subset of input variables that gives good classification performance, and then applies a classical topology optimization method to eliminate unnecessary hidden units or weights. A comparison of our approach with classical architecture optimization techniques is given.
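The two-step scheme described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' implementation: a simple correlation filter stands in for the paper's variable selection step, and magnitude-based weight pruning stands in for a classical method such as OBD/OBS; the data, thresholds, and network sizes are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification task: only the first 3 of 10 inputs matter.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] - X[:, 2] > 0).astype(float)

# Step 1, variable selection: keep inputs whose absolute correlation with
# the target exceeds a threshold (a simple filter standing in for the
# wrapper-style selection discussed in the paper).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
selected = np.flatnonzero(corr > 0.2)
Xs = X[:, selected]

# Step 2, classical topology optimization: train a one-hidden-layer network
# by full-batch gradient descent, then zero out the smallest-magnitude
# input-to-hidden weights (magnitude pruning standing in for OBD/OBS).
n_in, n_hid = Xs.shape[1], 5
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
W2 = rng.normal(scale=0.5, size=(n_hid, 1))

def forward(X, W1, W2):
    h = np.tanh(X @ W1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2)))  # sigmoid output
    return p, h

for _ in range(500):
    p, h = forward(Xs, W1, W2)
    g = (p - y[:, None]) / len(y)        # gradient of log loss w.r.t. logit
    g1 = Xs.T @ ((g @ W2.T) * (1.0 - h ** 2))
    g2 = h.T @ g
    W1 -= 0.5 * g1
    W2 -= 0.5 * g2

# Prune the 30% smallest-magnitude input-to-hidden weights.
W1[np.abs(W1) < np.quantile(np.abs(W1), 0.3)] = 0.0

p, _ = forward(Xs, W1, W2)
accuracy = float(((p[:, 0] > 0.5) == y).mean())
print("selected inputs:", selected.tolist(), "accuracy:", accuracy)
```

The point of the ordering is that pruning operates on a network that already excludes irrelevant inputs, so the pruning step only has to remove redundancy among hidden units and weights rather than also compensating for noise inputs.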

An erratum to this chapter can be found at http://dx.doi.org/10.1007/11550907_163.




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Attik, M., Bougrain, L., Alexandre, F. (2005). Neural Network Topology Optimization. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds) Artificial Neural Networks: Formal Models and Their Applications – ICANN 2005. ICANN 2005. Lecture Notes in Computer Science, vol 3697. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550907_9


  • DOI: https://doi.org/10.1007/11550907_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28755-1

  • Online ISBN: 978-3-540-28756-8

  • eBook Packages: Computer Science, Computer Science (R0)
