Pruning Feedforward Neural Network Search Space Using Local Lipschitz Constants

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7367)

Abstract

Combining backpropagation with global search algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO) has been used to improve the efficacy of neural network training. However, these global algorithms suffer from the curse of dimensionality. We propose a new approach that focuses on the topology of the solution space. Our method prunes the search space by exploiting the Lipschitzian property of the criterion function. We have developed procedures that efficiently compute local Lipschitz constants over subsets of the weight space. These local Lipschitz constants can be used to compute lower bounds on the optimal solution.
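
The full paper is not available in this preview, but the mechanism the abstract describes — bounding the criterion function from below over a region of weight space with a local Lipschitz constant, and discarding regions whose bound cannot beat the best solution found so far — is the classic Lipschitz branch-and-bound scheme. The sketch below is a minimal illustration of that scheme, not the authors' procedure: the function names, the bisection rule, and the toy quadratic criterion with its gradient-norm Lipschitz bound are all assumptions made for this example.

```python
import heapq

import numpy as np


def lipschitz_minimize(f, local_lip, lo, hi, tol=1e-3, max_splits=10_000):
    """Minimize f over the box [lo, hi] by Lipschitz branch-and-bound.

    f(x)              -> criterion (e.g. training error) at weight vector x
    local_lip(lo, hi) -> a Lipschitz constant of f valid on the box [lo, hi]

    For a box with center c and half-diagonal r,
        min over the box of f  >=  f(c) - L * r,
    so any box whose lower bound cannot beat the incumbent is pruned.
    """
    lo = np.asarray(lo, dtype=float)
    hi = np.asarray(hi, dtype=float)

    def bound(lo_b, hi_b):
        c = 0.5 * (lo_b + hi_b)
        r = 0.5 * np.linalg.norm(hi_b - lo_b)
        return f(c) - local_lip(lo_b, hi_b) * r, c

    lb, c = bound(lo, hi)
    best_x, best_f = c, f(c)
    heap = [(lb, 0, lo, hi)]          # (lower bound, tie-breaker, box)
    tie = 1

    for _ in range(max_splits):
        if not heap:
            break
        lb, _, lo_b, hi_b = heapq.heappop(heap)
        if lb >= best_f - tol:        # prune: box cannot improve incumbent
            continue
        d = int(np.argmax(hi_b - lo_b))          # bisect widest dimension
        mid = 0.5 * (lo_b[d] + hi_b[d])
        mask = np.arange(lo.size) == d
        for c_lo, c_hi in ((lo_b, np.where(mask, mid, hi_b)),
                           (np.where(mask, mid, lo_b), hi_b)):
            c_lb, c_c = bound(c_lo, c_hi)
            c_f = f(c_c)
            if c_f < best_f:                     # update incumbent
                best_x, best_f = c_c, c_f
            if c_lb < best_f - tol:              # keep only promising boxes
                heapq.heappush(heap, (c_lb, tie, c_lo, c_hi))
                tie += 1
    return best_x, best_f


# Toy usage: a quadratic stand-in for the training criterion, with a
# gradient-norm bound supplying the local Lipschitz constant on each box.
if __name__ == "__main__":
    f = lambda w: float(np.sum((w - 0.3) ** 2))
    L = lambda lo_b, hi_b: 2.0 * np.linalg.norm(
        np.maximum(np.abs(lo_b - 0.3), np.abs(hi_b - 0.3)))
    x, fx = lipschitz_minimize(f, L, lo=[-1.0, -1.0], hi=[1.0, 1.0])
    print(x, fx)   # converges toward w = (0.3, 0.3), f = 0
```

Because `local_lip` is evaluated per box rather than once globally, the lower bounds tighten as the boxes shrink, which is what lets local (rather than global) Lipschitz constants prune more of the search space.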

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tang, Z., Bagchi, K., Pan, Y., Koehler, G.J. (2012). Pruning Feedforward Neural Network Search Space Using Local Lipschitz Constants. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol 7367. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31346-2_2

  • DOI: https://doi.org/10.1007/978-3-642-31346-2_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31345-5

  • Online ISBN: 978-3-642-31346-2

  • eBook Packages: Computer Science, Computer Science (R0)
