An Interval Approach for Weight’s Initialization of Feedforward Neural Networks

  • Conference paper
MICAI 2006: Advances in Artificial Intelligence (MICAI 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4293)

Abstract

This work addresses an important problem in Feedforward Neural Network (FNN) training: finding the pseudo-global minimum of the cost function while assuring good generalization properties for the trained architecture. First, pseudo-global optimization is achieved by a combined parametric updating algorithm, supported by the transformation of the network parameters into interval numbers. This solves the network weight initialization problem, performing an exhaustive search for minima by means of Interval Arithmetic (IA). The global minimum is then obtained once the search has been limited to the region of convergence (ROC). Since IA represents variables and parameters as compact, closed sets, the training procedure can be carried out directly on interval weights. The last section illustrates the methodology with the approximation of a known non-linear function.
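
To make the search concrete, the following is a minimal sketch of an IA-based exhaustive search over interval weights that isolates the region of convergence. It assumes a single scalar weight and a hypothetical quadratic cost J(w) = (w - 0.7)^2 standing in for the FNN training error; the Interval class and branch_and_bound routine are illustrative stand-ins under those assumptions, not the authors' implementation.

```python
# A minimal, self-contained sketch of interval-based weight search.
# Assumption: one scalar weight and a toy cost J(w) = (w - 0.7)^2
# in place of the FNN training error.

class Interval:
    """Closed interval [lo, hi] with the basic IA operations."""

    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

    def width(self):
        return self.hi - self.lo


def cost(w):
    # Natural interval extension of J(w) = (w - 0.7)^2.  IA may
    # overestimate the range (dependency problem), but the enclosure
    # is guaranteed to contain the true one.
    d = w - Interval(0.7, 0.7)
    return d * d


def branch_and_bound(box, tol=1e-4):
    """Bisect weight boxes exhaustively, discarding every box whose
    cost lower bound exceeds the best upper bound seen so far.  The
    surviving boxes enclose the region of convergence (ROC)."""
    best_upper = float("inf")
    pending, kept = [box], []
    while pending:
        b = pending.pop()
        if cost(b).lo > best_upper:
            continue                      # box cannot hold the minimum
        mid = 0.5 * (b.lo + b.hi)
        # A point evaluation bounds the global minimum from above.
        best_upper = min(best_upper, cost(Interval(mid, mid)).hi)
        if b.width() < tol:
            kept.append(b)                # narrow enough: keep as ROC piece
        else:
            pending += [Interval(b.lo, mid), Interval(mid, b.hi)]
    # Re-filter with the final bound, since best_upper kept shrinking.
    return [b for b in kept if cost(b).lo <= best_upper]


if __name__ == "__main__":
    roc = branch_and_bound(Interval(-10.0, 10.0))
    print(f"ROC ~ [{min(b.lo for b in roc):.4f}, "
          f"{max(b.hi for b in roc):.4f}]")   # tightly brackets w* = 0.7
```

Run as a script, this brackets the minimizer w* = 0.7 within the chosen tolerance. With a real FNN cost, each weight would be one coordinate of an interval box and the same bisect-and-prune loop would apply box by box.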

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jamett, M., Acuña, G. (2006). An Interval Approach for Weight’s Initialization of Feedforward Neural Networks. In: Gelbukh, A., Reyes-Garcia, C.A. (eds) MICAI 2006: Advances in Artificial Intelligence. MICAI 2006. Lecture Notes in Computer Science (LNAI), vol 4293. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11925231_29

  • DOI: https://doi.org/10.1007/11925231_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-49026-5

  • Online ISBN: 978-3-540-49058-6

  • eBook Packages: Computer Science, Computer Science (R0)
