A Methodology for Developing Nonlinear Models by Feedforward Neural Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5495)

Abstract

Feedforward neural networks have been established as versatile tools for nonlinear black-box modeling, but in many data-mining tasks the choice of relevant inputs and network complexity remains a major challenge. Statistical tests for detecting relations between inputs and outputs are largely based on linear theory, and laborious retraining combined with the risk of getting stuck in local minima makes an exhaustive search through all possible network configurations infeasible for all but toy problems. This paper proposes a systematic method for the problem of estimating an output on the basis of a (large) set of potential inputs. Feedforward neural networks of multi-layer perceptron type are used in a three-stage modeling approach. First, starting from sufficiently large networks, an efficient pruning method is applied to detect a pool of potential model candidates. Next, Akaike weights are used to select the actual Kullback-Leibler best models in the pool. Third, the hidden nodes of these networks form a pool of candidates for the final network: mixed-integer linear programming is applied to find the optimal combination of M hidden nodes and the corresponding upper-layer weights. The procedure outlined is demonstrated to yield parsimonious models for a nonlinear benchmark problem, and to detect the relevant inputs.
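The second stage relies on the standard Akaike-weight formula, w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2) with Δ_i = AIC_i − AIC_min, which approximates the probability that candidate i is the Kullback-Leibler best model in the set. A minimal sketch of that step (the function name and AIC values below are illustrative, not from the paper):

```python
import numpy as np

def akaike_weights(aic_values):
    """Convert AIC scores of candidate models into Akaike weights.

    The weight of model i approximates the probability that it is the
    Kullback-Leibler best model among the candidates considered.
    """
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()            # AIC differences to the best (lowest) score
    rel_likelihood = np.exp(-0.5 * delta)
    return rel_likelihood / rel_likelihood.sum()

# Example: three pruned candidate networks with hypothetical AIC scores;
# the second network has the lowest AIC and thus the largest weight.
weights = akaike_weights([102.3, 100.1, 105.8])
```

Candidates whose weights fall below a chosen threshold can then be discarded before the hidden nodes of the surviving networks enter the final mixed-integer programming stage.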





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Saxén, H., Pettersson, F. (2009). A Methodology for Developing Nonlinear Models by Feedforward Neural Networks. In: Kolehmainen, M., Toivanen, P., Beliczynski, B. (eds) Adaptive and Natural Computing Algorithms. ICANNGA 2009. Lecture Notes in Computer Science, vol 5495. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04921-7_8

  • DOI: https://doi.org/10.1007/978-3-642-04921-7_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04920-0

  • Online ISBN: 978-3-642-04921-7

  • eBook Packages: Computer Science (R0)
