
Neural model selection: How to determine the fittest criterion?

  • Part VII: Prediction, Forecasting and Monitoring
  • Conference paper
Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)


Abstract

Based on recent results on least-squares estimation for non-linear time series, M. Mangeas and J.F. Yao [6] proposed an identification criterion for neural architectures. For a given series of T observations, it is known that for any γ ∈ ℝ+*, the selected neural model (architecture + weights) minimizing the least-squares criterion LSC = MSE + γ (ln T / T) n (where n denotes the number of weights) converges almost surely towards the “true” model as T grows to infinity. Nevertheless, when few observations are available, an identification method based on this criterion (such as the pruning method named the Statistical Stepwise Method (SSM) [1]) can yield different neural models. In this paper, we propose a heuristic for setting the value of γ with respect to the series at hand (its complexity and the fixed number of observations T). The basic idea is to split the set of observations into two subsets, following the well-known cross-validation method, and to perform the SSM methodology (using the LSC criterion) on the first subset (the learning set) for different values of γ. Once the best value of γ is found (the one minimizing the MSE on the second subset, the validation set), we can use the identification scheme on the whole set of data.
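The γ-selection heuristic of the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the paper selects multilayer-perceptron architectures pruned by SSM, whereas here a family of polynomial predictors of growing size (one coefficient per "weight") stands in for the candidate architectures.

```python
import numpy as np

def lsc(mse, n_weights, T, gamma):
    # Least-squares criterion from Mangeas & Yao: LSC = MSE + gamma * (ln T / T) * n
    return mse + gamma * (np.log(T) / T) * n_weights

def fit_predict(x_train, y_train, x_eval, degree):
    # Toy stand-in for "architecture + weights": a degree-d polynomial
    # has d + 1 free parameters.
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.polyval(coeffs, x_eval)

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 120)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.size)

# Split the observations into a learning set and a validation set.
x_learn, y_learn = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]
T = x_learn.size

best = None
for gamma in [0.01, 0.1, 0.5, 1.0, 2.0, 5.0]:
    # Step 1: on the learning set, pick the architecture minimizing LSC.
    degrees = list(range(1, 10))
    scores = []
    for d in degrees:
        pred = fit_predict(x_learn, y_learn, x_learn, d)
        mse = np.mean((y_learn - pred) ** 2)
        scores.append(lsc(mse, d + 1, T, gamma))
    d_best = degrees[int(np.argmin(scores))]
    # Step 2: score that architecture by its MSE on the validation set.
    pred_val = fit_predict(x_learn, y_learn, x_val, d_best)
    val_mse = np.mean((y_val - pred_val) ** 2)
    if best is None or val_mse < best[2]:
        best = (gamma, d_best, val_mse)

gamma_star, d_star, val_mse = best
print(f"selected gamma={gamma_star}, polynomial degree={d_star}")
```

Once γ* is found this way, the full identification scheme would be rerun with γ = γ* on the complete data set, as the abstract describes.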

Building a neural model for a time series thus involves two tasks:

  • finding the suited model structure (the architecture);

  • estimating the suited set of parameters (the synaptic weights).
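The pruning side of the first task can be illustrated with a minimal stepwise weight-elimination sketch. This is a hypothetical simplification: a linear model stands in for the MLP, and each elimination is accepted when it lowers the LSC, whereas the actual SSM of [1] relies on statistical tests on the weights.

```python
import numpy as np

rng = np.random.default_rng(1)
T, p = 200, 8
X = rng.standard_normal((T, p))
true_w = np.array([1.5, -2.0, 0.0, 0.0, 0.8, 0.0, 0.0, 0.0])
y = X @ true_w + 0.1 * rng.standard_normal(T)

gamma = 1.0

def fit_mse(cols):
    # Refit the model restricted to the weights in `cols`, return its MSE.
    w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    resid = y - X[:, cols] @ w
    return np.mean(resid ** 2)

def lsc(mse, n):
    return mse + gamma * (np.log(T) / T) * n

active = list(range(p))
current = lsc(fit_mse(active), len(active))
while len(active) > 1:
    # Try removing each remaining weight; keep the removal that most lowers LSC.
    candidates = []
    for j in active:
        cols = [k for k in active if k != j]
        candidates.append((lsc(fit_mse(cols), len(cols)), j))
    best_lsc, j_best = min(candidates)
    if best_lsc >= current:  # no removal improves the criterion: stop
        break
    active = [k for k in active if k != j_best]
    current = best_lsc

print(sorted(active))  # indices of the weights kept
```

Because the penalty term γ (ln T / T) per weight exceeds the tiny MSE increase from dropping a truly-zero weight, the loop prunes the irrelevant weights and stops once only influential ones remain.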


References

  1. M. Cottrell, B. Girard, Y. Girard, M. Mangeas, and C. Muller. Neural modeling for time series: a statistical stepwise method for weight elimination. I.E.E.E. Trans. Neural Networks, 6:1355–1364, 1995.

  2. M. Duflo. Algorithmes Stochastiques. Mathématiques & Applications (SMAI). Springer-Verlag, Berlin, 1996.

  3. X. Guyon. Random Fields on a Network-Modeling, Statistics, and Applications. Springer-Verlag, Berlin, 1995.

  4. Y. le Cun, J. S. Denker, and S. A. Solla. Optimal brain damage. In D. S. Touretzky, editor, Advances in Neural Information Processing Systems 2 (NIPS*89), pages 598–605, San Mateo, CA, 1990. Morgan Kaufmann.

  5. M. Mangeas, M. Cottrell, and J.F. Yao. New criterion of identification in the multilayered perceptron modelling. In Proceedings of ESANN'97, Bruges, Belgium, 1997.

  6. M. Mangeas and J.F. Yao. Sur l'estimateur des moindres carrés d'un modèle autorégressif non-linéaire. Technical Report 53, SAMOS, Université Paris I, 1996.

  7. A. S. Weigend and M. Mangeas. Avoiding overfitting by locally matching the noise level of the data. In World Congress on Neural Networks (WCNN'95), pages II-1-9, 1995.


Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

Cite this paper

Mangeas, M. (1997). Neural model selection: How to determine the fittest criterion? In: Gerstner, W., Germond, A., Hasler, M., Nicoud, J.D. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020281

  • DOI: https://doi.org/10.1007/BFb0020281

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9
