Abstract
One of the main research concerns in neural networks is finding the appropriate network size so as to minimize the trade-off between overfitting and poor approximation. In this paper, the choice among competing models fitted to the same data set is addressed by applying statistical methods for model comparison. The study was conducted to determine whether a range of models perform equally well as the cost of complexity varies. If they do, the generalization error estimates should be about the same across the set of models. If they do not, the estimates should differ, and our task consists of analyzing the pairwise differences between the smallest generalization error estimate and each of the others, in order to bound the set of models whose performance may be considered equal. The method is illustrated by applying it to polynomial regression and RBF neural networks.
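As a rough illustration of this strategy, the Python sketch below compares polynomial models of increasing degree: generalization error is estimated for each model over repeated random train/test splits of the same data (the repeated measures), and pairwise paired t-tests with a Holm step-down correction, one common way to control the familywise error rate, are then run between the best model and each of the others to bound the set of models whose performance may be considered equal. All concrete choices here (the synthetic data, the degrees tested, the number of splits, the significance level, and the use of paired t-tests with Holm's correction in place of the paper's exact repeated measures procedures) are illustrative assumptions, not the authors' protocol.

```python
# Illustrative sketch only: paired t-tests with a Holm correction stand in
# for the paper's repeated measures multiple comparison procedures.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)

# Synthetic data: a noisy cubic target (an assumption for the demo).
X = rng.uniform(-1.0, 1.0, 200)
y = 0.5 * X**3 - X + rng.normal(scale=0.1, size=X.shape)

degrees = list(range(1, 9))   # competing models of increasing complexity
n_repeats = 30                # repeated splits -> repeated measures design
errors = np.empty((n_repeats, len(degrees)))

for r in range(n_repeats):
    idx = rng.permutation(len(X))
    train, test = idx[:150], idx[150:]
    for j, d in enumerate(degrees):
        coef = np.polyfit(X[train], y[train], d)
        errors[r, j] = np.mean((np.polyval(coef, X[test]) - y[test]) ** 2)

best = int(np.argmin(errors.mean(axis=0)))   # smallest mean test MSE

# Pairwise paired tests: best model vs. each competitor, with a Holm
# step-down correction over the m comparisons.
others = [j for j in range(len(degrees)) if j != best]
pvals = np.array([ttest_rel(errors[:, best], errors[:, j]).pvalue
                  for j in others])
order = np.argsort(pvals)
alpha, m = 0.05, len(pvals)
equivalent = {best}                          # models not rejected as worse
for rank, k in enumerate(order):
    if pvals[k] > alpha / (m - rank):        # first acceptance: stop here,
        equivalent.update(others[kk] for kk in order[rank:])
        break                                # all remaining are accepted

print("degree with smallest estimated error:", degrees[best])
print("degrees statistically indistinguishable from it:",
      sorted(degrees[j] for j in equivalent))
```

The models left in `equivalent` form the range whose performance cannot be distinguished from the best at the chosen level; among those, the least complex one would typically be preferred.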
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Guerrero Vázquez, E., Yañez Escolano, A., Galindo Riaño, P., Pizarro Junquera, J. (2001). Repeated Measures Multiple Comparison Procedures Applied to Model Selection in Neural Networks. In: Mira, J., Prieto, A. (eds) Bio-Inspired Applications of Connectionism. IWANN 2001. Lecture Notes in Computer Science, vol 2085. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45723-2_10
DOI: https://doi.org/10.1007/3-540-45723-2_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42237-2
Online ISBN: 978-3-540-45723-7