Abstract
Artificial neural networks (ANNs) are widely used as predictive systems in application domains such as science, engineering, and finance, so it is important to be able to estimate the reliability of a given model. The bootstrap is a computationally intensive method for estimating the distribution of a statistical estimator: it imitates the probabilistic structure of the data-generating process using the information contained in a given set of random observations. Bootstrap schemes can be used to estimate the uncertainty associated with a value predicted by a feedforward neural network.
The available bootstrap methods for ANNs assume independent random samples that are free of outliers. Unfortunately, outliers in a sample have serious consequences: some resamples may have a higher contamination level than the initial sample, and the model's performance degrades because it is sensitive to these deviations.
In this paper we investigate a robust bootstrap method for ANNs that is resistant to the presence of outliers and is computationally simple. We illustrate the technique on synthetic and real datasets, and we report results as confidence intervals for neural network predictions.
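The standard (non-robust) pairs bootstrap that the abstract refers to can be sketched as follows: resample the training pairs with replacement, refit the network on each resample, and take percentile bands over the refitted predictions. This is a minimal NumPy sketch under assumed settings (a tiny one-hidden-layer network, synthetic data with injected outliers, B = 30 resamples); it is not the paper's robust method or implementation, only the baseline procedure it builds on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with a few injected outliers
n = 80
x = np.linspace(-2, 2, n)
y = np.sin(x) + rng.normal(0, 0.1, n)
y[::25] += 2.0  # contaminate every 25th observation

def fit_mlp(x, y, hidden=8, epochs=300, lr=0.05):
    """Train a tiny one-hidden-layer tanh network with plain gradient descent."""
    W1 = rng.normal(0, 0.5, (hidden, 1)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden);      b2 = 0.0
    X = x[:, None]
    for _ in range(epochs):
        h = np.tanh(X @ W1.T + b1)           # hidden activations, shape (n, hidden)
        pred = h @ W2 + b2
        err = pred - y
        # Gradients of mean squared error
        gW2 = h.T @ err / len(x); gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h**2)  # backprop through tanh
        gW1 = gh.T @ X / len(x);  gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda xs: np.tanh(xs[:, None] @ W1.T + b1) @ W2 + b2

# Pairs bootstrap: refit on resamples, take a percentile interval of predictions
B = 30
grid = np.linspace(-2, 2, 50)
preds = np.empty((B, grid.size))
for b in range(B):
    idx = rng.integers(0, n, n)  # resample (x_i, y_i) pairs with replacement
    preds[b] = fit_mlp(x[idx], y[idx])(grid)

lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)  # 95% bootstrap band
```

Note how the injected outliers propagate: a resample that draws a contaminated pair more than once is more contaminated than the original sample, which is exactly the failure mode the paper's robust bootstrap is designed to resist.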
This work was supported in part by Research Grant Fondecyt 1010101 and 7010101, and in part by Research Grant DGIP-UTFSM.
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Allende, H., Ñanculef, R., Salas, R. (2004). Robust Bootstrapping Neural Networks. In: Monroy, R., Arroyo-Figueroa, G., Sucar, L.E., Sossa, H. (eds) MICAI 2004: Advances in Artificial Intelligence. MICAI 2004. Lecture Notes in Computer Science(), vol 2972. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24694-7_84
DOI: https://doi.org/10.1007/978-3-540-24694-7_84
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-21459-5
Online ISBN: 978-3-540-24694-7
eBook Packages: Springer Book Archive