
Robust Bootstrapping Neural Networks

  • Conference paper
MICAI 2004: Advances in Artificial Intelligence (MICAI 2004)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2972)


Abstract

Artificial neural networks (ANN) have been used as predictive systems in a variety of application domains such as science, engineering, and finance, so it is very important to be able to estimate the reliability of a given model. The bootstrap is a computer-intensive method for estimating the distribution of a statistical estimator, based on an imitation of the probabilistic structure of the data-generating process and on the information contained in a given set of random observations. Bootstrap plans can be used to estimate the uncertainty associated with a value predicted by a feedforward neural network.
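As a concrete illustration of this resampling idea, the sketch below bootstraps observation pairs and refits a predictor at each replication to obtain an empirical distribution of predictions at a query point. The data, the cubic least-squares fit standing in for the neural network, and all parameter choices are assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data; the paper's predictor is a feedforward neural
# network, but here a cubic least-squares fit stands in as a placeholder model.
x = np.linspace(-1.0, 1.0, 80)
y = np.sin(2.0 * x) + rng.normal(0.0, 0.1, size=x.size)

def fit_predict(xs, ys, x_new):
    """Fit the placeholder model and predict at x_new."""
    coeffs = np.polyfit(xs, ys, deg=3)
    return np.polyval(coeffs, x_new)

B = 500        # number of bootstrap resamples
x_new = 0.5    # query point
preds = np.empty(B)
for b in range(B):
    idx = rng.integers(0, x.size, size=x.size)  # resample pairs with replacement
    preds[b] = fit_predict(x[idx], y[idx], x_new)

# 95% percentile interval for the prediction at x_new
lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"prediction interval at x={x_new}: [{lo:.3f}, {hi:.3f}]")
```

Each resample yields one refitted model and one prediction; the spread of those predictions is the bootstrap's estimate of predictive uncertainty.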

The available bootstrap methods for ANN assume independent random samples that are free of outliers. Unfortunately, the presence of outliers in a sample has serious effects: some resamples may have a higher contamination level than the initial sample, and the model is affected because it is sensitive to these deviations, resulting in poor performance.

In this paper we investigate a robust bootstrap method for ANN that is resistant to the presence of outliers and is computationally simple. We illustrate our technique on synthetic and real datasets, and results are shown as confidence intervals for neural network predictions.
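One generic way to make a bootstrap procedure resistant to outliers is to replace the ordinary least-squares fit inside each replication with a robust fit, so that contaminated resamples do not drag the prediction. The sketch below uses Huber-type iteratively reweighted least squares with a MAD scale estimate as that robust estimator; this is an illustrative assumption, not necessarily the method proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data with 10% gross outliers injected.
x = np.linspace(-1.0, 1.0, 80)
y = np.sin(2.0 * x) + rng.normal(0.0, 0.1, size=x.size)
y[rng.choice(x.size, size=8, replace=False)] += 5.0  # contamination

def robust_fit_predict(xs, ys, x_new, deg=3, c=1.345, iters=20):
    """Huber-type IRLS fit of a polynomial, then prediction at x_new."""
    w = np.ones_like(ys)
    for _ in range(iters):
        coeffs = np.polyfit(xs, ys, deg=deg, w=np.sqrt(w))
        r = ys - np.polyval(coeffs, xs)
        s = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust scale via MAD
        u = np.abs(r) / (c * s)
        w = np.where(u <= 1.0, 1.0, 1.0 / u)       # Huber weights
    return np.polyval(coeffs, x_new)

B, x_new = 300, 0.5
preds = np.empty(B)
for b in range(B):
    idx = rng.integers(0, x.size, size=x.size)  # bootstrap pairs
    preds[b] = robust_fit_predict(x[idx], y[idx], x_new)

lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"robust interval at x={x_new}: [{lo:.3f}, {hi:.3f}]")
```

Because the Huber weights shrink the influence of large residuals, resamples that happen to draw a higher fraction of outliers still produce predictions close to the clean fit, which is the property a robust bootstrap is after.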

This work was supported in part by Fondecyt Research Grants 1010101 and 7010101, and in part by Research Grant DGIP-UTFSM.





Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Allende, H., Ñanculef, R., Salas, R. (2004). Robust Bootstrapping Neural Networks. In: Monroy, R., Arroyo-Figueroa, G., Sucar, L.E., Sossa, H. (eds) MICAI 2004: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 2972. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24694-7_84


  • DOI: https://doi.org/10.1007/978-3-540-24694-7_84

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-21459-5

  • Online ISBN: 978-3-540-24694-7

  • eBook Packages: Springer Book Archive
