
Assessing the Noise Immunity and Generalization of Radial Basis Function Networks

Neural Processing Letters

Abstract

In previous work we derived a quantity termed the 'Mean Squared Sensitivity' (MSS) to predict the performance degradation of an MLP affected by perturbations in different parameters. The present Letter continues the same line of research, applying a similar methodology to RBF networks and studying the implications when they are affected by input noise. We obtain the corresponding analytical expression for MSS in RBF networks and validate it experimentally, using two different perturbation models: an additive one and a multiplicative one. We discuss the relationship between MSS and generalization ability. MSS is proposed as a quantitative measure for evaluating the noise immunity and generalization ability of an RBFN configuration, thereby widening the scope of our approach.
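
The analytical MSS expression derived in the Letter is not reproduced on this page, so the following is only a minimal sketch of the quantity it predicts: a Monte Carlo estimate of the mean squared deviation of a Gaussian RBF network's output under the two input-noise models mentioned above. The function names (`rbf_forward`, `empirical_mss`) and the toy network are illustrative assumptions, not the authors' code.

```python
import numpy as np

def rbf_forward(X, centers, widths, weights):
    """Output of a Gaussian RBF network for a batch of inputs X (n_samples, n_dims)."""
    # Squared distances between every input and every centre: shape (n_samples, n_centres).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))  # Gaussian activations, one per centre
    return phi @ weights                     # linear output layer

def empirical_mss(X, centers, widths, weights, sigma2,
                  model="additive", trials=200, seed=0):
    """Monte Carlo estimate of the mean squared output deviation caused by
    input noise of variance sigma2, under an additive or multiplicative model."""
    rng = np.random.default_rng(seed)
    y0 = rbf_forward(X, centers, widths, weights)  # unperturbed outputs
    acc = 0.0
    for _ in range(trials):
        noise = rng.normal(0.0, np.sqrt(sigma2), size=X.shape)
        # The two perturbation models studied in the Letter:
        Xp = X + noise if model == "additive" else X * (1.0 + noise)
        acc += np.mean((rbf_forward(Xp, centers, widths, weights) - y0) ** 2)
    return acc / trials

# Hypothetical toy network: 10 Gaussian units on 2-D inputs.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
centers = rng.uniform(-1.0, 1.0, size=(10, 2))
widths = np.full(10, 0.5)
weights = rng.normal(size=10)

for model in ("additive", "multiplicative"):
    mss = empirical_mss(X, centers, widths, weights, sigma2=0.01, model=model)
    print(f"{model:>14s} noise: empirical MSS = {mss:.3e}")
```

An analytical MSS, as derived in the Letter, would replace the sampling loop with a closed-form expression in the noise variance and the network parameters; the simulation above merely shows what such an expression is meant to predict.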




Cite this article

Bernier, J.L., Díaz, A.F., Fernández, F.J. et al. Assessing the Noise Immunity and Generalization of Radial Basis Function Networks. Neural Processing Letters 18, 35–48 (2003). https://doi.org/10.1023/A:1026275522974
