Abstract
When a learning algorithm is applied to an MLP structure, different solutions for the weight values can be obtained if the parameters of the applied rule or the initial conditions are changed. These solutions may show similar learning performance but differ in other respects, in particular in their fault tolerance against weight perturbations. In this paper, a backpropagation algorithm that maximizes fault tolerance is proposed. The algorithm explicitly adds to the backpropagation learning rule a new term related to the mean square error degradation in the presence of weight deviations, in order to minimize this degradation. The results obtained demonstrate the efficiency of the proposed learning rule in comparison with other algorithms.
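The abstract does not give the explicit regularization term itself, so the following is only a minimal sketch of the underlying idea: the objective being regularized is the expected mean square error when each weight is slightly perturbed. Here that expectation is approximated by averaging the MSE over a fixed set of sampled multiplicative weight perturbations, and minimized by plain finite-difference gradient descent on a tiny XOR task; the network sizes, noise level `sigma`, and optimizer are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny XOR task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def unpack(theta):
    W1 = theta[:6].reshape(2, 3)    # 2 inputs -> 3 hidden units
    b1 = theta[6:9]
    W2 = theta[9:12].reshape(3, 1)  # 3 hidden -> 1 output
    b2 = theta[12:]
    return W1, b1, W2, b2

def mse(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

# Fixed perturbation samples: each weight w becomes w * (1 + eps),
# eps ~ N(0, sigma^2). Fixing them keeps the loss deterministic so
# finite differences are reliable.
SIGMA = 0.05
NOISE = rng.normal(0.0, SIGMA, size=(8, 13))

def robust_loss(theta):
    # Sampled stand-in for the expected MSE under weight deviations,
    # i.e. the degradation the explicit regularization term targets.
    return np.mean([mse(theta * (1 + n)) for n in NOISE])

theta = rng.normal(0, 0.5, size=13)
lr, eps = 0.2, 1e-5
start = robust_loss(theta)
for _ in range(400):
    grad = np.empty_like(theta)
    for i in range(theta.size):  # finite-difference gradient
        d = np.zeros_like(theta)
        d[i] = eps
        grad[i] = (robust_loss(theta + d) - robust_loss(theta - d)) / (2 * eps)
    theta -= lr * grad
final = robust_loss(theta)  # training reduces the noise-robust MSE
```

A deterministic version would replace the sampling with an analytic penalty derived from the network's sensitivity to weight deviations, which is the direction the paper and its reference on deterministic penalty terms (Edwards and Murray) pursue.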
Cite this article
Bernier, J.L., Ortega, J., Rojas, I. et al. Obtaining Fault Tolerant Multilayer Perceptrons Using an Explicit Regularization. Neural Processing Letters 12, 107–113 (2000). https://doi.org/10.1023/A:1009698206772