A modified backpropagation algorithm to tolerate weight errors

  • Neural Nets Simulation, Emulation and Implementation
  • Conference paper
Biological and Artificial Computation: From Neuroscience to Technology (IWANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1240)

Abstract

A modified backpropagation algorithm that minimizes the sensitivity to weight errors is presented. Multi-Layer Perceptrons (MLPs) trained with this algorithm are more tolerant to weight deviations than those trained with the classical algorithm, while their other performance figures remain similar. The algorithm is therefore useful for MLPs that are to be mapped onto physical implementations affected by weight imprecision.
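
The derivation of the modified cost function is not included in this preview. As a rough sketch of the general idea only, and not the authors' actual algorithm, the code below trains a small MLP with standard backpropagation while zero-mean noise is injected into the weights on every forward pass, one common way to make the learned weights less sensitive to deviations. The network size, noise level, learning rate and XOR data are all illustrative assumptions.

    # Minimal sketch (assumed, not the paper's exact method): backpropagation
    # for a one-hidden-layer MLP in which random perturbations are added to the
    # weights on each forward pass, so the minimized error already reflects
    # weight deviations.  All sizes and constants are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # Toy data: XOR (purely illustrative).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(0, 0.5, (2, 4)), np.zeros(4)
    W2, b2 = rng.normal(0, 0.5, (4, 1)), np.zeros(1)
    lr, sigma = 0.5, 0.05   # learning rate and weight-noise std (assumed)

    for _ in range(10000):
        # Perturb the weights before each forward pass.
        W1n = W1 + rng.normal(0, sigma, W1.shape)
        W2n = W2 + rng.normal(0, sigma, W2.shape)

        # Forward pass through the perturbed network.
        H = sigmoid(X @ W1n + b1)
        Y = sigmoid(H @ W2n + b2)

        # Standard backpropagation of the squared error.
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2n.T) * H * (1 - H)

        # Update the nominal (noise-free) weights.
        W2 -= lr * (H.T @ dY)
        b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH)
        b1 -= lr * dH.sum(axis=0)

    print("outputs after training:", Y.ravel().round(2))

Averaged over the injected perturbations, this kind of training penalizes solutions whose error grows quickly with weight deviations, which is the behaviour the abstract describes for the modified algorithm.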

Author information

J.L. Bernier, J. Ortega, A. Prieto

Editor information

José Mira, Roberto Moreno-Díaz, Joan Cabestany

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bernier, J.L., Ortega, J., Prieto, A. (1997). A modified backpropagation algorithm to tolerate weight errors. In: Mira, J., Moreno-Díaz, R., Cabestany, J. (eds) Biological and Artificial Computation: From Neuroscience to Technology. IWANN 1997. Lecture Notes in Computer Science, vol 1240. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0032535

  • DOI: https://doi.org/10.1007/BFb0032535

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63047-0

  • Online ISBN: 978-3-540-69074-0

  • eBook Packages: Springer Book Archive
