Abstract
Quantization analysis of limited precision is widely used in the hardware realization of neural networks. Because most of the neural computation is required in the training phase, the effects of quantization are most significant there. We analyze backpropagation training and recall under limited precision on the high-order function neural network (HOFNN), point out the potential problems, and show the performance sensitivity to lower-bit quantization. We compare training performance with and without weight clipping, and derive the effects of the quantization error on backpropagation for both on-chip and off-chip training. Our experimental simulation results verify the presented theoretical analysis.
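The comparison of training with and without weight clipping can be illustrated with a small sketch. This is not the authors' simulation code; it is a minimal, assumed model of uniform fixed-point quantization in which clipping to a fixed range (`clip_range`, a hypothetical parameter) keeps the quantization step small, while the unclipped variant scales the step to the largest weight magnitude:

```python
import numpy as np

def quantize(w, bits, clip_range=None):
    """Uniform signed fixed-point quantization of a weight array.

    With clip_range set, weights are first clipped to [-clip_range, clip_range],
    so the step size is fixed; without it, the step is scaled to the largest
    weight magnitude, so a few outliers coarsen the grid for all weights.
    """
    limit = clip_range if clip_range is not None else float(np.max(np.abs(w)))
    step = limit / (2 ** (bits - 1))  # quantization step for a signed b-bit code
    # Round to the nearest code and clip the index into the signed b-bit range.
    idx = np.clip(np.round(w / step), -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return idx * step

# Compare quantization error with and without clipping for decreasing bit widths.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, 1000)
for bits in (16, 8, 4):
    mse_clip = np.mean((w - quantize(w, bits, clip_range=3.0)) ** 2)
    mse_free = np.mean((w - quantize(w, bits)) ** 2)
    print(f"{bits}-bit: MSE clipped = {mse_clip:.2e}, unclipped = {mse_free:.2e}")
```

At lower bit widths the quantization step approaches the size of a typical weight update, which is where the abstract's sensitivity concern for on-chip training arises.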
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Jiang, M., Gielen, G. (2004). Backpropagation Analysis of the Limited Precision on High-Order Function Neural Networks. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_52
DOI: https://doi.org/10.1007/978-3-540-28647-9_52
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22841-7
Online ISBN: 978-3-540-28647-9