Abstract
Existing architectures and learning algorithms for the Generalized Congruence Neural Network (GCNN) either suffer from practical shortcomings or lack a rigorous theoretical foundation. In this paper, a novel GCNN architecture (BPGCNN) is proposed, together with a new error back-propagation learning algorithm developed for it. Experimental results on several benchmark problems show that the proposed BPGCNN outperforms the standard sigmoidal BPNN and several improved BPNN variants in convergence speed and learning capability, and overcomes the drawbacks of other existing GCNNs.
This paper is supported by the Program for New Century Excellent Talents in University, the National Natural Science Foundation of China under Grant No. 60373111, the Science and Technology Research Program of the Municipal Education Committee of Chongqing of China under Grant No. 040505, and the Key Lab of Computer Network and Communication Technology of Chongqing of China.
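The abstract does not disclose the internals of the BPGCNN architecture or its learning rule. As orientation only, the sketch below implements the *baseline* the abstract compares against: a standard sigmoidal feed-forward network trained by plain error back-propagation (the delta rule), here on the XOR problem. The network size, learning rate, epoch count, and all function names are illustrative assumptions, not details from the paper.

```python
import copy
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w1, w2, x):
    """One forward pass through a 2-hidden-layer-unit sigmoidal net.
    w1[j] = [w_j0, w_j1, bias_j]; w2 = hidden-to-output weights + bias."""
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    y = sigmoid(sum(wj * hj for wj, hj in zip(w2, h)) + w2[-1])
    return h, y

def mse(w1, w2, samples):
    return sum((forward(w1, w2, x)[1] - t) ** 2 for x, t in samples)

def train(w1, w2, samples, lr=0.8, epochs=3000):
    """Online gradient descent with the classic error back-propagation rule."""
    w1, w2 = copy.deepcopy(w1), list(w2)
    for _ in range(epochs):
        for x, t in samples:
            h, y = forward(w1, w2, x)
            dy = (y - t) * y * (1 - y)              # output-unit error signal
            for j, hj in enumerate(h):
                dh = dy * w2[j] * hj * (1 - hj)     # error back-propagated to unit j
                w2[j] -= lr * dy * hj
                for i in range(2):
                    w1[j][i] -= lr * dh * x[i]
                w1[j][2] -= lr * dh                 # hidden bias
            w2[-1] -= lr * dy                       # output bias
    return w1, w2

random.seed(0)
hidden = 3
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]
w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

before = mse(w1, w2, xor)
w1t, w2t = train(w1, w2, xor)
after = mse(w1t, w2t, xor)
```

The slow convergence of exactly this kind of plain gradient-descent loop is what the accelerated BP variants cited by the paper (e.g. adaptive learning rates, momentum) target, and what the proposed BPGCNN is reported to improve upon.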
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, Y., Wang, G., Jin, F., Yan, T. (2005). A Novel Generalized Congruence Neural Networks. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_72
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4