Abstract
This paper introduces a new concept of connection weight for the standard recurrent neural networks, namely the Elman and Jordan networks. The architecture of the modified networks is identical to that of the original recurrent networks. However, whereas each connection weight in the original networks is a single real number, each connection weight in the modified networks is multi-valued, taking a value that depends on the input data passing through that connection. The backpropagation learning algorithm is also modified to suit the proposed concept. The modified networks were benchmarked against the feedforward neural network and the original recurrent neural networks. Experimental results on twelve benchmark problems show that the modified networks clearly outperform the other three methods.
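The abstract describes the key idea (a multi-segment piecewise-linear connection weight whose value depends on the input) but not its exact formulation. The following is a minimal sketch under a simple assumed interpretation: the input range of a connection is divided into fixed segments, and each segment carries its own linear weight function, so the effective weight changes with the input value. The class name, segment boundaries, and per-segment linear form here are illustrative assumptions, not the authors' implementation, and the modified backpropagation training is omitted.

```python
import numpy as np

# Sketch only (not the authors' exact formulation): a connection whose weight
# is multi-segment piecewise linear in its input. The input range is split
# into fixed segments; each segment s has its own slope a[s] and intercept
# b[s], so the effective weight w(x) = a[s] * x + b[s] depends on which
# segment the input value x falls into. Segment count, boundaries, and the
# per-segment linear form are assumptions for illustration.

class PiecewiseLinearWeight:
    def __init__(self, boundaries, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.boundaries = np.asarray(boundaries)          # segment edges, e.g. [-0.5, 0.0, 0.5]
        n_segments = len(boundaries) + 1
        self.a = rng.normal(scale=0.1, size=n_segments)   # per-segment slope
        self.b = rng.normal(scale=0.1, size=n_segments)   # per-segment intercept

    def weight(self, x):
        s = np.searchsorted(self.boundaries, x)           # index of the segment containing x
        return self.a[s] * x + self.b[s]                  # multi-valued weight w(x)

    def forward(self, x):
        return self.weight(x) * x                         # weighted contribution to a unit's net input


# Example: the same connection carries a different effective weight for
# different input values.
w = PiecewiseLinearWeight(boundaries=[-0.5, 0.0, 0.5])
for x in (-0.8, 0.2, 0.9):
    print(f"x={x:+.1f}  effective weight={w.weight(x):+.4f}")
```

In an Elman or Jordan network, a connection of this kind would stand in for each scalar weight on the input and context links, with the modified backpropagation updating the per-segment parameters; those details are given in the paper itself, not in this sketch.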
About this article
Cite this article
Thammano, A., Ruxpakawong, P. Nonlinear dynamic system identification using recurrent neural network with multi-segment piecewise-linear connection weight. Memetic Comp. 2, 273–282 (2010). https://doi.org/10.1007/s12293-010-0042-7