Abstract
Practical Recurrent Learning (PRL) has been proposed as a simple learning algorithm for recurrent neural networks [1][2]. The algorithm learns with a practical O(n²) memory capacity and computational cost, which cannot be achieved by the conventional Back Propagation Through Time (BPTT) or Real-Time Recurrent Learning (RTRL). Previous work [1] showed that the 3-bit parity problem could be learned successfully by PRL, but the learning performance was far inferior to that of BPTT. In this paper, a simple calculation is introduced to prevent monotonous oscillations from being biased toward the saturation range of the sigmoid function during learning. It is shown that this improves the learning performance of the PRL method on the 3-bit parity problem. Finally, the improved PRL is applied to a scanned digit pattern classification task, where its results are inferior to, but comparable with, those of conventional BPTT.
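The abstract's motivation rests on a general property of gradient-based learning: once a unit's net input drifts deep into the saturation range of the sigmoid, the derivative that scales its weight updates becomes nearly zero and learning stalls. The sketch below is not the paper's specific calculation, only a minimal numerical illustration of this saturation effect.

```python
import math

def sigmoid(x: float) -> float:
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(x: float) -> float:
    """Derivative of the sigmoid: s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Near x = 0 the derivative is at its maximum (0.25); deep in the
# saturation range it is almost zero, so error signals propagated
# through a saturated unit almost vanish.
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x={x:5.1f}  sigmoid={sigmoid(x):.4f}  deriv={sigmoid_deriv(x):.6f}")
```

This is why keeping the oscillating internal values out of the saturation range, as the paper proposes, can restore a usable gradient during learning.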
References
Samsudin, M.F., Hirose, T., Shibata, K.: Practical Recurrent Learning (PRL) in the Discrete Time Domain. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds.) ICONIP 2007, Part I. LNCS, vol. 4984, pp. 228–237. Springer, Heidelberg (2008)
Shibata, K., Okabe, Y., Ito, K.: Simple Learning Algorithm for Recurrent Networks to Realize Short-Term Memories. In: Proc. of Int. Joint Conf. on Neural Networks (IJCNN 1998), pp. 2367–2372 (1998)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning Internal Representations by Error Propagation. In: Parallel Distributed Processing, vol. 1, pp. 318–362. MIT Press, Cambridge (1986)
Williams, R.J., Zipser, D.: A Learning Algorithm for Continually Running Fully Recurrent Neural Networks. Neural Computation 1, 270–280 (1989)
Song, Q., Wu, Y., Soh, Y.C.: Robust Adaptive Gradient-Descent Training Algorithm for Recurrent Neural Networks in Discrete Time Domain. IEEE Transactions on Neural Networks 19(11), 1841–1853 (2008)
Hochreiter, S., Schmidhuber, J.: Long Short-Term Memory. Neural Computation 9, 1735–1780 (1997)
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Samsudin, M.F.b., Shibata, K. (2009). Improvement of Practical Recurrent Learning Method and Application to a Pattern Classification Task. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03040-6_77
Print ISBN: 978-3-642-03039-0
Online ISBN: 978-3-642-03040-6