Abstract
In this paper we consider the problem of sequential processing and present a sequential model based on the back-propagation algorithm. The model is intended for intrinsically sequential problems such as word recognition, speech recognition, and natural language understanding. It can be used to train a network to learn a sequence of input patterns, presented in either a fixed or a random order. Moreover, the model is open and partially associative, characterized as "recognizing while accumulating", which, we argue, is oriented toward the human cognitive process.
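The abstract above does not give the model's equations, but the idea of learning a sequence of patterns with back-propagation while accumulating state can be illustrated with a minimal sketch. The following is not the authors' model: it is a simple recurrent network (in the spirit of Jordan's and Elman's architectures) trained with one-step-truncated back-propagation to predict the next pattern in a fixed cyclic sequence; all sizes and learning-rate values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact model): a simple
# recurrent network learns a fixed sequence of one-hot patterns by
# predicting each next pattern from the current input plus an
# accumulated hidden state ("recognizing while accumulating").

rng = np.random.default_rng(0)

seq = np.eye(4)            # four one-hot patterns, presented in fixed order
n_in, n_hid = 4, 8         # assumed sizes

W_xh = rng.normal(0, 0.5, (n_in, n_hid))   # input  -> hidden
W_hh = rng.normal(0, 0.5, (n_hid, n_hid))  # hidden -> hidden (state feedback)
W_hy = rng.normal(0, 0.5, (n_hid, n_in))   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def epoch(lr=0.0):
    """One pass over the sequence; updates weights (one-step BPTT) when lr > 0."""
    global W_xh, W_hh, W_hy
    h = np.zeros(n_hid)
    loss = 0.0
    for t in range(len(seq)):
        x, target = seq[t], seq[(t + 1) % len(seq)]
        h_prev = h
        h = sigmoid(x @ W_xh + h_prev @ W_hh)   # accumulate context
        y = sigmoid(h @ W_hy)                   # predict next pattern
        loss += 0.5 * np.sum((y - target) ** 2)
        if lr > 0:
            dy = (y - target) * y * (1 - y)     # output-layer delta
            dh = (dy @ W_hy.T) * h * (1 - h)    # back-propagated hidden delta
            W_hy -= lr * np.outer(h, dy)
            W_xh -= lr * np.outer(x, dh)
            W_hh -= lr * np.outer(h_prev, dh)   # one-step truncation
    return loss

loss_before = epoch()
for _ in range(500):
    epoch(lr=0.5)
loss_after = epoch()
print(loss_before, loss_after)
```

After training, the per-epoch prediction error drops well below its initial value, showing that the hidden-state feedback lets plain back-propagation capture sequential order even though each weight update only sees one time step.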
This work is supported in part by the “863” National High Technology Program.
Hui, H., Liu, D. & Wang, Y. Sequential back-propagation. J. Comput. Sci. Technol. 9, 252–260 (1994). https://doi.org/10.1007/BF02939506