Contributed article
Recurrent neural networks can be trained to be maximum a posteriori probability classifiers
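The title's claim rests on a well-known property: a network trained by gradient descent on a cross-entropy loss drives its output toward the posterior probability of the class given the input, so thresholding the output yields a maximum a posteriori decision. A minimal sketch of this effect for a recurrent network, using an entirely hypothetical toy task (two fixed input sequences with noisy labels, not an example from the paper), is:

```python
import numpy as np

# Toy setup (assumption, for illustration only): sequence seq_a belongs to
# class 1 with probability 0.9, seq_b with probability 0.2. An Elman-style
# RNN with a sigmoid output is trained on cross-entropy via backprop through
# time; its outputs should approach the label frequencies, i.e. the
# posteriors P(class 1 | sequence).
rng = np.random.default_rng(0)

seq_a = np.array([1.0, 0.0, 1.0])
seq_b = np.array([0.0, 1.0, 0.0])
p_true = {0: 0.9, 1: 0.2}

# Sample noisy labels; cross-entropy training can at best recover the
# empirical label frequency for each distinct input sequence.
p_emp = {k: rng.binomial(1, p, 4000).mean() for k, p in p_true.items()}

H = 4  # hidden units
Wxh = rng.normal(0, 0.5, (H, 1))
Whh = rng.normal(0, 0.5, (H, H))
bh = np.zeros(H)
v = rng.normal(0, 0.5, H)
c = 0.0

def forward(x):
    """Run the RNN over sequence x; return hidden states and sigmoid output."""
    hs = [np.zeros(H)]
    for t in range(len(x)):
        hs.append(np.tanh(Wxh[:, 0] * x[t] + Whh @ hs[-1] + bh))
    o = 1.0 / (1.0 + np.exp(-(v @ hs[-1] + c)))
    return hs, o

lr = 0.3
for step in range(3000):
    gWxh = np.zeros_like(Wxh); gWhh = np.zeros_like(Whh)
    gbh = np.zeros_like(bh); gv = np.zeros_like(v); gc = 0.0
    for k, x in enumerate((seq_a, seq_b)):
        hs, o = forward(x)
        d = o - p_emp[k]          # cross-entropy gradient w.r.t. pre-sigmoid
        gv += d * hs[-1]; gc += d
        dh = d * v
        for t in range(len(x), 0, -1):   # backprop through time
            dpre = dh * (1.0 - hs[t] ** 2)
            gWxh[:, 0] += dpre * x[t - 1]
            gWhh += np.outer(dpre, hs[t - 1])
            gbh += dpre
            dh = Whh.T @ dpre
    Wxh -= lr * gWxh; Whh -= lr * gWhh; bh -= lr * gbh
    v -= lr * gv; c -= lr * gc

out_a = forward(seq_a)[1]  # should settle near 0.9
out_b = forward(seq_b)[1]  # should settle near 0.2
```

The trained outputs approximate the class-1 posteriors rather than collapsing to 0 or 1, so comparing each output against 0.5 implements the MAP classification rule the title refers to.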
Cited by (17)
- A new boosting algorithm for improved time-series forecasting with recurrent neural networks. Information Fusion, 2008.
- Hybrid NN/HMM acoustic modeling techniques for distributed speech recognition. Speech Communication, 2006.
- Approaches to fractional land cover and continuous field mapping: A comparative assessment over the BOREAS study region. Remote Sensing of Environment, 2004.
- Bayesian learning for recurrent neural networks. Neurocomputing, 2001.
- A view-based neurocomputational system for relational map-making and navigation in visual environments. Robotics and Autonomous Systems, 1995.
Copyright © 1994 Published by Elsevier Ltd.