Neural Networks

Volume 6, Issue 7, 1993, Pages 1019-1022

Consistency of multilayer perceptron regression estimators

https://doi.org/10.1016/S0893-6080(09)80011-7

Abstract

In this paper a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This is a commonly used architecture for regression problems, where one seeks a perceptron minimizing the mean squared error criterion for the data points (x_k, y_k), k = 1, …, N. It is shown that under the model y_k = g_0(x_k) + ε_k, k = 1, …, N, where x_k is independent of the zero-mean error term ε_k, this procedure is consistent as N → ∞, provided that g_0 can be represented as a three-layer perceptron with the Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at ±∞ and the hidden-to-output weights in the considered family of perceptrons are bounded.
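As an illustration of the estimator the abstract describes, the sketch below fits a three-layer perceptron (one hidden layer, one output neuron) by gradient descent on the mean squared error criterion, for synthetic data generated from the model y_k = g_0(x_k) + ε_k. The hidden width, learning rate, weight bound, the particular g_0, and all variable names are illustrative assumptions, not taken from the paper; clipping the hidden-to-output weights mirrors the boundedness condition in the second consistency result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from the model y_k = g_0(x_k) + eps_k, with x_k
# independent of the zero-mean error eps_k. This g_0 is an assumption
# chosen only for illustration.
N = 2000
x = rng.uniform(-3.0, 3.0, size=(N, 1))
g0 = lambda z: np.tanh(2.0 * z) - np.tanh(z - 1.0)   # "true" regression function
y = g0(x) + 0.1 * rng.standard_normal((N, 1))        # zero-mean noise

H = 8      # number of hidden units (assumption)
B = 10.0   # bound on hidden-to-output weights (theorem's condition)
lr = 0.05  # learning rate (assumption)

# Parameters of the three-layer perceptron: input-to-hidden weights and
# biases, hidden-to-output weights and bias.
W1 = rng.normal(scale=1.0, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 1))
b2 = np.zeros(1)

# Bounded continuous transfer function with limits 0 and 1 at -inf/+inf.
sigma = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass.
    h = sigma(x @ W1 + b1)   # (N, H) hidden activations
    pred = h @ W2 + b2       # (N, 1) network output
    err = pred - y

    # Gradient of the mean squared error criterion.
    d_pred = 2.0 * err / N
    gW2 = h.T @ d_pred
    gb2 = d_pred.sum(axis=0)
    d_h = (d_pred @ W2.T) * h * (1.0 - h)   # logistic derivative is h(1-h)
    gW1 = x.T @ d_h
    gb1 = d_h.sum(axis=0)

    # Gradient step; keep hidden-to-output weights in a bounded set.
    W1 -= lr * gW1
    b1 -= lr * gb1
    W2 = np.clip(W2 - lr * gW2, -B, B)
    b2 -= lr * gb2

h = sigma(x @ W1 + b1)
mse = float(np.mean((h @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

As N grows, the consistency result says the minimizer of this criterion converges to g_0 (under the stated representability and boundedness conditions); the sketch above only performs the finite-sample minimization step.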

* The work was partly done while the author was a Visiting Researcher with the SANS group at the Department of Numerical Analysis and Computing Science, Royal Institute of Technology, Stockholm, Sweden.
