
Training multilayer perceptron by using optimal input normalization

Publisher: IEEE

Abstract:

In this paper, we propose a novel second-order training paradigm called optimal input normalization (OIN) to address the slow convergence and high complexity of multilayer perceptron (MLP) training. By optimizing the non-orthogonal transformation matrix of the input units in an equivalent network, OIN absorbs a separate optimal learning factor for each synaptic weight as well as for each hidden unit's threshold, improving MLP training performance. Moreover, by applying a whitening transformation to the negative Jacobian matrix of the hidden weights, a modified version of OIN, called optimal input normalization with hidden weights optimization (OIN-HWO), is also proposed. The Hessian matrices in both OIN and OIN-HWO are computed with the Gauss-Newton method, and all linear equations are solved via orthogonal least squares (OLS). Regression simulations on several real-life datasets show that the proposed OIN achieves not only a much better convergence rate and generalization ability than output weights optimization-back propagation (OWO-BP), optimal input gains (OIG), and even the Levenberg-Marquardt (LM) method, but also a lower computational time than OWO-BP. Although OIN-HWO incurs a slightly higher computational burden than OIN, it converges faster than OIN and often approaches or rivals LM. OIN-based algorithms are therefore potentially very good choices for practical applications.
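The abstract mentions a whitening transformation applied to the negative Jacobian matrix of the hidden weights in OIN-HWO. As a rough illustration of what such a transformation does, the sketch below whitens the columns of a matrix via an eigendecomposition of its covariance; the function name, shapes, and the `eps` regularizer are assumptions for this example, not the paper's notation or exact procedure.

```python
import numpy as np

def whiten(J, eps=1e-12):
    """Return a whitened copy of J whose column covariance is ~ the identity.

    Illustrative sketch only: a ZCA-style whitening, not necessarily the
    exact transform used in OIN-HWO.
    """
    Jc = J - J.mean(axis=0)                   # center each column
    cov = Jc.T @ Jc / (Jc.shape[0] - 1)       # column covariance matrix
    evals, evecs = np.linalg.eigh(cov)        # symmetric eigendecomposition
    W = evecs @ np.diag(1.0 / np.sqrt(evals + eps)) @ evecs.T
    return Jc @ W

# Build a matrix with strongly correlated columns, then whiten it.
rng = np.random.default_rng(0)
J = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))
Jw = whiten(J)
cov_w = Jw.T @ Jw / (Jw.shape[0] - 1)
print(np.allclose(cov_w, np.eye(5), atol=1e-6))  # whitened covariance ~ I
```

After whitening, the columns are decorrelated and unit-variance, which conditions the linear systems that a second-order method must solve.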
Date of Conference: 27-30 June 2011
Date Added to IEEE Xplore: 01 September 2011
Conference Location: Taipei, Taiwan
