
A New Correntropy-Based Conjugate Gradient Backpropagation Algorithm for Improving Training in Neural Networks


Abstract:

Mean square error (MSE) is the most prominent criterion in training neural networks and has been employed in numerous learning problems. In this paper, we suggest a group of novel robust information-theoretic backpropagation (BP) methods, termed correntropy-based conjugate gradient BP (CCG-BP). CCG-BP algorithms converge faster than the common correntropy-based BP algorithms and perform better than the common MSE-based CG-BP algorithms, especially in non-Gaussian environments and in cases with impulsive noise or heavy-tailed noise distributions. In addition, a convergence analysis of this new class of methods is provided. Numerical results for several examples of function approximation, synthetic function estimation, and chaotic time series prediction illustrate that the new BP methods are more robust than the MSE-based methods in the presence of impulsive noise, especially when the SNR is low.
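
The abstract's core idea is to replace the MSE criterion with a correntropy criterion and to update the weights with a conjugate gradient rule rather than plain gradient descent. The following is a minimal sketch of that combination, not the authors' algorithm: the kernel width sigma, the one-hidden-layer network, the Polak-Ribiere (PR+) direction update, the backtracking line search, and the synthetic impulsive-noise data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = sin(x) corrupted by occasional impulsive outliers.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
outliers = rng.random(y.shape) < 0.05
y = y + outliers * rng.normal(0, 5, size=y.shape)

H = 10  # hidden units (illustrative choice)

def unpack(w):
    W1 = w[:H].reshape(1, H)
    b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H].reshape(H, 1)
    b2 = w[3 * H]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (N, H)
    return (h @ W2).ravel() + b2, h   # network outputs, shape (N,)

sigma = 1.0  # Gaussian kernel width of the correntropy criterion (assumed)

def neg_correntropy_and_grad(w):
    """Negative empirical correntropy, -1/N * sum_k exp(-e_k^2 / (2 sigma^2)),
    and its gradient w.r.t. the flattened weights (hand-derived backprop)."""
    yhat, h = forward(w, X)
    e = y - yhat
    k = np.exp(-e ** 2 / (2 * sigma ** 2))
    loss = -k.mean()
    # d loss / d yhat_k = -(1/N) * k_k * e_k / sigma^2
    dL_dyhat = -(1.0 / len(e)) * k * e / sigma ** 2
    W1, b1, W2, b2 = unpack(w)
    gW2 = h.T @ dL_dyhat                                 # (H,)
    gb2 = dL_dyhat.sum()
    dh = np.outer(dL_dyhat, W2.ravel()) * (1 - h ** 2)   # (N, H), through tanh
    gW1 = X.T @ dh                                       # (1, H)
    gb1 = dh.sum(axis=0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2, [gb2]])

# Polak-Ribiere (PR+) conjugate gradient loop with a simple backtracking line
# search, a crude stand-in for the line searches used in CG-BP methods.
w = rng.normal(0, 0.5, size=3 * H + 1)
loss, g = neg_correntropy_and_grad(w)
d = -g
for it in range(200):
    step = 1.0
    while step > 1e-8:
        trial_loss, _ = neg_correntropy_and_grad(w + step * d)
        if trial_loss < loss:
            break
        step *= 0.5
    w = w + step * d
    loss, new_g = neg_correntropy_and_grad(w)
    beta = max(0.0, new_g @ (new_g - g) / (g @ g + 1e-12))  # PR+ formula
    d = -new_g + beta * d
    g = new_g

print(f"final negative correntropy: {loss:.4f}")
```

Because the Gaussian kernel down-weights samples with large errors, the impulsive outliers contribute almost nothing to the gradient, which is the robustness property the abstract attributes to correntropy-based training.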
Published in: IEEE Transactions on Neural Networks and Learning Systems ( Volume: 29, Issue: 12, December 2018)
Page(s): 6252 - 6263
Date of Publication: 10 May 2018

PubMed ID: 29993752
