Abstract:
Recently, there has been much interest in information theoretic learning (ITL) criteria, which are widely used in several applications with different robust algorithms. To address the convergence rate of information theoretic robust approaches, we introduce a fast novel correntropy-based algorithm, the correntropy-based Levenberg-Marquardt (CLM) algorithm, and apply it to nonlinear learning problems. This new method converges significantly faster than the common gradient-descent maximum-correntropy method and is robust against heavy-tailed noise distributions and outliers. In addition, as our numerical results illustrate, while this novel method matches the convergence rate of the fastest previously presented correntropy-based method for linear models, the fixed-point maximum correntropy criterion (FP-MCC), it is also valid for both linear and nonlinear learning models.
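The abstract does not give the algorithm's details, but the general idea of combining the maximum correntropy criterion with a Levenberg-Marquardt update can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the model `f(x; a, b) = a * exp(-b * x)`, the kernel width `sigma`, and the damping schedule are all assumptions. Each residual is weighted by a Gaussian kernel, so large (outlier) residuals receive near-zero weight, and the damped normal equations are solved as in standard Levenberg-Marquardt.

```python
import numpy as np

def clm_fit(x, y, theta0, sigma=1.0, lam=1e-2, iters=50):
    """Hypothetical correntropy-weighted Levenberg-Marquardt sketch for
    the illustrative model f(x; a, b) = a * exp(-b * x)."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        a, b = theta
        f = a * np.exp(-b * x)
        e = y - f                                   # residuals
        # Correntropy weights: Gaussian kernel of the residuals.
        # Outliers (large |e|) get weights close to zero.
        w = np.exp(-e**2 / (2 * sigma**2))
        # Analytic Jacobian of the model w.r.t. (a, b).
        J = np.column_stack([np.exp(-b * x), -a * x * np.exp(-b * x)])
        JW = J * w[:, None]
        # Damped, correntropy-weighted normal equations.
        A = JW.T @ J + lam * np.eye(2)
        g = JW.T @ e
        candidate = theta + np.linalg.solve(A, g)
        # Accept the step only if the correntropy objective
        # (sum of kernel values) improves; adapt the damping.
        a2, b2 = candidate
        e2 = y - a2 * np.exp(-b2 * x)
        if np.sum(np.exp(-e2**2 / (2 * sigma**2))) >= np.sum(w):
            theta, lam = candidate, max(lam / 10, 1e-10)
        else:
            lam *= 10
    return theta

# Demo on synthetic data contaminated with heavy outliers.
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 200)
y = 2.0 * np.exp(-0.7 * x) + 0.01 * rng.standard_normal(200)
y[::20] += 5.0                                      # gross outliers
a_hat, b_hat = clm_fit(x, y, [1.0, 0.1], sigma=1.0)
```

Under these assumptions the fit recovers parameters close to (2.0, 0.7) despite the contamination, since the Gaussian kernel suppresses the outliers' influence; an ordinary least-squares Levenberg-Marquardt fit would be pulled toward them.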
Date of Conference: 03-04 May 2016
Date Added to IEEE Xplore: 16 June 2016
Electronic ISBN: 978-1-5090-1922-9