Abstract
This chapter develops several batch and online learning algorithms for the error entropy criterion (EEC) that are counterparts to the most widely used algorithms for the mean square error (MSE) criterion. The chapter assumes familiarity with adaptive filter design, so readers new to this topic may wish to consult a textbook such as [332] or [253] for the fundamentals, although an in-depth knowledge of the field is not required. The case studies address only the adaptation of linear systems, not because entropic costs are particularly useful for the linear model, but because the solutions for linear systems are well understood and performance comparisons are easily drawn. The chapter also considers fast evaluations of the information potential (IP) using the fast Gauss transform and the incomplete Cholesky decomposition, and it ends with an application of the error correntropy criterion (ECC) to adaptive noise cancellation.
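To illustrate how an EEC-based update differs from an MSE-based one such as LMS, the sketch below adapts a linear filter by stochastic gradient ascent on the Gaussian-kernel information potential of the errors in a sliding window, in the spirit of the minimum error entropy (MEE) algorithms the chapter develops. This is a minimal illustration, not the chapter's code: the plant, window length, kernel size, and step size are all assumptions chosen for a simple noise-free system-identification demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the chapter): unknown 4-tap FIR plant,
# white Gaussian input, noise-free desired signal.
w_true = np.array([0.5, -0.3, 0.2, 0.1])
n_taps, n_samples = 4, 2000
x = rng.standard_normal(n_samples)
d = np.convolve(x, w_true)[:n_samples]

def regressor(x, k, n_taps):
    """Tap-delay-line input vector [x_k, x_{k-1}, ...] (zero-padded past)."""
    xk = np.zeros(n_taps)
    for i in range(n_taps):
        if k - i >= 0:
            xk[i] = x[k - i]
    return xk

# MEE stochastic gradient: ascend the information potential
#   V(e) = (1/L^2) sum_{i,j} G_sigma(e_i - e_j)
# estimated over a sliding window of the L most recent errors.
w = np.zeros(n_taps)
L, sigma, mu = 20, 1.0, 0.1           # assumed window, kernel size, step size
for k in range(L, n_samples):
    X = np.stack([regressor(x, j, n_taps) for j in range(k - L + 1, k + 1)])
    e = d[k - L + 1:k + 1] - X @ w
    diff = e[:, None] - e[None, :]                  # pairwise e_i - e_j
    g = np.exp(-diff**2 / (2 * sigma**2))           # Gaussian kernel values
    # dV/dw = (1/L^2) sum_{i,j} [-(e_i-e_j)/sigma^2] G_sigma(e_i-e_j) (x_j - x_i)
    coef = -(diff / sigma**2) * g
    grad = (coef[:, :, None] * (X[None, :, :] - X[:, None, :])).sum(axis=(0, 1)) / L**2
    w += mu * grad                                   # gradient ascent on V

print("estimated taps:", np.round(w, 3))
```

Note that, unlike LMS, each update uses pairwise differences of errors within the window, so the cost is insensitive to the error mean; in this noise-free example the weights nonetheless converge to the plant taps because only the true weights make all windowed errors equal.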
References
Aczél J., Daróczy Z., On measures of information and their characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.
Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. on Inf. Theor., 22:372–375, 1976.
Erdogmus D., Principe J., Generalized information potential for adaptive systems training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.
Erdogmus D., Principe J., Kim S., Sanchez J., A recursive Renyi's entropy estimator, Proc. IEEE Workshop on Neural Networks for Signal Processing, Martigny, Switzerland, pp. 209–217, 2002.
Golub G., Van Loan C., Matrix Computations, 3rd ed., The Johns Hopkins University Press, Baltimore, MD, 1996.
Granas A., Dugundji J., Fixed Point Theory, Springer-Verlag, New York, 2003.
Han S., Rao S., Erdogmus D., Principe J., A minimum error entropy algorithm with self adjusting stepsize (MEE-SAS), Signal Process., 87:2733–2745, 2007.
Han S., Principe J., A Fixed-point Minimum Error Entropy Algorithm, in Proc. IEEE Int. Workshop on Machine Learning for Signal Processing, Maynooth, Ireland, 2006.
Han S., Rao S., Jeong K., Principe J., A normalized minimum error entropy stochastic algorithm, Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing, Toulouse, 2006.
Han S., A Family of Minimum Renyi’s Error Entropy Algorithm for Information Processing, Ph.D. dissertation, University of Florida, Summer, 2007.
Haykin S., Adaptive Filter Theory, 4th Edition, Prentice Hall, Englewood Cliffs, NJ, 2002.
Kushner H., Yin G., Stochastic approximation and recursive algorithms and applications, Application of Mathematics series, vol. 35, Springer, New York, 2003.
Lyapunov A., Stability of Motion, Academic Press, New York, 1966.
Principe J., Euliano N., Lefebvre C., Neural Systems: Fundamentals through Simulations, CD-ROM textbook, John Wiley, New York, 2000.
Rao Y., Erdogmus D., Rao G., Principe J., Fast error whitening algorithms for system identification and control with noisy data, Neurocomputing, 69:158–181, 2006.
Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, Proc. IEEE IJCNN 2009, Atlanta, 2009.
Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.
Copyright information
© 2010 Springer Science+Business Media, LLC
Cite this chapter
Erdogmus, D., Han, S., Singh, A. (2010). Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_4
Print ISBN: 978-1-4419-1569-6
Online ISBN: 978-1-4419-1570-2