Abstract
This chapter formulates a new cost function for adaptive filtering based on Rényi's quadratic error entropy. The problem of estimating the linear system parameters \(\mathbf{w} = {[{w}_{0},\ldots, {w}_{M-1}]}^{\mathrm{T}}\) in the setting of Figure 3.1, where x(n) and z(n) are random variables, can be framed as model-based inference, because it relates measured data, uncertainty, and the functional description of the system and its parameters. The desired response z(n) can be thought of as being created by an unknown transformation of the input vector \(\mathbf{x}(n) = {[x(n),\ldots, x(n - M + 1)]}^{\mathrm{T}}\). Adaptive filtering theory [143, 284] addresses this problem using the MSE criterion applied to the error signal \(e(n) = z(n) - f(\mathbf{w},\mathbf{x}(n))\), which reduces to \(e(n) = z(n) - {\mathbf{w}}^{\mathrm{T}}\mathbf{x}(n)\) when the filter is a linear finite impulse response (FIR) filter.
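The error-entropy criterion described above can be approached by replacing the MSE cost with a Parzen-window estimate of Rényi's quadratic entropy of the error, equivalently by gradient ascent on the information potential \(V = (1/L^2)\sum_{i}\sum_{j}\kappa_\sigma(e_i - e_j)\). The sketch below is illustrative only, not the chapter's algorithm: the function names `mee_fir` and `gaussian_kernel`, the fixed kernel bandwidth `sigma`, the sliding window of `L` recent errors, and the step size `mu` are all assumptions chosen for a minimal demonstration.

```python
import numpy as np


def gaussian_kernel(u, sigma):
    """Gaussian kernel used in the Parzen estimate of the error density."""
    return np.exp(-u ** 2 / (2 * sigma ** 2))


def mee_fir(x, z, M, mu=0.1, sigma=1.0, L=20):
    """Adapt an M-tap FIR filter by gradient ascent on the information
    potential V = (1/L^2) sum_ij kernel(e_i - e_j), estimated over a
    sliding window of the L most recent errors. Maximizing V is
    equivalent to minimizing Renyi's quadratic error entropy -log V."""
    N = len(x)
    w = np.zeros(M)
    xp = np.concatenate([np.zeros(M - 1), x])    # zero-pad the delay line
    errs, vecs = [], []
    for n in range(N):
        xn = xp[n:n + M][::-1]                   # x(n) = [x(n),...,x(n-M+1)]^T
        e = z[n] - w @ xn                        # error with current weights
        errs.append(e)
        vecs.append(xn)
        if len(errs) > L:                        # keep only L recent samples
            errs.pop(0)
            vecs.pop(0)
        E, X = np.array(errs), np.array(vecs)
        d = E[:, None] - E[None, :]              # pairwise e_i - e_j
        k = gaussian_kernel(d, sigma)
        # dV/dw = (1/(L^2 sigma^2)) * sum_ij k_ij (e_i - e_j)(x_i - x_j)
        g = (k * d)[:, :, None] * (X[:, None, :] - X[None, :, :])
        w = w + mu * g.sum(axis=(0, 1)) / (len(E) ** 2 * sigma ** 2)
    return w
```

Note the contrast with LMS: the update couples pairs of errors in the window rather than a single instantaneous error, so each step costs O(L^2 M), and the entropy cost is blind to the error mean (any constant error is a minimizer), which in practice is removed by the bias of the model or a separate bias term.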
References
Aczél J., Daróczy Z., On measures of information and their characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.
Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. on Inf. Theor., 22:372–375, 1976.
Al-Naffouri T., Zerguine A., Bettayeb M., A unifying view of error nonlinearities in LMS adaptation, in Proc. ICASSP, vol. III, Seattle, pp. 1697–1700, May 1998.
Amari S., Nagoata H., Methods of information geometry, Mathematical Monographs, vol. 191, American Mathematical Society, Providence RI, 2000.
Chen B., Hu J., Pu L., Sun Z., Stochastic gradient algorithm under (h, ϕ)-entropy criterion, Circuits Syst. Signal Process., 26:941–960, 2007.
Douglas S., Meng H., Stochastic gradient adaptation under general error criteria, IEEE Trans. Signal Process., 42:1335–1351, 1994.
Edmonson W., Srinivasan K., Wang C., Principe J., A global least square algorithm for adaptive IIR filtering, IEEE Trans. Circuits Syst., 45(3):379–384, 1996.
Erdogmus D., Principe J.C., An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems, IEEE Trans. Signal Process., 50(7):1780–1786, 2002.
Erdogmus D., Principe J., Generalized information potential for adaptive systems training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.
Fox J., An R and S Companion to Applied Regression, Sage, London, 2002.
Hampel F. R., Ronchetti E. M., Rousseeuw P. J., Stahel W. A., Robust Statistics: The Approach Based on Influence Functions, Wiley, New York, 1985.
Hardle W., Applied Nonparametric Regression, Econometric Society Monographs vol 19, Cambridge University Press, New York, 1990.
Haykin S., Adaptive Filter Theory, 4th Edition, Prentice Hall, Englewood Cliffs, NJ, 2002.
Huber P. J., Robust estimation of a location parameter, Ann. Math. Statist., 35:73–101, 1964.
Jenssen R., Erdogmus D., Hild II K., Principe J., Eltoft T., Information cut for clustering using a gradient descent approach, Pattern Recogn., 40:796–806, 2006.
Liu W., Pokharel P., Principe J., Error entropy, correntropy and M-estimation, IEEE Int. Workshop on Machine Learning for Signal Processing, 2006.
Liu W., Pokharel P., Principe J., Correntropy: Properties and applications in non Gaussian signal processing, IEEE Trans. Sig. Proc., 55(11):5286–5298, 2007.
Middleton D., Statistical-physical models of electromagnetic interference, IEEE Trans. Electromagn. Compat., EMC-19(3):106–126, Aug. 1977.
Morejon R., An information theoretic approach to sonar automatic target recognition, Ph.D. dissertation, University of Florida, Spring 2003.
Pei S., Tseng C., Least mean p-power error criterion for adaptive FIR filter, IEEE J. Selected Areas Commun., 12(9):1540–1547, 1994.
Rubinstein R., Simulation and the Monte Carlo Method, John Wiley & Sons, New York, 1981.
Sayed A., Fundamentals of Adaptive Filters, John Wiley & Sons, New York, 2003.
Sidak Z., Sen P., Hajek J., Theory of Rank Tests, Academic Press, London, 1999.
Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, Proc. IEEE IJCNN 2009, Atlanta, 2009.
Styblinski M., Tang T., Experiments in nonconvex optimization: Stochastic approximation with function smoothing and simulated annealing, Neural Netw., 3: 467–483, 1990.
Tanrikulu O., Chambers J., Convergence and steady-state properties of the least-mean mixed norm (LMMN) adaptive algorithm, IEE Proc. Vision, Image Signal Process., 143:137–142, June 1996.
Walach E., Widrow B., The least mean fourth (LMF) adaptive algorithm and its family, IEEE Trans. Inf. Theor., IT-30(2):275–283, 1984.
Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.
© 2010 Springer Science+Business Media, LLC
Erdogmus, D., Liu, W. (2010). Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_3
Print ISBN: 978-1-4419-1569-6
Online ISBN: 978-1-4419-1570-2