
Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems

Information Theoretic Learning

Part of the book series: Information Science and Statistics (ISS)

Abstract

This chapter develops several batch and online learning algorithms for the error entropy criterion (EEC) that are counterparts to the most widely used algorithms for the mean square error (MSE) criterion. The chapter assumes familiarity with adaptive filter design, so readers new to the topic may wish to consult a textbook such as [332] or [253] for a review of fundamentals; the treatment, however, does not require in-depth knowledge of the field. The case studies address only the adaptation of linear systems, not because entropic costs are particularly useful for the linear model, but because the solutions for linear systems are well understood and performance comparisons are easily drawn. The chapter also considers fast evaluation of the information potential (IP) using the fast Gauss transform and the incomplete Cholesky decomposition, and it ends with an application of the error correntropy criterion (ECC) to adaptive noise cancellation.
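As a concrete illustration (not taken from the chapter itself), the two cost functions named in the abstract can be sketched for a linear filter in a few lines of NumPy: a batch gradient-ascent step on the quadratic information potential, whose maximization minimizes Rényi's quadratic error entropy (the EEC counterpart of steepest descent on MSE), and a stochastic update under the error correntropy criterion (the counterpart of LMS). Function names, step sizes, and the kernel width are illustrative assumptions, not the chapter's notation.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    # Gaussian kernel used in the Parzen estimate of the error density
    return np.exp(-x ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def mee_batch_step(w, X, d, sigma=1.0, eta=0.1):
    """One batch gradient-ascent step on the quadratic information potential
    V(e) = (1/N^2) * sum_ij G_sigma(e_i - e_j).  Maximizing V minimizes
    Renyi's quadratic error entropy.  X: (N, M) inputs, d: (N,) targets."""
    N = len(d)
    e = d - X @ w                            # errors under the current weights
    de = e[:, None] - e[None, :]             # pairwise differences e_i - e_j
    dx = X[:, None, :] - X[None, :, :]       # pairwise differences x_i - x_j
    # dV/dw = (1/(N^2 sigma^2)) * sum_ij G_sigma(e_i - e_j) (e_i - e_j) (x_i - x_j)
    g = gaussian_kernel(de, sigma) * de
    grad = (g[..., None] * dx).sum(axis=(0, 1)) / (N ** 2 * sigma ** 2)
    return w + eta * grad                    # ascend the information potential

def ecc_step(w, x_n, d_n, sigma=1.0, eta=0.1):
    """One stochastic update under the error correntropy criterion:
    maximize E[G_sigma(e)].  The Gaussian factor attenuates the update for
    large (outlier) errors, unlike the plain LMS update."""
    e_n = d_n - x_n @ w
    return w + eta * np.exp(-e_n ** 2 / (2.0 * sigma ** 2)) * e_n * x_n / sigma ** 2
```

The pairwise O(N²) sums in `mee_batch_step` are exactly what the fast Gauss transform and the incomplete Cholesky decomposition mentioned above are meant to accelerate.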



Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Erdogmus, D., Han, S., Singh, A. (2010). Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_4

  • DOI: https://doi.org/10.1007/978-1-4419-1570-2_4

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-1569-6

  • Online ISBN: 978-1-4419-1570-2

  • eBook Packages: Computer Science, Computer Science (R0)
