Abstract
Our emphasis on the linear model in Chapter 4 was motivated solely by simplicity and pedagogy. As the simple case studies demonstrated, under linearity and Gaussianity conditions the final solution of the MEE algorithms is essentially equivalent to the solution obtained with LMS. Because the LMS algorithm is computationally simpler and better understood, there is no real advantage to using MEE in such cases.
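This equivalence can be checked numerically. The sketch below (not the book's implementation; data, step sizes, and kernel width are assumptions) fits the same linear-Gaussian identification problem with stochastic-gradient LMS and with a batch MEE rule that ascends the quadratic information potential V = (1/N²) Σᵢⱼ G_σ(eᵢ − eⱼ), whose maximization minimizes Renyi's quadratic error entropy; both land near the true weights.

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5])

# Synthetic linear system with Gaussian noise (illustrative data).
N = 400
X = rng.standard_normal((N, 2))                  # zero-mean inputs
d = X @ w_true + 0.05 * rng.standard_normal(N)   # desired signal

# --- LMS: stochastic gradient descent on the MSE, w <- w + mu * e_n * x_n ---
w_lms = np.zeros(2)
mu = 0.05
for n in range(N):
    e_n = d[n] - X[n] @ w_lms
    w_lms += mu * e_n * X[n]

# --- MEE: gradient ascent on the information potential
#     V(w) = (1/N^2) sum_ij G_sigma(e_i - e_j), Gaussian kernel G_sigma ---
w_mee = np.zeros(2)
sigma, eta = 1.0, 0.2
for _ in range(300):
    e = d - X @ w_mee
    de = e[:, None] - e[None, :]                 # pairwise error differences
    G = np.exp(-de**2 / (2 * sigma**2))          # kernel values G_sigma(de)
    Gp = -de / sigma**2 * G                      # kernel derivative G'(de)
    # dV/dw = (1/N^2) sum_ij G'(e_i - e_j) (x_j - x_i)
    grad = X.T @ (Gp.sum(axis=0) - Gp.sum(axis=1)) / N**2
    w_mee += eta * grad

print(w_lms, w_mee)   # both close to w_true = [1.0, -0.5]
```

Note that the information potential is invariant to a constant shift of the errors, so MEE pins down the weights here only because the model carries no bias term; with a bias, the error mean would have to be fixed separately.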
© 2010 Springer Science+Business Media, LLC
Erdogmus, D., Morejon, R., Liu, W. (2010). Nonlinear Adaptive Filtering with MEE, MCC, and Applications. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_5
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-1569-6
Online ISBN: 978-1-4419-1570-2