Abstract
Similarity is a key concept for quantifying temporal signals or static measurements. Although similarity is difficult to define mathematically, one rarely dwells on this difficulty and naturally translates similarity into correlation. This is one more example of how ingrained second-order moment descriptors of the probability density function are in scientific thinking. Successful engineering or pattern recognition solutions built on these methodologies rely heavily on the Gaussianity and linearity assumptions, for exactly the same reasons discussed in Chapter 3.
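The contrast the abstract draws can be sketched numerically: correlation captures only second-order statistics, whereas correntropy, the subject of this chapter, applies a Gaussian kernel to the difference of the two variables and is therefore far less sensitive to impulsive outliers. The snippet below is a minimal illustration, not the chapter's implementation; the function names, the kernel width `sigma`, and the heavy-tailed test signal are assumptions for the example.

```python
import numpy as np

def correlation(x, y):
    """Classical second-order similarity: Pearson correlation coefficient."""
    return np.corrcoef(x, y)[0, 1]

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy with a Gaussian kernel:
    V(X, Y) ~ (1/N) * sum_i exp(-(x_i - y_i)^2 / (2 * sigma^2))."""
    d = np.asarray(x) - np.asarray(y)
    return np.mean(np.exp(-d**2 / (2.0 * sigma**2)))

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
# Heavy-tailed (impulsive) additive noise, where Gaussianity assumptions break down
y = x + 0.1 * rng.standard_t(df=1.5, size=1000)

print(correlation(x, y))   # second-order moment, dragged around by outliers
print(correntropy(x, y))   # kernel-based, bounded contribution per sample
```

Because each sample's contribution to correntropy is bounded by the kernel, a single large outlier cannot dominate the estimate the way it dominates a sample correlation.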
© 2010 Springer Science+Business Media, LLC
Liu, W., Pokharel, P., Xu, J., Seth, S. (2010). Correntropy for Random Variables: Properties and Applications in Statistical Inference. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_10
Print ISBN: 978-1-4419-1569-6
Online ISBN: 978-1-4419-1570-2