Abstract
In this work, parametric measures of information (information matrices) are defined from the nonparametric information measures called (h, π)-divergences. Asymptotic distributions for these information matrices are obtained when the parameter is replaced by its maximum likelihood estimator. On the basis of these results, tests of hypotheses are constructed.
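The construction described above can be illustrated with a minimal sketch. The general (h, φ)-type divergence has the form D(P, Q) = h(Σ_i q_i φ(p_i / q_i)); the sketch below takes the Kullback-Leibler special case (h the identity, φ(t) = t log t) as an assumed instance, estimates the multinomial parameter by its MLE (the empirical frequencies), and forms the statistic 2n·D(p̂, p₀), which under the null is asymptotically chi-square with M−1 degrees of freedom for suitably normalized h and φ. Function names and the sample here are illustrative, not taken from the paper.

```python
import math
import random

def h_phi_divergence(p, q, phi, h=lambda x: x):
    """(h, phi)-divergence: h( sum_i q_i * phi(p_i / q_i) ).

    h is the identity by default; terms with q_i = 0 are skipped.
    """
    return h(sum(qi * phi(pi / qi) for pi, qi in zip(p, q) if qi > 0))

# Kullback-Leibler arises as the special case h(x) = x, phi(t) = t*log(t),
# with the standard convention phi(0) = 0.
def kl_phi(t):
    return t * math.log(t) if t > 0 else 0.0

def divergence_test_statistic(counts, p0):
    """2n * D(p_hat, p0), where p_hat is the multinomial MLE.

    Under H0: p = p0 this is asymptotically chi-square with M - 1
    degrees of freedom (for the KL case shown here).
    """
    n = sum(counts)
    p_hat = [c / n for c in counts]
    return 2 * n * h_phi_divergence(p_hat, p0, kl_phi)

# Illustrative usage: sample from the hypothesized distribution and test it.
random.seed(0)
p0 = [0.25, 0.25, 0.25, 0.25]
sample = random.choices(range(4), weights=p0, k=500)
counts = [sample.count(j) for j in range(4)]
stat = divergence_test_statistic(counts, p0)
# Reject H0 at level 0.05 if stat exceeds the chi-square(3) 0.95 quantile
# (about 7.815).
```

Other choices of h and φ recover, e.g., the Rényi and Havrda-Charvát families; only the two plugged-in functions change, not the test construction.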
The research in this paper was supported in part by DGICYT Grants No. PB93-0068 and No. PB93-0022. Their financial support is gratefully acknowledged.
Copyright information
© 1995 Springer-Verlag Berlin Heidelberg
Cite this paper
Morales, D., Pardo, L., Salicrú, M., Menéndez, M.L. (1995). Information matrices associated to (h, π)-divergence measures: Applications to testing hypotheses. In: Bouchon-Meunier, B., Yager, R.R., Zadeh, L.A. (eds) Advances in Intelligent Computing — IPMU '94. IPMU 1994. Lecture Notes in Computer Science, vol 945. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0035955
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-60116-6
Online ISBN: 978-3-540-49443-0