Information matrices associated to (h, π)-divergence measures: Applications to testing hypotheses

  • Probabilistic, Statistical and Informational Methods
  • Conference paper
Advances in Intelligent Computing — IPMU '94 (IPMU 1994)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 945)

Abstract

In this work, parametric measures of information (information matrices) are defined from the nonparametric information measures called (h, π)-divergences. The asymptotic distributions of these information matrices are obtained when the parameter is replaced by its maximum likelihood estimator. On the basis of these results, tests of hypotheses are constructed.
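As a hedged illustration (not code from the paper): the (h, π)-divergences referred to in the abstract are Csiszár-type measures of the form h(Σᵢ qᵢ π(pᵢ/qᵢ)) for a convex generator π with π(1) = 0. The sketch below, with illustrative function names (the generator is written `phi` in code), computes the Kullback–Leibler special case for a Bernoulli family and checks numerically that for nearby parameter values the divergence behaves like ½·I(θ)·δ² in the Fisher information I(θ), which is the local link that allows information matrices to be defined from divergence measures.

```python
import math

def phi_kl(t):
    # Convex generator phi(t) = t*log(t) - t + 1 (phi(1) = 0); with the
    # identity h, this recovers the Kullback-Leibler divergence.
    return t * math.log(t) - t + 1

def h_phi_divergence(p, q, h=lambda x: x, phi=phi_kl):
    # (h, phi)-divergence of discrete distributions p, q (all q_i > 0):
    # D(p, q) = h( sum_i q_i * phi(p_i / q_i) )
    return h(sum(qi * phi(pi / qi) for pi, qi in zip(p, q)))

def bernoulli(theta):
    return [theta, 1.0 - theta]

# Near theta, the divergence between members of a parametric family is a
# quadratic form in the Fisher information, to second order in delta.
theta, delta = 0.3, 1e-3
d = h_phi_divergence(bernoulli(theta + delta), bernoulli(theta))
fisher = 1.0 / (theta * (1.0 - theta))   # Fisher information of Bernoulli(theta)
quadratic = 0.5 * fisher * delta ** 2    # second-order Taylor approximation
print(d, quadratic)                      # the two agree to leading order
```

Other choices of h and π in this family recover, for example, Rényi- or Havrda–Charvát-type divergences, which is what makes the construction of information matrices from this family a genuine generalization of the Fisher information.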

The research in this paper was supported in part by DGICYT Grants No. PB93-0068 and No. PB93-0022. Their financial support is gratefully acknowledged.



Editor information

Bernadette Bouchon-Meunier, Ronald R. Yager, Lotfi A. Zadeh

Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Morales, D., Pardo, L., Salicrú, M., Menéndez, M.L. (1995). Information matrices associated to (h, π)-divergence measures: Applications to testing hypotheses. In: Bouchon-Meunier, B., Yager, R.R., Zadeh, L.A. (eds) Advances in Intelligent Computing — IPMU '94. IPMU 1994. Lecture Notes in Computer Science, vol 945. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0035955

  • DOI: https://doi.org/10.1007/BFb0035955

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-60116-6

  • Online ISBN: 978-3-540-49443-0
