Abstract:
This paper presents the Bayes Fisher information measures, defined by the expected Fisher information under a distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, the Shannon entropy, and the Jensen-Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback-Leibler, Jeffreys, Jensen-Shannon, Rényi, and Tsallis divergences. These measures indicate that the farther apart the two components are, the more informative the data are about the mixing parameter. We also unify three different relative entropy derivations of the geometric mixture that are scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of the Tsallis divergence yields the generalized mixture as the solution.
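As a small numerical sketch (not from the paper) of the first relation stated above: for the arithmetic mixture f_λ = λ f₁ + (1−λ) f₂, the score with respect to λ is (f₁ − f₂)/f_λ, so the Fisher information about the mixing parameter is I(λ) = ∫ (f₁ − f₂)²/f_λ dx, which at λ = 0 equals the Pearson chi-square divergence χ²(f₁‖f₂). The Gaussian components, grid, and names below are illustrative assumptions, not the paper's setup.

```python
# Illustrative sketch: Fisher information of an arithmetic mixture about the
# mixing parameter, and its link to the chi-square divergence at lam = 0.
# The components f1, f2 (unit-variance Gaussians) are assumed for illustration.
import numpy as np
from scipy.stats import norm

x = np.linspace(-10.0, 10.0, 200001)      # integration grid
f1 = norm.pdf(x, loc=0.0, scale=1.0)      # first mixture component
f2 = norm.pdf(x, loc=2.0, scale=1.0)      # second mixture component

def fisher_info(lam):
    """I(lam) = integral of (f1 - f2)^2 / f_lam, since d/dlam f_lam = f1 - f2."""
    f_lam = lam * f1 + (1.0 - lam) * f2
    return np.trapz((f1 - f2) ** 2 / f_lam, x)

# Pearson chi-square divergence chi2(f1 || f2) = integral of (f1 - f2)^2 / f2.
chi2 = np.trapz((f1 - f2) ** 2 / f2, x)

print(fisher_info(0.0), chi2)   # agree: I(0) = chi2(f1 || f2)
print(fisher_info(0.5))         # information at an equal-weight mixture
# Moving loc=2.0 farther from 0.0 increases both quantities, consistent with
# the abstract's claim that more separated components are more informative.
```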
Published in: IEEE Transactions on Information Theory (Volume: 65, Issue: 4, April 2019)