Abstract:
This paper establishes the equivalence between two fundamental results: Stein's identity, originally proposed in the realm of statistical estimation, and de Bruijn's identity, first introduced in information theory. Two distinct extensions of de Bruijn's identity are presented as well. For arbitrary but fixed input and noise distributions, the first-order derivative of differential entropy is expressed as a function of the posterior mean, while the second-order derivative of differential entropy is expressed as a function of Fisher information. Several applications illustrate the utility of the proposed results.
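For context, the two identities discussed in the abstract take the following classical forms in the scalar Gaussian case (the paper's contribution is to relate and generalize them beyond these baseline assumptions):

```latex
% Stein's identity (Gaussian case): for $X \sim \mathcal{N}(\mu, \sigma^2)$ and
% any differentiable $g$ with $\mathbb{E}\,|g'(X)| < \infty$,
\mathbb{E}\bigl[\, g(X)\,(X - \mu) \,\bigr] \;=\; \sigma^{2}\, \mathbb{E}\bigl[\, g'(X) \,\bigr].

% De Bruijn's identity: for $Y_t = X + \sqrt{t}\, Z$ with $Z \sim \mathcal{N}(0,1)$
% independent of $X$, the differential entropy $h(\cdot)$ satisfies
\frac{d}{dt}\, h(Y_t) \;=\; \frac{1}{2}\, J(Y_t),
% where $J(\cdot)$ denotes the Fisher information of the density of $Y_t$.
```

The abstract's extensions replace the Gaussian perturbation in de Bruijn's identity with arbitrary noise distributions, yielding the stated posterior-mean and Fisher-information characterizations of the first and second entropy derivatives.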
Date of Conference: 01-06 July 2012
Date Added to IEEE Xplore: 27 August 2012