
Higher Order Derivatives in Costa’s Entropy Power Inequality



Abstract:

Let $X$ be an arbitrary continuous random variable and $Z$ be an independent Gaussian random variable with zero mean and unit variance. For $t > 0$, Costa proved that $e^{2h(X+\sqrt{t}Z)}$ is concave in $t$, where the proof hinged on the first and second order derivatives of $h(X+\sqrt{t}Z)$. In particular, these two derivatives are signed, i.e., $\frac{\partial}{\partial t}h(X+\sqrt{t}Z) \ge 0$ and $\frac{\partial^{2}}{\partial t^{2}}h(X+\sqrt{t}Z) \le 0$. In this paper, we show that the third order derivative of $h(X+\sqrt{t}Z)$ is nonnegative, which implies that the Fisher information $J(X+\sqrt{t}Z)$ is convex in $t$. We further show that the fourth order derivative of $h(X+\sqrt{t}Z)$ is nonpositive. Following the first four derivatives, we make two conjectures on $h(X+\sqrt{t}Z)$: the first is that $\frac{\partial^{n}}{\partial t^{n}}h(X+\sqrt{t}Z)$ is nonnegative in $t$ if $n$ is odd, and nonpositive otherwise; the second is that $\log J(X+\sqrt{t}Z)$ is convex in $t$. The first conjecture can be rephrased in the context of completely monotone functions: $J(X+\sqrt{t}Z)$ is completely monotone in $t$. The history of the first conjecture may date back to a problem in mathematical physics studied by McKean in 1966. Apart from these results, we provide a geometric interpretation of the covariance-preserving transformation and study the concavity of $h(\sqrt{t}\,X+\sqrt{1-t}\,Z)$, revealing its connection with Costa's entropy power inequality.
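As a reading aid (not part of the paper's abstract), the sign pattern above can be restated via de Bruijn's identity, the standard fact that $\frac{\partial}{\partial t}h(X+\sqrt{t}Z) = \tfrac{1}{2}J(X+\sqrt{t}Z)$; the Gaussian example at the end is an illustrative sanity check assumed here, not a result of the paper.

\[
\frac{\partial}{\partial t}\,h(X+\sqrt{t}\,Z) = \tfrac{1}{2}\,J(X+\sqrt{t}\,Z) \ge 0,
\qquad
\frac{\partial^{2}}{\partial t^{2}}\,h(X+\sqrt{t}\,Z) = \tfrac{1}{2}\,\frac{\partial}{\partial t}\,J(X+\sqrt{t}\,Z) \le 0,
\]
so the first conjecture, $(-1)^{n+1}\frac{\partial^{n}}{\partial t^{n}}\,h(X+\sqrt{t}\,Z) \ge 0$ for all $n \ge 1$, is equivalent to
\[
(-1)^{m}\,\frac{\partial^{m}}{\partial t^{m}}\,J(X+\sqrt{t}\,Z) \ge 0 \quad \text{for all } m \ge 0,
\]
which is exactly complete monotonicity of $J(X+\sqrt{t}\,Z)$ in $t$. For instance, if $X \sim \mathcal{N}(0,\sigma^{2})$ then $X+\sqrt{t}\,Z \sim \mathcal{N}(0,\sigma^{2}+t)$, so $J(X+\sqrt{t}\,Z) = 1/(\sigma^{2}+t)$ is completely monotone, and $e^{2h(X+\sqrt{t}\,Z)} = 2\pi e\,(\sigma^{2}+t)$ is linear, hence concave, in $t$.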
Published in: IEEE Transactions on Information Theory ( Volume: 61, Issue: 11, November 2015)
Page(s): 5892 - 5905
Date of Publication: 22 September 2015

