Abstract:
An important notion of common information between two random variables is due to Wyner. In this paper, we derive a lower bound on a relaxed variant of Wyner's common information for continuous random variables. The new bound reduces to the known lower bound on Wyner's common information due to Liu (2018). We also show that the new lower bound is tight for a special case of the so-called “Gaussian channels”, namely, when the random variables can be written as the sum of a single underlying random variable and independent Gaussian noise terms. This work is motivated by recent variations of Wyner's common information and by applications to network data compression problems such as the Gray-Wyner network.
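For context, Wyner's common information and the relaxed variant alluded to in the abstract are commonly written as follows; this is a standard textbook formulation, not a quotation from the paper itself:

```latex
% Wyner's common information: the minimal rate of an auxiliary
% variable W that renders X and Y conditionally independent,
% i.e., X - W - Y forms a Markov chain.
C(X;Y) \;=\; \inf_{W \,:\, X - W - Y} I(X,Y;W)

% Relaxed variant (as studied, e.g., in the literature on relaxed
% Wyner's common information): exact conditional independence is
% replaced by a slack gamma >= 0 on the conditional mutual information.
C_{\gamma}(X;Y) \;=\; \inf_{W \,:\, I(X;Y \mid W) \le \gamma} I(X,Y;W)
```

Setting $\gamma = 0$ recovers the original definition, which is consistent with the abstract's statement that the new bound reduces to the earlier lower bound on Wyner's common information.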
Date of Conference: 12-20 July 2021
Date Added to IEEE Xplore: 01 September 2021