Abstract:
We propose a sequence of universal denoisers motivated by the goal of extending the notion of twice-universality from universal data compression theory to the sliding window denoising setting. Given a sequence length n and a denoiser, the kth-order regret of the latter is the maximum excess expected denoising loss relative to sliding window denoisers with window length 2k+1, where, for a given clean sequence, the expectation is over all channel realizations and the maximum is over all clean sequences of length n. We define the twice-universality penalty of a denoiser as its excess kth-order regret when compared to a bound on the kth-order regret of the denoising algorithm DUDE with parameter k, and we are interested in denoisers with a negligible penalty for all k simultaneously. We consider a class of denoisers that apply one of a number of constituent denoisers based on minimizing an estimated denoising loss and establish a formal relationship between the error in the estimated denoising loss and the twice-universality penalty of the resulting denoiser. Given a sequence of window parameters k_n, increasing in n sufficiently fast, we use this approach to construct and analyze a specific sequence of denoisers that achieves a much smaller twice-universality penalty for k < k_n than the sequence of DUDE denoisers with parameter k_n.
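A minimal sketch of these definitions, in notation assumed here rather than drawn from the paper: let \hat{X}^n denote a denoiser for sequences of length n, \Lambda a single-letter loss, and \mathcal{S}_k the class of sliding window denoisers with window length 2k+1. The kth-order regret and the twice-universality penalty could then be written as

\[
  D_k(\hat{X}^n) \;=\; \max_{x^n}\Big[\, \mathbb{E}\, L_{\hat{X}^n}(x^n, Z^n) \;-\; \min_{\hat{x}^n \in \mathcal{S}_k} \mathbb{E}\, L_{\hat{x}^n}(x^n, Z^n) \Big],
  \qquad
  P_k(\hat{X}^n) \;=\; D_k(\hat{X}^n) \;-\; \bar{D}_k^{\mathrm{DUDE}}(n),
\]

where L_{\hat{X}^n}(x^n, Z^n) = \frac{1}{n}\sum_{i=1}^{n} \Lambda\big(x_i, \hat{X}^n(Z^n)[i]\big) is the normalized cumulative loss, the expectation is over channel realizations, and \bar{D}_k^{\mathrm{DUDE}}(n) stands for the bound on the kth-order regret of DUDE with parameter k. The paper's exact formulation may differ in normalization and notation.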
Published in: IEEE Transactions on Information Theory (Volume: 59, Issue: 1, January 2013)