Abstract:
The decoding of efficiently encoded messages, from probabilistic, nonprobabilistic, or unknown message sources, is shown to be often practically impossible. If $\tau(S)$ is a running-time bound on the computational effort of a decoder $\Psi$ accepting a codeword $P$ for message $S$, and $\gamma[K_{\Psi}(S)]$ is an upper bound on the acceptable codeword length $|P|$ when the shortest codeword for $S$ has length $K_{\Psi}(S)$, then for many message sources $\mathcal{M}$ there exist messages $S \in \mathcal{M}$ such that: 1) if the encoder satisfies $\gamma$, then the decoder violates $\tau$; 2) if the decoder satisfies $\tau$, then the encoder violates $\gamma$. These conclusions remain valid even when we allow the decoder to reconstruct only an approximation $S'$ in a neighborhood $\delta(S)$ of $S$. The compatibility of these results with those of information theory rests upon the fact that we are inquiring into the detailed properties of coding systems for individual messages and not into the ensemble average properties.
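The dichotomy stated in the abstract can be summarized more explicitly as follows; this is a hedged restatement, and the running-time notation $T_{\Psi}(P)$ is introduced here for illustration only (the abstract names only the bound $\tau(S)$ and the length bound $\gamma[K_{\Psi}(S)]$):

% Hedged restatement of the abstract's claim (not verbatim from the paper).
% T_\Psi(P) is assumed notation for the running time of decoder \Psi on codeword P.
For many message sources $\mathcal{M}$ there exist messages $S \in \mathcal{M}$ such that, for any encoder--decoder pair producing a codeword $P$ for $S$,
\[
  |P| \le \gamma[K_{\Psi}(S)] \;\Longrightarrow\; T_{\Psi}(P) > \tau(S),
  \qquad\text{and}\qquad
  T_{\Psi}(P) \le \tau(S) \;\Longrightarrow\; |P| > \gamma[K_{\Psi}(S)].
\]
That is, short (near-optimal) codewords force the decoding time past the bound $\tau(S)$, and decoders that respect $\tau(S)$ force codewords longer than $\gamma[K_{\Psi}(S)]$.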
Published in: IEEE Transactions on Information Theory (Volume 21, Issue 4, July 1975)