Abstract:
The joint source and channel coding theorem states that for any stationary ergodic process X_1, X_2, ... whose entropy rate H_∞(X) lies below the capacity C of the channel, it is possible to find a joint source and channel encoder of dimension n such that the probability of a decoding error is smaller than any ε > 0. A joint source and channel encoder in this context is a mapping from source sequences of length n to code sequences of length n, i.e., a "rate 1" encoder. The information rate over the channel is then the entropy rate of the source. The question we try to resolve is whether there are Markov sources for which a joint source and channel encoder is not necessary. What reliability can be achieved by a decoder that uses the "natural" redundancy of the source to reconstruct its output when the source output is transmitted uncoded over the channel? Human decoders are able to reconstruct English text when up to half of the letters in the text are missing. Are French or German preferable to English in this respect? What properties of a Markov source make it suitable for uncoded transmission? At equal entropy rates, can one Markov source be better suited than another for uncoded transmission? Is there a "good" and a "bad" redundancy?
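The scenario the abstract poses can be sketched numerically. The following is a minimal illustration, not the paper's method: it assumes a binary symmetric Markov source (flip probability p, so small p means high redundancy) sent uncoded over a binary symmetric channel (crossover ε), and compares a receiver that takes the channel output as-is against a MAP sequence (Viterbi) decoder that exploits the source's Markov redundancy. All parameter names and values here are illustrative choices, not taken from the paper.

```python
import math
import random

def simulate(n=2000, p=0.05, eps=0.1, seed=0):
    """Uncoded transmission of a binary symmetric Markov source over a BSC.

    p:   probability the source flips its previous bit (small p = high redundancy)
    eps: BSC crossover probability
    Returns (raw_ber, map_ber): the bit error rate when the receiver takes the
    channel output as-is, vs. when it runs Viterbi (MAP sequence) decoding
    using the source's Markov statistics.
    """
    rng = random.Random(seed)
    # Binary symmetric Markov source: each bit repeats the previous one w.p. 1 - p.
    x = [rng.randrange(2)]
    for _ in range(n - 1):
        x.append(x[-1] ^ (rng.random() < p))
    # Binary symmetric channel: each bit flips independently w.p. eps.
    y = [b ^ (rng.random() < eps) for b in x]

    # Viterbi decoding over the 2-state source chain, in the log domain.
    log_trans = [[math.log(1 - p), math.log(p)],
                 [math.log(p), math.log(1 - p)]]
    log_emit = lambda s, o: math.log(1 - eps) if s == o else math.log(eps)
    score = [math.log(0.5) + log_emit(s, y[0]) for s in (0, 1)]
    back = []
    for t in range(1, n):
        new, bp = [], []
        for s in (0, 1):
            cand = [score[sp] + log_trans[sp][s] for sp in (0, 1)]
            best = 0 if cand[0] >= cand[1] else 1
            bp.append(best)
            new.append(cand[best] + log_emit(s, y[t]))
        back.append(bp)
        score = new
    # Backtrack the most likely source sequence.
    s = 0 if score[0] >= score[1] else 1
    xhat = [s]
    for bp in reversed(back):
        s = bp[s]
        xhat.append(s)
    xhat.reverse()

    raw_ber = sum(a != b for a, b in zip(x, y)) / n
    map_ber = sum(a != b for a, b in zip(x, xhat)) / n
    return raw_ber, map_ber
```

With p = 0.05 and ε = 0.1, isolated channel flips are cheaper for the decoder to attribute to noise than to two genuine source transitions, so most of them are corrected and the MAP error rate falls well below the raw channel error rate ≈ ε. This is the sense in which the source's "natural" redundancy substitutes for a channel code.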
Published in: Proceedings of the IEEE Information Theory Workshop
Date of Conference: 25 October 2002
Date Added to IEEE Xplore: 06 January 2003
Print ISBN: 0-7803-7629-3