Abstract
In Shannon’s classical model of transmitting a message over a noisy channel we have the following situation:
There are two parties, a sender and a receiver, who can communicate via a channel. In the simplest case the sender puts input letters into the channel and the receiver obtains output letters. Usually the channel is noisy, i.e., the channel output is a random variable whose distribution is governed by the input letters. This model can be extended in several ways: channels with passive feedback, for example, return the output letters to the sender; multiuser channels such as multiple-access channels or broadcast channels (which will not be considered in this paper) have several senders or receivers that want to communicate simultaneously. Common to all these models of transmission is the task that sender and receiver have to perform: both share a common message set M, and the sender is given a message i ∈ M. He has to encode the message (i.e., transform it into a sequence of input letters for the channel) in such a way that the receiver can decode the sequence of output letters and decide, with a small probability of error, what the message i was. The procedures for encoding and decoding are called a code for the channel, and the number of times the channel is used to transmit one message is called the blocklength of the code.
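As an illustration of this setup (not part of the chapter itself), the following minimal Python sketch instantiates the model for the simplest case: a message set M = {0, 1}, a binary symmetric channel with crossover probability p, a repetition code of blocklength n as the encoder, and majority voting as the decoder. All names (`bsc`, `encode`, `decode`) and parameter values are illustrative assumptions, not taken from the text.

```python
import random

def bsc(letters, p, rng):
    """Binary symmetric channel: each input letter is flipped with probability p,
    so the output is a random variable governed by the input letters."""
    return [b ^ (rng.random() < p) for b in letters]

def encode(message_bit, n):
    """Repetition code of blocklength n: the channel is used n times per message."""
    return [message_bit] * n

def decode(received):
    """Majority vote over the n output letters."""
    return int(sum(received) > len(received) / 2)

# Estimate the probability of error empirically.
rng = random.Random(0)
p, n, trials = 0.1, 11, 10_000
errors = 0
for _ in range(trials):
    i = rng.randint(0, 1)                       # sender is given a message i in M
    if decode(bsc(encode(i, n), p, rng)) != i:  # receiver's decision was wrong
        errors += 1

empirical_error = errors / trials
print(f"empirical error probability: {empirical_error:.4f}")
```

For p = 0.1 and n = 11, a decoding error requires at least 6 of the 11 letters to be flipped, so the error probability is on the order of 10⁻⁴; increasing the blocklength drives it toward zero, at the cost of using the channel more often per message.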
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this chapter
Kleinewächter, C. (2006). On Identification. In: Ahlswede, R., et al. General Theory of Information Transfer and Combinatorics. Lecture Notes in Computer Science, vol 4123. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11889342_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46244-6
Online ISBN: 978-3-540-46245-3