The entropy function H(X) is a measure of the uncertainty of X, in formula
\(H(X) =_{\rm def} -\sum_{a} p_{X}(a)\,{\rm log}_2\, p_{X}(a),\)
where \(p_{X}(a) =_{\rm def} {\rm Pr}[X=a]\) denotes the probability that the random variable X takes on the value a. The interpretation is that an outcome of probability \(p_{X}(a)\) can be described by \(-{\rm log}_2\, p_{X}(a)\) bits of information, so H(X) is the expected number of bits needed to describe X.
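As an illustrative sketch (not part of the original entry), this definition can be evaluated directly for a finite distribution given as a dict of probabilities; the helper name `entropy` is ours:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution {value: probability}."""
    # Terms with zero probability contribute nothing (lim q->0 of q*log q is 0).
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(entropy({"heads": 0.5, "tails": 0.5}))  # 1.0
# A biased coin is less uncertain:
print(entropy({"heads": 0.9, "tails": 0.1}))  # ≈ 0.469
```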
The conditional entropy or equivocation (Shannon 1949) \(H(X \mid Y)\) denotes the uncertainty of X provided Y is known:
\(H(X \mid Y) =_{\rm def} -\sum_{a,b} p_{X,Y}(a,b)\,{\rm log}_2\, p_{X \mid Y}(a,b),\)
where \(p_{X,Y}(a,b) =_{\rm def} {\rm Pr}[(X=a)\wedge(Y=b)]\) and \(p_{X \mid Y}(a,b)\) obeys Bayes' rule for conditional probabilities:
\(p_{X,Y}(a,b) = p_{X \mid Y}(a,b)\cdot p_{Y}(b),\) where \(p_{Y}(b) = \sum_{a} p_{X,Y}(a,b)\) is the marginal distribution of Y.
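A minimal sketch of this definition, assuming the joint distribution is given as a dict keyed by pairs (the function name `conditional_entropy` is ours): the marginal \(p_{Y}(b)\) is obtained by summing the joint over a, and Bayes' rule supplies \(p_{X \mid Y}\).

```python
import math
from collections import defaultdict

def conditional_entropy(p_xy):
    """H(X|Y) in bits, from a joint distribution {(a, b): Pr[X=a and Y=b]}."""
    # Marginal p_Y(b) = sum over a of p_{X,Y}(a, b)
    p_y = defaultdict(float)
    for (_, b), p in p_xy.items():
        p_y[b] += p
    # Bayes' rule: p_{X|Y}(a, b) = p_{X,Y}(a, b) / p_Y(b)
    return -sum(p * math.log2(p / p_y[b]) for (_, b), p in p_xy.items() if p > 0)

# Two independent fair bits: knowing Y leaves X fully uncertain, H(X|Y) = 1 bit
print(conditional_entropy({(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}))  # 1.0
# Y = X: knowing Y removes all uncertainty about X
assert conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}) == 0.0
```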
The basic relation on conditional entropy follows from this:
\(H(X,Y) = H(Y) + H(X \mid Y) = H(X) + H(Y \mid X),\)
where \(H(X,Y)\) is the entropy of the joint distribution \(p_{X,Y}\).
In particular, we note that the entropy is additive if and only if X and Y are independent:
\(H(X,Y) = H(X) + H(Y)\) if and only if \(p_{X,Y}(a,b) = p_{X}(a)\cdot p_{Y}(b)\) for all a, b,
in analogy to the additive entropy of thermodynamical systems.
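The additivity criterion can be checked numerically; the following sketch uses the same dict conventions as above (all names are ours):

```python
import math

def H(p):
    """Entropy in bits of a distribution {value: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

p_x = {"a": 0.25, "b": 0.75}
p_y = {0: 0.5, 1: 0.5}

# Independent joint distribution: p_{X,Y}(a, b) = p_X(a) * p_Y(b)
indep = {(a, b): p_x[a] * p_y[b] for a in p_x for b in p_y}
assert abs(H(indep) - (H(p_x) + H(p_y))) < 1e-12  # entropy is additive

# Fully dependent joint (Y is determined by X): additivity fails,
# and H(X,Y) falls strictly below H(X) + H(Y).
dep = {("a", 0): 0.25, ("b", 1): 0.75}
assert H(dep) < H(p_x) + H(p_y)
```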
The redundancy of a text is that part (expressed in bits) that does not carry information. In common English, the redundancy is roughly 3.5 [bit/char], ...
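The 3.5 bit/char figure can be reproduced as a back-of-the-envelope calculation: redundancy is the absolute rate of the alphabet minus the true information rate of the language. The value 1.2 bit/char below is an assumed information rate for English (published estimates vary roughly between 1 and 1.5):

```python
import math

R = math.log2(26)  # absolute rate of a 26-letter alphabet, ≈ 4.70 bit/char
r = 1.2            # assumed information rate of English text, bit/char
print(round(R - r, 1))  # redundancy ≈ 3.5 bit/char
```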
© 2005 International Federation for Information Processing
Bauer, F.L. (2005). Information Theory. In: van Tilborg, H.C.A. (ed.) Encyclopedia of Cryptography and Security. Springer, Boston, MA. https://doi.org/10.1007/0-387-23483-7_199