Related Concepts
Cryptology (Classical); Shannon’s Model
The entropy function H(X) is a measure of the uncertainty of X, given by the formula
\(H(X) = -\sum_{a}{p}_{X}(a)\,{\log}_{2}\,{p}_{X}(a),\)
where \({p}_{X}(a) = \textrm{Pr}[X = a]\) denotes the probability that the random variable X takes on the value a. The interpretation is that an outcome of probability \({p}_{X}(a)\) can be described by \(-{\log}_{2}\,{p}_{X}(a)\) bits of information.
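The definition can be made concrete with a short sketch (the example distributions are illustrative, not from the entry):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_a p_X(a) * log2 p_X(a), in bits.
    Zero-probability outcomes contribute 0 by the convention 0*log2(0) = 0."""
    return -sum(pa * math.log2(pa) for pa in p if pa > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less: its outcome is more predictable.
print(entropy([0.9, 0.1]))
# A certain event carries none.
print(entropy([1.0]))        # 0.0 (or -0.0)
```

Note that entropy is maximal for the uniform distribution: n equally likely outcomes give exactly \({\log}_{2}\,n\) bits.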
The conditional entropy or equivocation (Shannon, 1949) H(X | Y) denotes the uncertainty of X given that Y is known:
\(H(X\mid Y ) = -\sum_{a,b}{p}_{X,Y }(a,b)\,{\log}_{2}\,{p}_{X\mid Y }(a\mid b),\)
where \({p}_{X,Y }(a,b) {= }_{\textrm{def}}\textrm{Pr}[(X = a) \wedge (Y = b)]\) and \({p}_{X\mid Y }(a\mid b)\) obeys Bayes' rule for...
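A minimal sketch of the equivocation computed from a joint distribution, with the conditional probability obtained via Bayes' rule as \({p}_{X\mid Y}(a\mid b) = {p}_{X,Y}(a,b)/{p}_{Y}(b)\) (the dictionary representation of the joint distribution is an assumption for illustration):

```python
import math

def conditional_entropy(joint):
    """H(X|Y) = -sum_{a,b} p_{X,Y}(a,b) * log2 p_{X|Y}(a|b), in bits.
    `joint` maps pairs (a, b) to p_{X,Y}(a, b)."""
    # Marginal distribution p_Y(b) = sum_a p_{X,Y}(a, b).
    p_y = {}
    for (a, b), p in joint.items():
        p_y[b] = p_y.get(b, 0.0) + p
    h = 0.0
    for (a, b), p in joint.items():
        if p > 0:
            # Bayes' rule: p_{X|Y}(a|b) = p_{X,Y}(a,b) / p_Y(b).
            h -= p * math.log2(p / p_y[b])
    return h

# Independent uniform bits: knowing Y removes no uncertainty, so H(X|Y) = H(X) = 1.
joint = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
print(conditional_entropy(joint))   # 1.0
```

At the other extreme, if Y determines X (e.g., the joint distribution {(0, 0): 0.5, (1, 1): 0.5}), the equivocation is 0 bits.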
Recommended Reading
Bauer FL (1997) Decrypted secrets: methods and maxims of cryptology. Springer, Berlin
McEliece RJ (1977) The theory of information and coding. Encyclopedia of mathematics and its applications, vol 3. Addison-Wesley, Reading
© 2011 Springer Science+Business Media, LLC
Bauer, F.L. (2011). Information Theory. In: van Tilborg, H.C.A., Jajodia, S. (eds) Encyclopedia of Cryptography and Security. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-5906-5_169
Print ISBN: 978-1-4419-5905-8
Online ISBN: 978-1-4419-5906-5