
Properties of two Shannon’s ciphers


Abstract

In 1949 Shannon published the famous paper “Communication theory of secrecy systems”, where he briefly described two ciphers but did not investigate their properties. In this note we carry out an information-theoretic analysis of these ciphers. In particular, we propose estimates of the cipher equivocation and of the probability of correct deciphering without the key.


References

  1. Calmon F.P., Médard M., Varia M., Duffy K.R., Christiansen M.M., Zeger L.M.: Hiding symbols and functions: new metrics and constructions for information-theoretic security. arXiv:1503.08515 (2015).

  2. Cover T.M., Thomas J.A.: Elements of Information Theory. Wiley-Interscience, New York (2006).


  3. Diffie W., Hellman M.E.: Privacy and authentication: an introduction to cryptography. Proc. IEEE 67(3), 397–427 (1979).


  4. Hellman M.E.: An extension of the Shannon theory approach to cryptography. IEEE Trans. Inf. Theory 23(3), 289–294 (1977).


  5. Lu S.-C.: The existence of good cryptosystems for key rates greater than the message redundancy. IEEE Trans. Inf. Theory 25(4), 475–477 (1979).


  6. Ryabko B.: The Vernam cipher is robust to small deviations from randomness. Probl. Inf. Transm. 51(1), 82–86 (2015).


  7. Shannon C.E.: Communication theory of secrecy systems. Bell Syst. Tech. J. 28(4), 656–715 (1949).


  8. Shannon C.E.: Prediction and entropy of printed English. Bell Syst. Tech. J. 30(1), 50–64 (1951).


  9. Takahira R., Tanaka-Ishii K., Dębowski Ł.: Entropy rate estimates for natural language: a new extrapolation of compressed large-scale corpora. Entropy 18(10), 364 (2016).



Acknowledgements

This research was supported by Russian Foundation for Basic Research (Grant No. 15-29-07932).

Author information


Corresponding author

Correspondence to Boris Ryabko.

Additional information

Communicated by C. Mitchell.

This paper was presented in part at the XV International Symposium “Problems of Redundancy in Information and Control Systems”, September 26–29, 2016, St. Petersburg, Russia.

Appendix

Proof of Lemma

The following chain of equalities and inequalities is valid:

$$\begin{aligned}&h_t(X^1) + h_t(X^2) + \cdots + h_t(X^s) = h_t(X^1, X^2, \ldots , X^s)\\&\quad = h_t(X^1, X^2, \ldots , X^s, Z) = h_t(Z) + h_t(X^1, X^2, \ldots , X^s/ Z)\\&\quad = h_t(Z) + h_t(X^1/ Z) + h_t( X^2/X^1, Z) + h_t( X^3/X^1, X^2, Z)\\&\qquad + \cdots + h_t(X^s/X^1, X^2, \ldots , X^{s-1}, Z) \\&\quad = h_t(Z) + h_t(X^1/ Z) + h_t( X^2/X^1, Z) + h_t( X^3/X^1, X^2, Z)\\&\qquad + \cdots + h_t(X^{s-1}/X^1, X^2, \ldots , X^{s-2}, Z) \\&\quad \le h_t(Z) + h_t(X^1/ Z) + h_t( X^2/Z) + h_t( X^3/Z) + \cdots + h_t(X^{s-1}/ Z) . \end{aligned}$$

The proof is based on well-known properties of the Shannon entropy, which can be found, for example, in [2]. More precisely, the first equality follows from the independence of \(X^1, X^2, \ldots , X^s\), whereas the second one is valid because Z is a function of \(X^1, X^2, \ldots , X^s\), see (1). The third and fourth equalities are the chain rule for the entropy. Having taken into account that \(X^s\) is determined once \(X^1, X^2, \ldots , X^{s-1}\) and Z are known, we obtain the last equality. The final inequality holds because conditioning does not increase the entropy [2]. Thus,

$$\begin{aligned}&h_t(X^1) + h_t(X^2) + \cdots + h_t(X^s) \nonumber \\&\quad \le h_t(Z) + h_t(X^1/ Z) + h_t( X^2/Z) + h_t( X^3/Z) + \cdots + h_t(X^{s-1}/ Z) . \end{aligned}$$
(10)

Taking into account that Z is a process over the alphabet \(A = \{0, \ldots , n-1 \} \) and hence

$$\begin{aligned} h_t(Z) \le \log _2 n \, , \end{aligned}$$

we obtain (3) from (10). In order to prove (4), we note that, analogously to (10), one can obtain the following:

$$\begin{aligned}&h_t(X^1) + h_t(X^2) + \cdots + h_t(X^s) \\&\quad \le h_t(Z) + \sum _{i=1}^{j-1} h_t(X^i/ Z) + \sum _{i=j+1}^{s} h_t(X^i/ Z) \end{aligned}$$

for any \( 1 \le j \le s \). From this inequality we obtain (4). \(\square \)
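
To make the counting above concrete, the following short sketch numerically checks inequality (10) for a toy one-letter instance. It assumes, purely for illustration, that the combiner behind (1) is the coordinate-wise sum modulo n, so that Z is a function of \(X^1, \ldots , X^s\) and any \(s-1\) of the plaintexts together with Z determine the remaining one; the alphabet size, the number of streams and the marginal distributions are arbitrary choices, not values taken from the paper.

    # A minimal numeric check of inequality (10), assuming (for illustration only)
    # that Z is the coordinate-wise sum of the plaintext letters modulo n.
    import itertools
    import math

    n, s = 3, 3                                  # alphabet size, number of plaintext streams
    P = [[0.5, 0.3, 0.2],                        # arbitrary marginals of the independent X^1, X^2, X^3
         [0.6, 0.2, 0.2],
         [0.4, 0.4, 0.2]]

    def entropy(dist):
        """Shannon entropy (bits) of a probability vector."""
        return -sum(p * math.log2(p) for p in dist if p > 0)

    # joint distribution of (X^1, ..., X^s, Z) with Z = (X^1 + ... + X^s) mod n
    joint = {}
    for xs in itertools.product(range(n), repeat=s):
        joint[xs + (sum(xs) % n,)] = math.prod(P[i][xs[i]] for i in range(s))

    def H_of(indices):
        """Entropy of the marginal of `joint` on the given coordinates."""
        m = {}
        for key, p in joint.items():
            k = tuple(key[i] for i in indices)
            m[k] = m.get(k, 0.0) + p
        return entropy(list(m.values()))

    H_Z = H_of([s])
    lhs = sum(entropy(P[i]) for i in range(s))                    # H(X^1) + ... + H(X^s)
    rhs = H_Z + sum(H_of([i, s]) - H_Z for i in range(s - 1))     # right-hand side of (10)
    print(lhs, "<=", rhs, "<=", math.log2(n) + (rhs - H_Z))       # and the weaker bound with log2(n)

The last printed value corresponds to the step from (10) to (3), in which \(h_t(Z)\) is replaced by its upper bound \(\log _2 n\).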

Proof of Theorem

For the first cipher, \(s > 2\) and all \(X^i\), \(i = 1, \ldots , s\), have the same probability distribution. Having taken into account that \(h_t(X^i) = h_t(X^1)\) for \(i = 1, \ldots , s\), from (4) and (6) we obtain (7). For the second cipher, \(s = 2\), and (7) follows from (3).

In order to prove (ii), denote

$$\begin{aligned} H \left( X^1_j|Z_1 \ldots Z_t \right) = -\sum _{X^1_j \in A} P\left\{ X^1_j |Z_1 \ldots Z_t \right\} \log P \left\{ X^1_j |Z_1 \ldots Z_t \right\} . \end{aligned}$$

Let us consider any method G of deciphering \(Z_1 \ldots Z_t\) without the key, such that

$$\begin{aligned} \hat{X}^1_1 \hat{X}^1_2 \ldots \hat{X}^1_t = G(Z_1 \ldots Z_t) \end{aligned}$$

and define

$$\begin{aligned} p^*_j = P\left\{ \hat{X}^1_j = X_j^1\right\} , \, p^* = t^{-1} \sum _{j=1}^t p^*_j \, . \end{aligned}$$

From the Fano inequality (see [2, 5]) we obtain

$$\begin{aligned} p^*_j \log (n-1) + \hat{h}\left( p^*_j \right) \ge H\left( X^1_j|Z_1 \ldots Z_t \right) \, , \end{aligned}$$

where \( \hat{h}(p^*_j)\) is the binary entropy:

$$\begin{aligned} \hat{h}\left( p^*_j \right) = - \left( p^*_j \log p^*_j + \left( 1- p^*_j\right) \log \left( 1- p^*_j \right) \right) \, . \end{aligned}$$

From the last inequality we obtain

$$\begin{aligned} t^{-1} \sum _{j=1}^t \left( p^*_j \log (n-1) + \hat{h}\left( p^*_j \right) \right) \ge t^{-1} \sum _{j=1}^t H\left( X^1_j|Z_1 \ldots Z_t \right) \, . \end{aligned}$$

Having taken into account the concavity of the entropy \(\hat{h}\), from this inequality and the definition \( p^* = t^{-1} \sum _{j=1}^t p^*_j \, \) we obtain

$$\begin{aligned} p^* \log (n-1) + \hat{h}(p^*) \ge t^{-1} \sum _{j=1}^t H\left( X^1_j|Z_1 \ldots Z_t \right) \, . \end{aligned}$$

From this and the well-known inequality for the entropy, \(H(u, v) \le H(u) + H(v) \), we obtain

$$\begin{aligned} p^* \log (n-1) + \hat{h}(p^*) \ge t^{-1} H\left( X^1_1\ldots X^1_t|Z_1 \ldots Z_t\right) \, . \end{aligned}$$

Taking into account definition (2) and statement (i), we obtain (ii).
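
As a sanity check of the Fano-inequality step, the sketch below evaluates both sides of Fano's inequality in its standard form from [2, 5], \(H(X|Z) \le \hat{h}(P_e) + P_e \log _2 (n-1)\), where \(P_e\) is the error probability of a guess of X from Z. The joint distribution of a plaintext letter and a ciphertext letter is an arbitrary toy example, not one taken from the paper, and the guesser used is the maximum a posteriori rule.

    # Numeric check of Fano's inequality in the standard form of [2]:
    #   H(X | Z) <= h(P_e) + P_e * log2(n - 1),  P_e = 1 - p*,
    # where p* is the probability that the guesser recovers X from Z.
    import math

    def h2(p):
        """Binary entropy (bits)."""
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    n = 3
    # P[x][z]: an arbitrary joint distribution of a plaintext letter X and a ciphertext letter Z
    P = [[0.10, 0.05, 0.05],
         [0.05, 0.20, 0.10],
         [0.05, 0.10, 0.30]]

    P_z = [sum(P[x][z] for x in range(n)) for z in range(n)]
    H_XgZ = -sum(P[x][z] * math.log2(P[x][z] / P_z[z])              # H(X | Z)
                 for x in range(n) for z in range(n) if P[x][z] > 0)
    p_star = sum(max(P[x][z] for x in range(n)) for z in range(n))  # maximum a posteriori guessing probability
    P_e = 1 - p_star
    print(H_XgZ, "<=", h2(P_e) + P_e * math.log2(n - 1))

Since Fano's inequality holds for an arbitrary guesser, the same check could be repeated for any other deciphering rule G.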

In order to prove the third statement we will use the well-known Shannon–McMillan–Breiman theorem, see [2]. For conditional entropies it can be stated as follows:

\(\forall \varepsilon > 0, \forall \delta > 0\), for almost all \(Z_1,Z_2,\dots \) there exists \(n'\) such that if \(n > n'\) then

$$\begin{aligned} P\left\{ \left| - \frac{1}{n} \log P(X^1_1 \ldots X^1_n|Z_1 \ldots Z_n) - h(X|Z) \right| < \varepsilon \right\} \ge 1-\delta , \end{aligned}$$
(11)

where \((X^1,Z)\) is a stationary ergodic process.

According to the Shannon–McMillan–Breiman theorem, for any \(\varepsilon > 0\), \(\delta > 0\) and almost all \(Z_1,Z_2,\dots \) there exists \(n'\) such that for \(t > n'\)

$$\begin{aligned}&P\left\{ \left| - \frac{1}{t} \log P(X^1_1 X^1_2 \ldots X^1_t|Z_1 Z_2 \ldots Z_t ) - h(X|Z) \right| < \varepsilon /2 \right\} \nonumber \\&\quad \ge 1-\delta . \end{aligned}$$
(12)

Let us define

$$\begin{aligned} \Psi _Z = \left\{ X^1_1 X^1_2 \ldots X^1_t : \left| - \frac{1}{t} \log P\left( X^1_1 X^1_2 \ldots X^1_t |Z_1 \ldots Z_t \right) - h_t(X^1|Z) \right| < \varepsilon /2 \right\} \, . \end{aligned}$$
(13)

The inequality \( P( \Psi _Z ) > 1 - \delta \) immediately follows from (12). In order to prove (8), note that for any \(X^1 = X^1_1, \dots , X^1_t\) and \(\bar{X}^1 = \bar{X}^1_1, \dots , \bar{X}^1_t\) from \( \Psi _Z\), we obtain from (12) and (13)

$$\begin{aligned}&\frac{1}{t} \left| \log P(X^1|Z) - \log P(\bar{X}^1|Z) \right| \\&\quad \le \left| - \frac{1}{t} \log P(X^1|Z) - h_t(X^1|Z) \right| \\&\qquad + \left| - \frac{1}{t} \log P(\bar{X}^1|Z) - h_t(X^1|Z) \right| < \varepsilon /2 + \varepsilon /2 = \varepsilon \, . \end{aligned}$$

From (13), (7) and the proven inequality \( P( \Psi _Z ) > 1 - \delta \) we obtain the following: \( |\Psi _Z | > ( 1- \delta ) 2^{ t \, (h_t(X^1|Z) - \varepsilon )} \). Taking into account that this is valid for any \(\varepsilon > 0\), \(\delta > 0\) and \(t > n'\), we obtain (9). The theorem is proven. \(\square \)
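
The typical-set counting used in the last step can be illustrated in the simplest possible setting. The sketch below assumes an i.i.d. binary source and no conditioning on the ciphertext, so it only mirrors the structure of the argument rather than the conditional case treated above; it enumerates all sequences of length t, forms the \(\varepsilon \)-typical set \(\Psi \) and checks the lower bound \(( 1- \delta ) 2^{ t \, (H - \varepsilon )}\) on its size.

    # Toy illustration of the AEP / typical-set counting: i.i.d. binary source,
    # no conditioning on Z.  Every eps-typical sequence has probability below
    # 2^{-t (H - eps)}, so the typical set Psi has more than (1 - delta) 2^{t (H - eps)} elements.
    import itertools
    import math

    p, t, eps = 0.3, 20, 0.1                                   # arbitrary toy parameters
    H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))       # per-letter entropy

    psi_size, psi_prob = 0, 0.0
    for x in itertools.product((0, 1), repeat=t):
        k = sum(x)                                             # number of ones
        prob = p ** k * (1 - p) ** (t - k)
        if abs(-math.log2(prob) / t - H) < eps:                # eps-typical sequence?
            psi_size += 1
            psi_prob += prob

    delta = 1 - psi_prob
    print("P(Psi) =", psi_prob)
    print("|Psi| =", psi_size, ">=", (1 - delta) * 2 ** (t * (H - eps)))

As t grows (and with conditioning on \(Z_1 \ldots Z_t\), as in (13)), the probability of the typical set approaches 1, which is what the Shannon–McMillan–Breiman theorem guarantees.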


Cite this article

Ryabko, B. Properties of two Shannon’s ciphers. Des. Codes Cryptogr. 86, 989–995 (2018). https://doi.org/10.1007/s10623-017-0372-2

