One-Way Functions and the Hardness of (Probabilistic) Time-Bounded Kolmogorov Complexity w.r.t. Samplable Distributions

  • Conference paper

Advances in Cryptology – CRYPTO 2023 (CRYPTO 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14082)

Abstract

Consider the recently introduced notion of probabilistic time-bounded Kolmogorov complexity, \(pK^t\) (Goldberg et al., CCC’22), and let \(\textsf{MpK}^t\textsf{P}\) denote the language of pairs \((x, k)\) such that \(pK^t(x) \le k\). We show the equivalence of the following:

  • \(\textsf{MpK}^\textsf{poly}\textsf{P}\) is (mildly) hard-on-average w.r.t. any samplable distribution \(\mathcal {D}\);

  • \(\textsf{MpK}^\textsf{poly}\textsf{P}\) is (mildly) hard-on-average w.r.t. the uniform distribution;

  • existence of one-way functions.

As far as we know, this yields the first natural class of problems where hardness with respect to any samplable distribution is equivalent to hardness with respect to the uniform distribution.

Under standard derandomization assumptions, we can show the same result also w.r.t. the standard notion of time-bounded Kolmogorov complexity, \(K^t\).

Y. Liu—Work done while visiting the Simons Institute during the Meta-complexity program. Supported by a JP Morgan fellowship.

R. Pass—Supported in part by NSF Award CNS 2149305, NSF Award CNS-2128519, NSF Award RI-1703846, AFOSR Award FA9550-18-1-0267, FA9550-23-1-0312, a JP Morgan Faculty Award, the Algorand Centres of Excellence programme managed by Algorand Foundation, and DARPA under Agreement No. HR00110C0086. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Government, DARPA or the Algorand Foundation.

Notes

  1.

    By “mild” average-case hardness, we here mean that no \(\textsf{PPT}\) algorithm is able to solve the problem with probability \(1-\frac{1}{p(n)}\) on inputs of length n, for some polynomial \(p(\cdot )\).

  2.

    Typically, one would actually like to solve a search version of this problem, where one not only finds the time-bounded K-complexity of a string but also the program that “witnesses” this complexity; as we shall see, our results actually handle this search version as well.

  3.

    It would seem that we can also use a weaker derandomization assumption in case we only want to deduce io-OWFs; we defer the details to the full version.

  4.

    Recall that an efficiently computable function f is a weak OWF if there exists some polynomial \(q>0\) such that f cannot be efficiently inverted with probability better than \(1-\frac{1}{q(n)}\) for sufficiently large n.

  5.

    Formally, the program/description \(\varPi \) is an encoding of a pair \((M, w)\) where M is a Turing machine and w is some input, and we evaluate M(w) on the Universal Turing machine U.

  6.

    Or, in case we also want to solve the search problem, we additionally output the \(\ell \)-bit truncation of the program \({\varPi }'\) output by the inverter.

References

  1. Blum, M.: Coin flipping by telephone - a protocol for solving impossible problems. In: COMPCON 1982, Digest of Papers, Twenty-Fourth IEEE Computer Society International Conference, San Francisco, California, USA, 22–25 February 1982, pp. 133–137. IEEE Computer Society (1982)

  2. Blum, M., Micali, S.: How to generate cryptographically strong sequences of pseudo-random bits. SIAM J. Comput. 13(4), 850–864 (1984)

  3. Carter, L., Wegman, M.: Universal classes of hash functions. J. Comput. Syst. Sci. 18(2), 143–154 (1979)

  4. Chaitin, G.J.: On the simplicity and speed of programs for computing infinite sets of natural numbers. J. ACM 16(3), 407–422 (1969)

  5. Diffie, W., Hellman, M.: New directions in cryptography. IEEE Trans. Inf. Theory 22(6), 644–654 (1976)

  6. Feige, U., Shamir, A.: Witness indistinguishable and witness hiding protocols. In: STOC 1990, pp. 416–426 (1990). http://doi.acm.org/10.1145/100216.100272

  7. Goldberg, H., Kabanets, V., Lu, Z., Oliveira, I.C.: Probabilistic Kolmogorov complexity with applications to average-case complexity. In: 37th Computational Complexity Conference (CCC 2022). Schloss Dagstuhl-Leibniz-Zentrum für Informatik (2022)

  8. Goldreich, O., Goldwasser, S., Micali, S.: On the cryptographic applications of random functions. In: CRYPTO, pp. 276–288 (1984)

  9. Goldwasser, S., Micali, S.: Probabilistic encryption. J. Comput. Syst. Sci. 28(2), 270–299 (1984)

  10. Gurevich, Y.: The challenger-solver game: variations on the theme of P=NP. In: Logic in Computer Science Column, The Bulletin of EATCS (1989)

  11. Hartmanis, J.: Generalized Kolmogorov complexity and the structure of feasible computations. In: 24th Annual Symposium on Foundations of Computer Science (SFCS 1983), pp. 439–445 (1983). https://doi.org/10.1109/SFCS.1983.21

  12. Håstad, J., Impagliazzo, R., Levin, L.A., Luby, M.: A pseudorandom generator from any one-way function. SIAM J. Comput. 28(4), 1364–1396 (1999)

  13. Ilango, R., Ren, H., Santhanam, R.: Robustness of average-case meta-complexity via pseudorandomness. In: Proceedings of the 54th Annual ACM SIGACT Symposium on Theory of Computing, pp. 1575–1583 (2022)

  14. Impagliazzo, R.: A personal view of average-case complexity. In: Structure in Complexity Theory 1995, pp. 134–147 (1995)

  15. Impagliazzo, R., Levin, L.A.: No better ways to generate hard NP instances than picking uniformly at random. In: Proceedings of the 31st Annual Symposium on Foundations of Computer Science (FOCS 1990), pp. 812–821. IEEE (1990)

  16. Impagliazzo, R., Luby, M.: One-way functions are essential for complexity based cryptography (extended abstract). In: 30th Annual Symposium on Foundations of Computer Science, Research Triangle Park, North Carolina, USA, 30 October - 1 November 1989, pp. 230–235 (1989)

  17. Ko, K.: On the notion of infinite pseudorandom sequences. Theor. Comput. Sci. 48(3), 9–33 (1986). https://doi.org/10.1016/0304-3975(86)90081-2

  18. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Int. J. Comput. Math. 2(1–4), 157–168 (1968)

  19. Levin, L.A.: The tale of one-way functions. Prob. Inf. Trans. 39(1), 92–103 (2003). https://doi.org/10.1023/A:1023634616182

  20. Levin, L.A.: Universal search problems (Russian), translated into English by B.A. Trakhtenbrot in [32]. Prob. Inf. Transmission 9(3), 265–266 (1973)

  21. Liu, Y., Pass, R.: On one-way functions and Kolmogorov complexity. In: 61st IEEE Annual Symposium on Foundations of Computer Science, FOCS 2020, Durham, NC, USA, 16–19 November 2020, pp. 1243–1254. IEEE (2020)

  22. Liu, Y., Pass, R.: A note on one-way functions and sparse languages. Cryptology ePrint Archive (2021)

  23. Liu, Y., Pass, R.: On one-way functions from NP-complete problems. In: Proceedings of the 37th Computational Complexity Conference, pp. 1–24 (2022)

  24. Longpré, L., Mocas, S.: Symmetry of information and one-way functions. In: Hsu, W.-L., Lee, R.C.T. (eds.) ISA 1991. LNCS, vol. 557, pp. 308–315. Springer, Heidelberg (1991). https://doi.org/10.1007/3-540-54945-5_75

  25. Lu, Z., Oliveira, I.C., Zimand, M.: Optimal coding theorems in time-bounded Kolmogorov complexity. In: 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Schloss Dagstuhl-Leibniz-Zentrum für Informatik (2022)

  26. Naor, M.: Bit commitment using pseudorandomness. J. Cryptol. 4(2), 151–158 (1991). https://doi.org/10.1007/BF00196774

  27. Rivest, R.L., Shamir, A., Adleman, L.M.: A method for obtaining digital signatures and public-key cryptosystems (reprint). Commun. ACM 26(1), 96–99 (1983). https://doi.org/10.1145/357980.358017

  28. Rompel, J.: One-way functions are necessary and sufficient for secure signatures. In: STOC, pp. 387–394 (1990)

  29. Shaltiel, R., Umans, C.: Simple extractors for all min-entropies and a new pseudorandom generator. J. ACM (JACM) 52(2), 172–216 (2005)

  30. Sipser, M.: A complexity theoretic approach to randomness. In: Proceedings of the 15th Annual ACM Symposium on Theory of Computing, 25–27 April 1983, Boston, Massachusetts, USA, pp. 330–335. ACM (1983)

  31. Solomonoff, R.: A formal theory of inductive inference, part I. Inf. Control 7(1), 1–22 (1964). https://doi.org/10.1016/S0019-9958(64)90223-2

  32. Trakhtenbrot, B.A.: A survey of Russian approaches to Perebor (brute-force searches) algorithms. Annal. History Comput. 6(4), 384–400 (1984)

  33. Yao, A.C.: Theory and applications of trapdoor functions (extended abstract). In: 23rd Annual Symposium on Foundations of Computer Science, Chicago, Illinois, USA, 3–5 November 1982, pp. 80–91 (1982)

  34. Zvonkin, A.K., Levin, L.A.: The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russ. Math. Surv. 25(6), 83–124 (1970). https://doi.org/10.1070/RM1970v025n06ABEH001269

Author information

Correspondence to Yanyi Liu.

A An Alternative Proof of Lemma 46 and Lemma 48

As mentioned in Sect. 4.2, we provide direct proofs of Lemma 46 and Lemma 48. We start by recalling the statement of Lemma 46.

Lemma A1

(Lemma 46, restated). There exists a polynomial \(\gamma \) such that for all polynomial t, any \(t_D\)-time samplable ensemble \(\mathcal {D}\) is polynomially bounded by \(pK^t\) if \(t(n) \ge \gamma (t_D(n))\).

We recall the notion of a universal hash function [3].

Definition A2

Let \(\mathcal{H}_m^\ell \) be a family of functions where \(m<\ell \) and each function \(h \in \mathcal{H}^\ell _m\) maps \(\{0,1\}^\ell \) to \(\{0,1\}^m\). We say that \(\mathcal{H}^\ell _m\) is a universal hash family if (i) the functions \(h_{\sigma } \in \mathcal{H}_m^\ell \) can be described by a string \(\sigma \) of \(\ell ^c\) bits where c is a universal constant that does not depend on \(\ell \); (ii) for all \(x \ne x' \in \{0,1\}^\ell \), and for all \(y,y' \in \{0,1\}^m\)

$$ \Pr [h_\sigma \leftarrow \mathcal {H}^\ell _m:h_\sigma (x)=y \; \text{ and } \; h_\sigma (x')=y'] = 2^{-2m} $$

We will rely on the following properties of universal hash functions.

Proposition A3

Let \(\ell \in \mathbb {N}\), \(S \subseteq \{0,1\}^\ell \) be a set, \(\mathcal {H}^\ell _m\) be a universal hash family such that \(m \le \log |S|\). The following statements hold:

  • With probability at least \(1 - 2^{-\log |S| + m + 3}\) over \(h_\sigma \leftarrow \mathcal {H}^\ell _m\), there exists \(s \in S\) such that \(h_\sigma (s) = 0^m\).

  • With probability at least \(1 - 2^{-\ell + m + 3}\) over \(h_\sigma \leftarrow \mathcal {H}^\ell _m\), \(|h^{-1}_\sigma (0^m)| \le 2 \cdot 2^{\ell - m}\).

For completeness, we provide the proof of Proposition A3 here.

Proof

We first prove the former statement. We consider picking a random hash function \(h_\sigma \leftarrow \mathcal {H}^\ell _m\). For each element \(s \in S\), let \(X_s\) denote the random variable such that \(X_s = 1\) iff \(h_\sigma (s) = 0^m\). Let X denote the random variable \(X = \sum _{s \in S} X_s\). Note that \(\textsf{E}[X] = |S| / 2^{m}\) and the variance of X is

$$\textsf{V}(X) = \textsf{E}[X^2] - \textsf{E}[X]^2 = |S|(\frac{1}{2^m} - \frac{1}{2^{2m}}) \le \textsf{E}[X] $$

since \(\mathcal {H}^\ell _m\) is a universal hash family, so \(X_{s_1}\) and \(X_{s_2}\) are pairwise independent for all distinct \(s_1, s_2 \in S\). Therefore the variance of X is small and we can apply Chebyshev’s inequality to show that

$$\begin{aligned} \Pr [X = 0]&\le \Pr [|X - \textsf{E}[X]| \ge \textsf{E}[X] - 1] \\&\le \Pr [|X - \textsf{E}[X]| \ge (\sqrt{\textsf{V}(X)}/2) \sqrt{\textsf{V}(X)}] \\&\le \frac{1}{\textsf{V}(X) / 4} \le 2^{-\log |S| + m + 3} \end{aligned}$$

So we conclude that with probability at least \(1 - \Pr [X = 0] \ge 1 - 2^{-\log |S| + m + 3}\), there exists \(s \in S\) such that \(h_\sigma (s) = 0^m\).
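
For completeness, the chain of inequalities above can be justified as follows; when \(\textsf{E}[X] < 2\) the claimed bound exceeds 1 and the statement is trivial, so assume \(\textsf{E}[X] \ge 2\). Then

$$\textsf{E}[X] - 1 \ge \frac{\textsf{E}[X]}{2} \ge \frac{\textsf{V}(X)}{2} = \Big (\frac{\sqrt{\textsf{V}(X)}}{2}\Big )\sqrt{\textsf{V}(X)}, \qquad \Pr \Big [|X - \textsf{E}[X]| \ge \Big (\frac{\sqrt{\textsf{V}(X)}}{2}\Big )\sqrt{\textsf{V}(X)}\Big ] \le \frac{4}{\textsf{V}(X)} \le \frac{8}{\textsf{E}[X]} = 2^{-\log |S| + m + 3},$$

where the first chain uses \(\textsf{V}(X) \le \textsf{E}[X]\), the probability bound is Chebyshev’s inequality with \(\sqrt{\textsf{V}(X)}/2\) standard deviations, and the final step uses \(\textsf{V}(X) = |S|(2^{-m} - 2^{-2m}) \ge \textsf{E}[X]/2\) (as \(m \ge 1\)).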

The latter statement follows from essentially the same proof. For each element \(z \in \{0,1\}^\ell \), let \(Y_z\) denote the random variable such that \(Y_z = 1\) iff \(h_\sigma (z) = 0^m\). Let Y denote the random variable \(Y = \sum _{z \in \{0,1\}^\ell } Y_z\). Note that \(\textsf{E}[Y] = 2^\ell / 2^{m}\) and the variance of Y is

$$\textsf{V}(Y) = \textsf{E}[Y^2] - \textsf{E}[Y]^2 = 2^\ell (\frac{1}{2^m} - \frac{1}{2^{2m}}) $$

since \(\mathcal {H}^\ell _m\) is a universal hash family, so \(Y_{z_1}\) and \(Y_{z_2}\) are pairwise independent for all distinct \(z_1, z_2 \in \{0,1\}^\ell \). By Chebyshev’s inequality,

$$\Pr [Y \ge 2 \cdot 2^{\ell - m}] \le \Pr [|Y - \textsf{E}[Y]| \ge (\sqrt{\textsf{V}(Y)}/2) \sqrt{\textsf{V}(Y)}] \le \frac{1}{\textsf{V}(Y) / 4} \le 2^{-\ell + m + 3}$$

So we conclude that with probability at least \(1 - 2^{-\ell + m + 3}\), \(|h^{-1}_\sigma (0^m)| \le 2 \cdot 2^{\ell - m}\).

We next recall the linear universal hash family construction of [3].

Proposition A4

([3]). Let \(\ell , m \in \mathbb {N}\) with \(m < \ell \). For each \(\sigma \in \{0,1\}^{\ell m + m}\), define \(h_\sigma \) to be the function such that for each \(x \in \{0,1\}^{\ell }\), \(h_\sigma (x) = A x + b\), where \(\sigma = (A, b)\), A is an \(m \times \ell \) binary matrix, and b is a binary vector of length m. Let \(\mathcal {H}^\ell _m = \{h_\sigma \,|\, \sigma \in \{0,1\}^{\ell m + m}\}\).

Then, it holds that \(\mathcal {H}^\ell _m\) is a universal hash family.
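
To make the construction concrete, the following minimal Python sketch (ours, not part of the paper; the names sample_sigma and hash_sigma are hypothetical) implements the family \(\mathcal {H}^\ell _m\) of Proposition A4 over GF(2) and runs a Monte Carlo check of the pairwise-uniformity condition of Definition A2.

    import random

    def sample_sigma(ell, m, rng=random):
        # sigma = (A, b): a uniformly random m x ell binary matrix and m-bit vector.
        A = [[rng.randrange(2) for _ in range(ell)] for _ in range(m)]
        b = [rng.randrange(2) for _ in range(m)]
        return A, b

    def hash_sigma(sigma, x):
        # h_sigma(x) = A x + b over GF(2); x is a list of ell bits.
        A, b = sigma
        return tuple((sum(a * xi for a, xi in zip(row, x)) + bi) % 2
                     for row, bi in zip(A, b))

    # Check of Definition A2: for fixed x != x' and any targets (y, y'),
    # Pr over sigma of [h_sigma(x) = y and h_sigma(x') = y'] equals 2^(-2m).
    ell, m, trials = 8, 3, 200_000
    x, xp = [1, 0, 1, 1, 0, 0, 1, 0], [0, 1, 1, 0, 0, 1, 0, 1]
    y, yp = (1, 0, 1), (0, 0, 1)
    hits = 0
    for _ in range(trials):
        sigma = sample_sigma(ell, m)
        if hash_sigma(sigma, x) == y and hash_sigma(sigma, xp) == yp:
            hits += 1
    print(hits / trials, 2 ** (-2 * m))   # empirical frequency vs. exact 1/64

Here \(\sigma \) is described by \(\ell m + m\) bits, matching Proposition A4 (and in particular by at most \(\ell ^c\) bits, as required by Definition A2).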

We are now ready to prove Lemma A1.

Proof

Consider any polynomial t, and any \(t_D\)-time samplable ensemble \(\mathcal {D}\). Let M be the \(\textsf{PPT}\) sampler such that \(M(1^n, r)\) uses \(r \in \{0,1\}^{t_D(n)}\) as random coins and samples \(D_n\) for each \(n \in \mathbb {N}\).

We will show that \(\mathcal {D}\) is polynomially bounded by \(pK_{1 - 2^{-n}}^t\). Consider any string \(x \in \{0,1\}^*\), \(n = |x|\). Let \(\ell = t_D(n)\) be the length of the random tape of M. Let \(p_x = \Pr [r \leftarrow \{0,1\}^{\ell }, x' = M(1^n, r): x' = x]\) denote the probability mass of x in \(D_n\). Our goal is to show that there exists a polynomial \(\delta \) such that \(p_x \le \delta (n) 2^{-pK^t(x)}\) holds for all x.

Let \(S = \{ r \in \{0,1\}^{\ell } : M(1^n, r) = x\}\) be the set of random tapes on which M will output x. (Note that \(|S| = 2^{\ell } p_x\).) Let \(m = \lceil \log |S| \rceil - 5\). Let \(\mathcal {H}^\ell _m\) be the universal hash family defined in Proposition A4.

We refer to a hash function \(h_\sigma \in \mathcal {H}^\ell _m\) as good if (1) \(\exists s \in S\) such that \(h_\sigma (s) = 0^m\) and (2) \(|h_\sigma ^{-1}(0^m)| \le 2\cdot 2^{\ell - m}\). We first claim that with high probability over \(h_\sigma \leftarrow \mathcal {H}^\ell _m\), \(h_\sigma \) is good.

Claim 5

\(h_\sigma \) is good with probability at least 1/2 over \(h_\sigma \leftarrow \mathcal {H}^\ell _m\).

Proof

By Proposition A3 and a union bound, a random \(h_\sigma \) is good with probability at least \(1 - 2^{-\log |S| + m + 3} - 2^{-\ell + m + 3} \ge \frac{1}{2}\).

We next claim that, given a good hash function \(h_\sigma \), there exists a short program of size roughly \(\log 1/p_x\) that produces the string x.

Claim 6

For any good hash function \(h_\sigma \in \mathcal {H}^\ell _m\), there exists a program \(\varPi \) of length at most

$$ O(\log \ell ) + \lceil \log 1/p_x \rceil $$

that, given \(h_\sigma \) as input, outputs the string x within time \(O(\ell ^3)\).

Proof

Since \(h_\sigma \) is good, there exists a string \(s \in S\) such that \(h_\sigma (s) = 0^m\); fix such a string s. Note that if s can be produced using a short program, then x can be generated by running \(M(1^n, s)\), which adds \(|M| = O(1)\) bits to the description and can be done in time \(t_D(n)\).

Finally, we show how to produce s using linear algebra. Recall that the hash function \(h_\sigma \) is defined by \(h_\sigma (x) = Ax + b\), where \(\sigma = (A, b)\), A is a binary matrix and b is a binary vector. Using Gaussian elimination, we can find a vector \(v \in \{0,1\}^\ell \) such that \(Av + b = 0^m\), together with a basis \((b_1, \ldots , b_d)\) for the kernel of A. Note that each \(y \in h_\sigma ^{-1}(0^m)\) can be represented by a d-bit coordinate vector (under the basis \((b_1, \ldots , b_d)\) and with respect to the offset vector v). Since \(h_\sigma \) is good, \(|h_\sigma ^{-1}(0^m)| = 2^d \le 2 \cdot 2^{\ell - m}\), so \(d \le \ell - m + 1\) and s can be represented by a coordinate vector of at most \(\ell - m + 1\) bits; let e denote this vector. We use this fact to construct a program \(\varPi \) of length at most \(4\log \ell + \ell - m + O(1)\) bits that produces the string x. \(\varPi \) has the integers \(n, \ell \), the coordinate vector e, and the code of M hardcoded (\(\le 4\log \ell + \ell - m + 1 + O(1)\) bits). On input a hash function description \(\sigma \), it computes v and \((b_1, \ldots , b_d)\) using Gaussian elimination, and computes \(s = \sum _{i \in [d]} b_i \cdot e_i + v\). Finally, \(\varPi \) outputs \(M(1^n, s)\). Notice that \(\varPi \) runs in time \(O(\ell ^3) + t_D(n) = O(t_D(n)^3) \le t(n)\). Also notice that \(\varPi \) can be described using \(4\log \ell + \ell - m + 1 + O(1)\) bits, and we fix c to be the constant such that \(\varPi \) can be described using \(4\log \ell + \ell - m + c\) bits.
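
The linear-algebra step can be illustrated with the following Python sketch (ours, not the paper's; the names solve_affine_gf2 and decode are hypothetical). It uses only Gaussian elimination over GF(2) to find an affine solution v of \(Av = b\) (equivalently \(Av + b = 0^m\)) and a basis of the kernel of A, and then reconstructs s from the short coordinate vector e; the sketch assumes the linear system is consistent, as it is when \(h_\sigma \) is good.

    def solve_affine_gf2(A, b):
        # Return (v, basis): v with A v = b over GF(2), and a basis of ker(A).
        # A is a list of m rows of ell bits; b is a list of m bits.
        m, ell = len(A), len(A[0])
        M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix [A | b]
        pivots, r = [], 0
        for c in range(ell):                               # reduced row echelon form
            pr = next((i for i in range(r, m) if M[i][c]), None)
            if pr is None:
                continue
            M[r], M[pr] = M[pr], M[r]
            for i in range(m):
                if i != r and M[i][c]:
                    M[i] = [a ^ p for a, p in zip(M[i], M[r])]
            pivots.append((r, c))
            r += 1
        v = [0] * ell                                      # particular solution: free variables set to 0
        for pr, pc in pivots:
            v[pc] = M[pr][ell]
        pivot_cols = {pc for _, pc in pivots}
        basis = []                                         # one kernel basis vector per free column
        for fc in range(ell):
            if fc in pivot_cols:
                continue
            vec = [0] * ell
            vec[fc] = 1
            for pr, pc in pivots:
                vec[pc] = M[pr][fc]
            basis.append(vec)
        return v, basis

    def decode(sigma, e):
        # Recover s = v + sum_i e_i * b_i from the coordinate vector e, as in Claim 6;
        # the short program would then output M(1^n, s).
        A, b = sigma
        v, basis = solve_affine_gf2(A, b)
        s = v[:]
        for ei, bv in zip(e, basis):
            if ei:
                s = [si ^ bi for si, bi in zip(s, bv)]
        return s

Any fixed canonical ordering of the free columns makes the coordinate vector e well-defined; the (unshown) encoder obtains e by expressing \(s + v\) in the basis \((b_1, \ldots , b_d)\).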

Finally, we are ready to show that \(p_x \le \delta (n) 2^{-pK^t(x)}\). Towards this, we will prove that

$$pK^t(x) \le O(\log \ell ) + \lceil \log 1/p_x \rceil $$

and the aforementioned inequality will then follow if we set \(\delta (n) = \ell ^{O(1)} = t_D(n)^{O(1)}\) to be a sufficiently large polynomial. Consider a random string \(r \in \{0,1\}^{2n(\ell m + m)}\), which we view as \(r = \sigma _1 || \sigma _2 || \ldots || \sigma _{2n}\) where each \(\sigma _i\) is a description of a random hash function \(h_{\sigma _i} \leftarrow \mathcal {H}^\ell _m\). By Claim 5, with probability at least \(1 - 2^{-2n} \ge 2/3\), there exists \(i \in [2n]\) such that \(h_{\sigma _i}\) is a good hash function. By Claim 6, there exists a program \(\varPi '\) that, on input \(h_{\sigma _i}\), outputs the string x. Thus, let \(\varPi \) be a program with the index i and \(\varPi '\) hardcoded, which on input r simply outputs \(\varPi '(h_{\sigma _i})\). Note that \(\varPi \) can be implemented using \(O(\log \ell ) + \lceil \log 1/p_x \rceil \) bits, and it terminates within time \(O(\ell ^3)\). Picking \(\gamma (n) = O(n^3)\) to be a suitable polynomial, it follows that \(\varPi \) runs in time \(O(\ell ^3) \le \gamma (\ell ) \le \gamma (t_D(n)) \le t(n)\).
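
Spelled out (this is our accounting, following Claim 6 and the choice \(m = \lceil \log |S| \rceil - 5\) with \(|S| = 2^{\ell } p_x\)):

$$pK^t(x) \le O(\log \ell ) + (\ell - m + O(1)) = O(\log \ell ) + \ell - \lceil \log |S| \rceil + O(1) \le O(\log \ell ) + \lceil \log 1/p_x \rceil ,$$

where the \(O(\log \ell )\) term accounts for the hardcoded index i, the integers n and \(\ell \), and the code of M, the \((\ell - m + O(1))\)-bit term is the coordinate vector e, and the last inequality uses \(\ell - \lceil \log |S| \rceil \le \ell - \log |S| = \log 1/p_x \le \lceil \log 1/p_x \rceil \) (absorbing the additive constant into the \(O(\log \ell )\) term). Exponentiating, \(2^{-pK^t(x)} \ge p_x \cdot \ell ^{-O(1)}\), i.e., \(p_x \le \delta (n) 2^{-pK^t(x)}\) for a suitable polynomial \(\delta (n) = t_D(n)^{O(1)}\), as required.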

We now prove Lemma 48; we first recall its statement.

Lemma A5

(Lemma 48, restated). Assume that \(\textsf{E}\not \subseteq \textsf{ioNSIZE}[2^{\varOmega (n)}]\). There exists a polynomial \(\gamma '\) such that for all polynomial \(t'\), any \(t_D\)-time samplable ensemble \(\mathcal {D}\) is polynomially bounded by \(K^{t'}\) if \(t'(n) \ge \gamma '(t_D(n))\).

Proof

The idea behind our proof is to derandomize the hash function used in Lemma A1. This proof therefore relies heavily on the proof of Lemma A1, and we assume familiarity with it.

Let \(t, t_D, \mathcal {D}, M, x, n, \ell , p_x, S, m, \mathcal {H}^\ell _m, h_\sigma \) be as in Lemma A1. The proof of Lemma A1 shows that (1) with probability at least 0.5, a random hash function is “good” (as defined in Lemma A1) and (2) if a hash function is good, there exists a small program \(\varPi \) that produces x on input the hash function within time \(\textsf{poly}(\ell )\). Note that for the purpose of derandomization, the probability 0.5 is good enough for us, and we do not need the parallel repetition performed at the end of the proof of Lemma A1.

Towards derandomizing \(h_\sigma \), we first show that whether \(h_\sigma \) is good can be verified by a non-deterministic circuit. Consider a non-deterministic circuit \(N_x\) with the string x hardcoded in it. Recall that a hash function \(h_\sigma \) is good if (1) \(\exists s \in S\) such that \(h_\sigma (s) = 0^m\) and (2) \(|h_\sigma ^{-1}(0^m)| \le 2\cdot 2^{\ell - m}\). Condition (1) can be checked by guessing a string \(w \in \{0,1\}^\ell \) and verifying that \(h_\sigma (w) = 0^m\) and that \(M(1^n, w)\) outputs x. Condition (2) can be checked deterministically: recall that \(h_\sigma \) is a linear hash function defined by \(h_\sigma (x) = Ax + b\) where \(\sigma = (A, b)\); by standard linear algebra, whenever \(h_\sigma ^{-1}(0^m)\) is non-empty (as guaranteed by condition (1)), \(|h_\sigma ^{-1}(0^m)| = 2^d\) where d is the dimension of the kernel of A, and d can be computed using Gaussian elimination. Therefore, we can implement a non-deterministic circuit \(N_x\) such that \(N_x(\sigma ) = 1\) iff \(h_\sigma \) is good (so \(N_x\) accepts a uniformly random \(\sigma \) with probability at least 0.5), and \(N_x\) is of size \(O(|\sigma |^2)\).
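
As an illustration (ours; the sampler M is modeled as an ordinary Python callable and all names are hypothetical), the checks that \(N_x\) performs on input \(\sigma \) together with a nondeterministic guess w can be written as follows; the rank of A over GF(2) gives \(d = \ell - \textrm{rank}(A)\), so condition (2) becomes \(d \le \ell - m + 1\).

    def gf2_rank(A):
        # Rank of a binary matrix over GF(2), via Gaussian elimination.
        rows = [row[:] for row in A]
        rank = 0
        for c in range(len(A[0])):
            pr = next((i for i in range(rank, len(rows)) if rows[i][c]), None)
            if pr is None:
                continue
            rows[rank], rows[pr] = rows[pr], rows[rank]
            for i in range(len(rows)):
                if i != rank and rows[i][c]:
                    rows[i] = [a ^ p for a, p in zip(rows[i], rows[rank])]
            rank += 1
        return rank

    def N_x_accepts(sigma, w, x, M, n, m):
        # (1) the guessed witness w hashes to 0^m and the sampler maps it to x;
        # (2) |h_sigma^{-1}(0^m)| = 2^(ell - rank(A)) <= 2 * 2^(ell - m).
        A, b = sigma
        ell = len(A[0])
        h_w = tuple((sum(a * wi for a, wi in zip(row, w)) + bi) % 2
                    for row, bi in zip(A, b))
        if h_w != (0,) * m or M(n, w) != x:
            return False
        d = ell - gf2_rank(A)            # dimension of ker(A)
        return d <= ell - m + 1

A non-deterministic circuit \(N_x\) then accepts \(\sigma \) iff some guess w makes this check pass, which (using that condition (1) guarantees \(h_\sigma ^{-1}(0^m)\) is non-empty) is exactly the goodness condition.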

Shaltiel and Umans [29] showed that, under the assumption that \(\textsf{E}\not \subseteq \textsf{ioNSIZE}[2^{\varOmega (n)}]\), there exists a PRG \(G : \{0,1\}^{O(\log l)} \rightarrow \{0,1\}^{l}\) running in time \(\textsf{poly}(l)\) such that for all \(l \in \mathbb {N}\) and all non-deterministic circuits C of size at most \(O(l^2)\), it holds that

$$|\Pr [C(G(\mathcal{U}_{O(\log l)})) = 1] - \Pr [C(\mathcal{U}_{l}) = 1]| \le \frac{1}{6}$$

It follows that G also fools \(N_x\). Thus, there exists a seed \(z \in \{0,1\}^{O(\log |\sigma |)}\) such that \(h_{G(z)}\) is a good hash function.

We are now ready to show that x has a short deterministic description. We consider a program \(\varPi \) with the seed z hardcoded in it. \(\varPi \) first computes \(\sigma = G(z)\). Since \(h_\sigma \) is a good hash function, as shown in the proof of Lemma A1 (Claim 6), there exists a program \(\varPi '\) of length at most

$$O(\log \ell ) + \lceil \log 1/p_x \rceil $$

that produces the string x on input the hash function description \(\sigma \) within time \(O(\ell ^3)\). \(\varPi \) also hardcodes the program \(\varPi '\) and simply runs it on \(\sigma \) to obtain x. Note that \(\varPi \)’s running time is bounded by the PRG’s running time (\(\textsf{poly}(|\sigma |)\)) plus the running time of \(\varPi '\) (\(O(\ell ^3)\)). So there exists a polynomial \(\gamma '\) such that \(\varPi \) runs in time \(\gamma '(\ell ) \le \gamma '(t_D(n))\). Consider any polynomial \(t'\) such that \(t'(n) \ge \gamma '(t_D(n))\). It follows that

$$K^{t'}(x) \le |z| + O(1) + O(\log \ell ) + \lceil \log 1/p_x \rceil \le \lceil \log 1/p_x \rceil + O(\log n)$$

which implies that there exists a polynomial \(\delta \) such that for all x, \(p_x \le \delta (n) 2^{-K^{t'}(x)}\). Note that this holds for any \(t_D\)-time samplable ensemble whenever \(t'(n) \ge \gamma '(t_D(n))\), which concludes the proof.

Copyright information

© 2023 International Association for Cryptologic Research

About this paper

Cite this paper

Liu, Y., Pass, R. (2023). One-Way Functions and the Hardness of (Probabilistic) Time-Bounded Kolmogorov Complexity w.r.t. Samplable Distributions. In: Handschuh, H., Lysyanskaya, A. (eds) Advances in Cryptology – CRYPTO 2023. CRYPTO 2023. Lecture Notes in Computer Science, vol 14082. Springer, Cham. https://doi.org/10.1007/978-3-031-38545-2_21

  • DOI: https://doi.org/10.1007/978-3-031-38545-2_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-38544-5

  • Online ISBN: 978-3-031-38545-2

  • eBook Packages: Computer Science, Computer Science (R0)
