
Can every randomized algorithm be derandomized?

Published: 21 May 2006
DOI: 10.1145/1132516.1132571

Abstract

Among the most important modern algorithmic techniques is the use of random decisions. Starting in the 1970s, many of the most significant results were randomized algorithms solving basic computational problems that had (until that time) resisted efficient deterministic computation (Ber72, SS79, Rab80, Sch80, Zip79, AKLLR). In contrast, much of the most exciting recent work has been on derandomizing these same algorithms, coming up with efficient deterministic versions, e.g., (AKS02, Rein05). This raises the question: can such results be obtained for all randomized algorithms? Will the remaining classical randomized algorithms be derandomized by similar techniques?

Clear but complicated answers to these questions have emerged from complexity-theoretic studies of randomized complexity classes (e.g., RP and BPP) and pseudo-random generators. These questions are inextricably linked to another basic problem in complexity: which functions require large circuits to compute?

In this talk, we'll survey some results from the theory of derandomization. I'll stress connections to other questions, especially circuit complexity, explicit extractors, hardness amplification, and error-correcting codes. Much of the talk is based on joint work with Valentine Kabanets and Avi Wigderson, but it will also include results by many other researchers.

A priori, possibilities concerning the power of randomized algorithms include:
Randomization always helps speed up intractable problems, i.e., EXP=BPP.
The extent to which randomization helps is problem-specific: depending on the problem, it can reduce complexity by anywhere from nothing at all to an exponential amount.
True randomness is never needed, and random choices can always be simulated deterministically, i.e., P=BPP.
Either of the last two possibilities seems plausible, but most consider the first wildly implausible. However, while a strong version of the middle possibility has been ruled out, the implausible first one is still open. Recent results indicate that the last, P=BPP, is both very likely to be the case and very difficult to prove. More precisely:
Either no problem in E has strictly exponential circuit complexity, or P=BPP. This seems to be strong evidence that, in fact, P=BPP, since otherwise circuits could always shortcut the running time of hard problems (every problem in E would have smaller-than-exponential circuits). (NW, BFNW, IW97, STV01, SU01, Uma02)
Either BPP=EXP, or every problem in BPP has a deterministic sub-exponential time algorithm that works on almost all instances. In other words, either randomness solves every hard problem, or it does not help exponentially, except on rare instances. This rules out strong problem-dependence, since if randomization helps exponentially for many instances of some problem, we can conclude that it helps exponentially for all intractable problems. (IW98)
If RP=P, then either the permanent problem requires super-polynomial algebraic circuits, or there is a problem in NEXP that has no polynomial-size Boolean circuit (IKW01, KI). That is, proving the last possibility requires proving a new circuit lower bound, and so is likely to be difficult. (Moreover, we do not need the full hypothesis that RP=P to obtain this conclusion: it suffices that the Schwartz-Zippel identity testing algorithm be derandomizable. Thus, we will not be able to derandomize even the "classic" algorithms without proving circuit lower bounds.)
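The identity test referred to in the last item is simple to state, and a minimal sketch helps make the conclusion concrete. The following Python fragment is illustrative only (the function name `is_identically_zero` and its parameters are hypothetical, not from any standard library): given black-box access to a polynomial of known total degree, evaluate it at random points of a large prime field; by the Schwartz-Zippel lemma, a nonzero polynomial of total degree d vanishes at a uniformly random point with probability at most d divided by the field size.

```python
import random

def is_identically_zero(poly, num_vars, degree, trials=20, prime=2**61 - 1):
    """Schwartz-Zippel style randomized identity test (illustrative sketch).

    `poly` is a black box mapping a tuple of field elements to an integer;
    we reduce its value modulo a large prime, so integer coefficients are
    assumed small relative to `prime`.  The zero polynomial always passes;
    a nonzero one slips through each trial with probability <= degree/prime.
    """
    for _ in range(trials):
        point = tuple(random.randrange(prime) for _ in range(num_vars))
        if poly(point) % prime != 0:
            return False      # witness point found: not the zero polynomial
    return True               # no witness in any trial: almost surely zero

# Example: (x + y)^2 - (x^2 + 2xy + y^2) is identically zero.
if __name__ == "__main__":
    p = lambda v: (v[0] + v[1]) ** 2 - (v[0] ** 2 + 2 * v[0] * v[1] + v[1] ** 2)
    print(is_identically_zero(p, num_vars=2, degree=2))   # prints True
```

Derandomizing this test means, roughly, deterministically producing a small set of evaluation points that catches every nonzero low-degree polynomial; the KI result says that doing so forces a new circuit lower bound.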
All of these results use the hardness-vs-randomness paradigm introduced by Yao (Yao82; see also BM, Levin): use a hard computational problem to define a small set of "pseudo-random" strings that no limited adversary can distinguish from random, and use these "pseudo-random" strings to replace the random choices in a probabilistic algorithm. The algorithm will not have enough time to distinguish the pseudo-random sequences from truly random ones, and so will behave the same as it would given truly random sequences.
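To make the paradigm concrete, here is a small illustrative sketch of the final step in Python (the names `toy_prg` and `derandomize` are mine, and the SHA-256 stretch map is only a placeholder with no hardness guarantee; the actual generators of NW and IW97 are built from a function that is hard for small circuits). If a generator stretches an O(log n)-bit seed into enough bits to fool the algorithm, then running the algorithm on the generator's output for every seed and taking a majority vote yields a deterministic simulation with only a polynomial slowdown.

```python
import hashlib
from itertools import product

def toy_prg(seed, out_len):
    # Placeholder stretch map from a short seed to out_len bits.  NOT a proven
    # pseudorandom generator; it only illustrates the interface G: {0,1}^s -> {0,1}^m.
    assert out_len <= 256, "the toy map only produces one SHA-256 block"
    digest = hashlib.sha256(bytes(seed)).digest()
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(out_len)]

def derandomize(bpp_algorithm, x, seed_len, rand_len):
    """Deterministic simulation of a two-sided-error randomized algorithm.

    `bpp_algorithm(x, r)` expects `rand_len` coin flips r.  We feed it the
    generator's output on each of the 2**seed_len seeds and take a majority
    vote.  If the generator fools the algorithm, the vote matches the answer
    the algorithm gives with true randomness, and the total cost is
    2**seed_len runs -- polynomial whenever seed_len = O(log n).
    """
    votes = 0
    for seed in product((0, 1), repeat=seed_len):
        r = toy_prg(seed, rand_len)
        votes += 1 if bpp_algorithm(x, r) else -1
    return votes > 0
```

The exponential cost in the seed length is why the hard problem must be hard enough to support logarithmic seeds; weaker hardness assumptions give only the subexponential-time simulations mentioned above.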

References

[1] M. Agrawal, N. Kayal, and N. Saxena. PRIMES is in P. Annals of Mathematics, 160(2):781--793, 2004.
[2] R. Aleliunas, R. Karp, R. Lipton, L. Lovász, and C. Rackoff. Random walks, universal traversal sequences, and the complexity of maze problems. In Proceedings of the 20th Annual IEEE Symposium on Foundations of Computer Science, pages 218--223, 1979.
[3] L. Babai, L. Fortnow, N. Nisan, and A. Wigderson. BPP has subexponential time simulations unless EXPTIME has publishable proofs. Computational Complexity, 3:307--318, 1993.
[4] E. R. Berlekamp. Factoring polynomials. In Proceedings of the 3rd Southeastern Conference on Combinatorics, Graph Theory, and Computing, pages 1--7, 1972.
[5] M. Blum and S. Micali. How to generate cryptographically strong sequences of pseudo-random bits. SIAM Journal on Computing, 13(4):850--864, 1984.
[6] R. Impagliazzo, V. Kabanets, and A. Wigderson. In search of an easy witness: Exponential time vs. probabilistic polynomial time. In Proceedings of the Sixteenth Annual IEEE Conference on Computational Complexity, pages 1--11, 2001.
[7] R. Impagliazzo and A. Wigderson. P = BPP if E requires exponential circuits: Derandomizing the XOR lemma. In Proceedings of the Twenty-Ninth Annual ACM Symposium on Theory of Computing, pages 220--229, 1997.
[8] R. Impagliazzo and A. Wigderson. Randomness vs. time: De-randomization under a uniform assumption. In Proceedings of the Thirty-Ninth Annual IEEE Symposium on Foundations of Computer Science, pages 734--743, 1998.
[9] D. Johnson. The NP-completeness column: An ongoing guide (12th article). Journal of Algorithms, 5:433--447, 1984.
[10] V. Kabanets and R. Impagliazzo. Derandomizing polynomial identity tests means proving circuit lower bounds. Computational Complexity, 13(1-2):1--46, 2004.
[11] L. A. Levin. One-way functions and pseudorandom generators. Combinatorica, 7(4):357--363, 1987.
[12] N. Nisan and A. Wigderson. Hardness vs. randomness. Journal of Computer and System Sciences, 49:149--167, 1994.
[13] N. Nisan and D. Zuckerman. Randomness is linear in space. Journal of Computer and System Sciences, 52(1):43--52, 1996.
[14] M. O. Rabin. Probabilistic algorithm for testing primality. Journal of Number Theory, 12:128--138, 1980.
[15] O. Reingold. Undirected ST-connectivity in log-space. In Proceedings of the Thirty-Seventh Annual ACM Symposium on Theory of Computing, pages 376--385, 2005.
[16] M. Santha and U. V. Vazirani. Generating quasi-random sequences from slightly random sources. In Proceedings of the 25th Annual IEEE Symposium on Foundations of Computer Science, pages 434--440, 1984.
[17] J. T. Schwartz. Fast probabilistic algorithms for verification of polynomial identities. Journal of the ACM, 27(4):701--717, 1980.
[18] R. Shaltiel and C. Umans. Simple extractors for all min-entropies and a new pseudo-random generator. In Proceedings of the Forty-Second Annual IEEE Symposium on Foundations of Computer Science, pages 648--657, 2001.
[19] M. Sipser. Extractors, randomness, or time versus space. Journal of Computer and System Sciences, 36(3):379--383, 1988.
[20] R. Solovay and V. Strassen. A fast Monte-Carlo test for primality. SIAM Journal on Computing, 6(1):84--85, 1977.
[21] M. Sudan, L. Trevisan, and S. Vadhan. Pseudorandom generators without the XOR lemma. Journal of Computer and System Sciences, 62(2):236--266, 2001. (Preliminary version in STOC '99.)
[22] M. Sudan. Decoding of Reed-Solomon codes beyond the error-correction bound. Journal of Complexity, 13(1):180--193, 1997.
[23] L. Trevisan. Extractors and pseudorandom generators. Journal of the ACM, 48(4):860--879, 2001. (Preliminary version in STOC '99.)
[24] L. Trevisan. List decoding using the XOR lemma. Electronic Colloquium on Computational Complexity, Technical Report TR03-042, 2003.
[25] C. Umans. Pseudo-random generators for all hardnesses. In Proceedings of the Thirty-Fourth Annual ACM Symposium on Theory of Computing, 2002.
[26] L. Valiant. Why is Boolean complexity theory difficult? In M. S. Paterson, editor, Boolean Function Complexity, volume 169 of London Mathematical Society Lecture Note Series, pages 84--94. Cambridge University Press, 1992.
[27] J. von Neumann. Various techniques used in connection with random digits. Applied Mathematics Series, 12:36--38, 1951.
[28] A. C. Yao. Theory and applications of trapdoor functions. In Proceedings of the Twenty-Third Annual IEEE Symposium on Foundations of Computer Science, pages 80--91, 1982.
[29] R. E. Zippel. Probabilistic algorithms for sparse polynomials. In Proceedings of the International Symposium on Symbolic and Algebraic Manipulation (EUROSAM '79), Lecture Notes in Computer Science, pages 216--226, 1979.
[30] D. Zuckerman. General weak random sources. In Proceedings of the 31st Annual IEEE Symposium on Foundations of Computer Science, pages 534--543, 1990.
[31] D. Zuckerman. Simulating BPP using a general weak random source. In Proceedings of the 32nd Annual IEEE Symposium on Foundations of Computer Science, pages 79--89, 1991.

Published In

STOC '06: Proceedings of the Thirty-Eighth Annual ACM Symposium on Theory of Computing, May 2006, 786 pages. ISBN: 1595931341. DOI: 10.1145/1132516.

Publisher: Association for Computing Machinery, New York, NY, United States

Author Tags

  1. algebraic circuit complexity
  2. circuit complexity
  3. complexity classes
  4. derandomization
  5. probabilistic algorithms
  6. pseudo-randomness


Conference: STOC '06, Symposium on Theory of Computing, May 21-23, 2006, Seattle, WA, USA
