
On the Entropy Loss and Gap of Condensers

Published: 09 April 2019

Abstract

Many algorithms are proven to work under the assumption that they have access to a source of random, uniformly distributed bits. In practice, however, sources of randomness are often imperfect, giving n random bits that contain only k < n bits of min-entropy. The value n − k is called the entropy gap of the source. Randomness condensers are hash functions that map any such source to a shorter source with a reduced entropy gap g. The goal is to lose as little entropy as possible in this process. Condensers also have an error parameter ε and use a short seed of uniformly distributed bits, whose length we also wish to minimize.
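For reference, here is a sketch of the standard definitions behind these quantities (following, e.g., Vadhan 2012); the paper's exact conventions may differ in minor details. A function Cond: {0,1}^n × {0,1}^d → {0,1}^m is a (k, k', ε)-condenser if
\[
H_\infty(X) \ge k \;\Longrightarrow\; \mathrm{Cond}(X, U_d) \text{ is } \varepsilon\text{-close to some } Y \text{ with } H_\infty(Y) \ge k',
\]
where U_d denotes a uniform d-bit seed and closeness is measured in statistical distance. The entropy gap of the condenser is g = m − k' and its entropy loss is (k + d) − k'; an extractor is the special case k' = m (gap zero), and a lossless condenser is the special case k' = k + d (loss zero).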
In this work, we study the exact dependencies between the different parameters of seeded randomness condensers. We obtain a non-explicit upper bound, showing the existence of condensers with entropy loss log(1 + (log 1/ε)/g) + O(1) and seed length log((n − k)/(εg)) + O(1). In particular, this implies the existence of condensers with entropy gap O(log 1/ε) and constant entropy loss. This extends (with slightly improved parameters) the non-explicit upper bound for condensers presented in the work of Dodis et al. (2014), which gives condensers with entropy loss at least log log 1/ε. We also give a non-explicit upper bound for lossless condensers, which have entropy gap g ≥ (log 1/ε)/ε + O(1) and seed length log((n − k)/(ε²g)) + O(1).
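To see how the first bound yields the stated consequence, one can plug a gap of g = c · log(1/ε) into the loss bound above (a purely illustrative calculation using the bound as stated; the constant c is arbitrary):
\[
\text{loss} \;\le\; \log\!\Big(1 + \frac{\log(1/\varepsilon)}{c\,\log(1/\varepsilon)}\Big) + O(1)
\;=\; \log\!\Big(1 + \frac{1}{c}\Big) + O(1) \;=\; O(1).
\]
Conversely, as the gap g grows well beyond log(1/ε), the log(1 + ·) term vanishes and the loss is dominated by the additive O(1) term.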
Furthermore, we address an open question raised by Dodis et al. (2014), who gave an explicit construction of condensers with constant gap and O(log log 1/ε) loss, using seed length O(n log 1/ε). In the same article, they improve the seed length to O(k log k) and ask whether it can be further improved. In this work, we reduce the seed length of their construction to O(log(n/ε) · log(k/ε)) by a simple concatenation.
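The abstract does not spell out the concatenation, so the following is only a generic sketch of how parameters behave under sequential composition of seeded condensers (the paper's actual construction may differ): if Cond_1 has seed length d_1, error ε_1, and entropy loss L_1, and Cond_2 has seed length d_2, error ε_2, loss L_2, and gap g_2, then
\[
\mathrm{Cond}\big(x, (y_1, y_2)\big) \;=\; \mathrm{Cond}_2\big(\mathrm{Cond}_1(x, y_1),\, y_2\big)
\]
is a condenser with seed length d_1 + d_2, error at most ε_1 + ε_2, entropy loss at most L_1 + L_2, and entropy gap g_2. In particular, seed lengths add under composition, so chaining O(log(k/ε)) condensers, each with seed O(log(n/ε)), would give a seed bound of the stated shape.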
In the analysis, we use and prove a tight equivalence between condensers and extractors with multiplicative error. We note that a similar, but non-tight, equivalence was already proven by Dodis et al. (2014) using a weaker variant of extractors called unpredictability extractors. We also remark that this equivalence underlies the work of Ben-Aroya et al. (2016) and later work on explicit two-source extractors, and we believe it is interesting in its own right.
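The exact definition of extractors with multiplicative error is given in the paper; the intuition for why such an equivalence is natural can be sketched via the following folklore characterization of closeness to high min-entropy (stated here as background, not as the paper's formulation): a distribution Y over {0,1}^m is ε-close to a distribution with min-entropy m − g if and only if
\[
\Pr[Y \in T] \;\le\; 2^{g}\cdot\frac{|T|}{2^m} + \varepsilon
\qquad \text{for every test } T \subseteq \{0,1\}^m .
\]
Thus a condenser with gap g and error ε hits every test with probability at most a 2^g multiplicative factor above the uniform probability, plus an additive ε, which is the kind of guarantee one expects from an extractor with multiplicative error 2^g.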

References

[1]
Avraham Ben-Aroya, Dean Doron, and Amnon Ta-Shma. 2016. Explicit two-source extractors for near-logarithmic min-entropy. Electronic Colloquium on Computational Complexity (ECCC), Vol. 23, Report 88.
[2]
Gil Cohen. 2017. Towards optimal two-source extractors and Ramsey graphs. In Proceedings of the 49th ACM SIGACT Symposium on Theory of Computing. ACM, 1157--1170.
[3]
Yevgeniy Dodis, Krzysztof Pietrzak, and Daniel Wichs. 2014. Key derivation without entropy waste. In Proceedings of the International Conference on the Theory and Applications of Cryptographic Techniques. Springer, 93--110.
[4]
Venkatesan Guruswami, Christopher Umans, and Salil Vadhan. 2009. Unbalanced expanders and randomness extractors from Parvaresh--Vardy codes. J. ACM 56, 4 (2009), 20.
[5]
Wassily Hoeffding. 1963. Probability inequalities for sums of bounded random variables. J. Amer. Statist. Assoc. 58, 301 (1963), 13--30. http://www.jstor.org/stable/2282952.
[6]
Xin Li. 2017. Improved non-malleable extractors, non-malleable codes, and independent source extractors. In Proceedings of the 49th ACM SIGACT Symposium on Theory of Computing. ACM, 1144--1156.
[7]
Jaikumar Radhakrishnan and Amnon Ta-Shma. 2000. Bounds for dispersers, extractors, and depth-two superconcentrators. SIAM J. Disc. Math. 13, 1 (2000), 2--24.
[8]
Ran Raz, Omer Reingold, and Salil Vadhan. 2002. Extracting all the randomness and reducing the error in Trevisan’s extractors. J. Comput. System Sci. 65, 1 (2002), 97--128.
[9]
Amnon Ta-Shma and Christopher Umans. 2006. Better lossless condensers through derandomized curve samplers. In Proceedings of the 47th IEEE Symposium on Foundations of Computer Science (FOCS’06). IEEE, 177--186.
[10]
Amnon Ta-Shma and Christopher Umans. 2012. Better condensers and new extractors from Parvaresh-Vardy codes. In Proceedings of the 27th IEEE Conference on Computational Complexity (CCC’12). IEEE, 309--315.
[11]
Amnon Ta-Shma, Christopher Umans, and David Zuckerman. 2001. Loss-less condensers, unbalanced expanders, and extractors. In Proceedings of the 33rd ACM Symposium on Theory of Computing. ACM, 143--152.
[12]
Salil P. Vadhan. 2012. Pseudorandomness. Foundations and Trends® in Theoretical Computer Science 7, 1--3 (2012), 1--336.

Cited By

  • (2024) Improved Condensers for Chor-Goldreich Sources. In 2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS), 1513--1549. DOI: 10.1109/FOCS61266.2024.00096. Online publication date: 27-Oct-2024.
  • (2023) Almost Chor-Goldreich Sources and Adversarial Random Walks. In Proceedings of the 55th Annual ACM Symposium on Theory of Computing (STOC), 1--9. DOI: 10.1145/3564246.3585134. Online publication date: 2-Jun-2023.

Published In

ACM Transactions on Computation Theory  Volume 11, Issue 3
September 2019
164 pages
ISSN:1942-3454
EISSN:1942-3462
DOI:10.1145/3323875

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 April 2019
Accepted: 01 February 2019
Revised: 01 January 2019
Received: 01 January 2018
Published in TOCT Volume 11, Issue 3

Author Tags

  1. Randomness extractors
  2. entropy gap
  3. entropy loss
  4. key derivation
  5. randomness condensers
  6. unpredictability extractors
