
Information Lower Bounds via Self-Reducibility

Published in Theory of Computing Systems

Abstract

We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by Kerenidis et al. (2012) and answers an open problem of Chakrabarti et al. (2012). In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by Braverman and Weinstein (2011). Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way connection. Whereas numerous past results (Chakrabarti et al. 2001; Bar-Yossef et al. 2004; Barak et al. 2010) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
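For readers unfamiliar with the two functions, the following minimal sketch (ours, not from the paper) spells out IP_n and GHD on bit strings of length n; it assumes the standard √n gap parameter for the GHD promise.

```python
import math

def ip(x, y):
    """Inner Product mod 2 of two equal-length bit strings."""
    return sum(a & b for a, b in zip(x, y)) % 2

def ghd(x, y):
    """Gap Hamming Distance with the standard sqrt(n) gap:
    output 1 if the Hamming distance is at least n/2 + sqrt(n),
    output 0 if it is at most n/2 - sqrt(n); inputs falling
    inside the gap violate the promise (returned as None here)."""
    n = len(x)
    d = sum(a != b for a, b in zip(x, y))
    if d >= n / 2 + math.sqrt(n):
        return 1
    if d <= n / 2 - math.sqrt(n):
        return 0
    return None  # promise violated
```

In the communication setting, Alice holds x, Bob holds y, and the lower bounds concern how much information a protocol computing these values must reveal about the inputs.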



Fig. 1


Notes

  1. A preliminary version of this paper appeared in Computer Science Symposium in Russia (CSR’13).

  2. See e.g. [2].

References

  1. Ada, A., Chattopadhyay, A., Cook, S., Fontes, L., Koucky, M., Pitassi, T.: The hardness of being private. In: CCC, pp. 192–202. IEEE (2012)

  2. Alon, N., Spencer, J.: The Probabilistic Method. Wiley (1992)

  3. Bar-Yossef, Z., Jayram, T. S., Kumar, R., Sivakumar, D.: An information statistics approach to data stream and communication complexity. J. Comput. Syst. Sci. 68(4), 702–732 (2004)


  4. Barak, B., Braverman, M., Chen, X., Rao, A.: How to compress interactive communication. In: STOC, pp. 67–76 (2010)

  5. Braverman, M.: Interactive information complexity. In: STOC, pp. 505–524 (2012)

  6. Braverman, M., Garg, A., Pankratov, D., Weinstein, O.: From information to exact communication. Electronic Colloquium on Computational Complexity (ECCC) 19(171) (2012)

  7. Braverman, M., Moitra, A.: An information complexity approach to extended formulations. Electronic Colloquium on Computational Complexity (ECCC) 19, 131 (2012)


  8. Braverman, M., Rao, A.: Information equals amortized communication. arXiv:1106.3595 (2010)

  9. Braverman, M., Weinstein, O.: A discrepancy lower bound for information complexity. Electronic Colloquium on Computational Complexity (ECCC) 18, 164 (2011)


  10. Chakrabarti, A., Kondapally, R., Wang, Z.: Information complexity versus corruption and applications to orthogonality and gap-hamming. arXiv:1205.0968 (2012)

  11. Chakrabarti, A., Regev, O.: An optimal lower bound on the communication complexity of gap-hamming-distance. In: STOC, pp. 51–60 (2011)

  12. Chakrabarti, A., Shi, Y., Wirth, A., Yao, A.: Informational complexity and the direct sum problem for simultaneous message complexity. In: FOCS, pp. 270–278 (2001)

  13. Chor, B., Goldreich, O.: Unbiased bits from sources of weak randomness and probabilistic communication complexity. SIAM J. Comput. 17(2), 230–261 (1988)


  14. Ganor, A., Kol, G., Raz, R.: Exponential separation of information and communication for boolean functions. In: STOC, pp. 557–566 (2015)

  15. Huffman, D.: A method for the construction of minimum-redundancy codes. Proc. IRE 40(9), 1098–1101 (1952)


  16. Kerenidis, I., Laplante, S., Lerays, V., Roland, J., Xiao, D.: Lower bounds on information complexity via zero-communication protocols and applications. arXiv:1204.1505 (2012)

  17. Klauck, H.: Quantum and approximate privacy. Theory Comput. Syst. 37(1), 221–246 (2004)


  18. Kushilevitz, E., Nisan, N.: Communication Complexity. Cambridge University Press, Cambridge (1997)


  19. McGregor, A., Mironov, I., Pitassi, T., Reingold, O., Talwar, K., Vadhan, S.: The limits of two-party differential privacy. In: FOCS, pp. 81–90 (2010)

  20. Shannon, C. E.: A mathematical theory of communication. Bell System Technical Journal 27 (1948). Monograph B-1598

  21. Yao, A.C.-C.: Some complexity questions related to distributive computing (preliminary report). In: STOC, pp. 209–213 (1979)


Author information


Corresponding author

Correspondence to Denis Pankratov.

Additional information

Research supported in part by an Alfred P. Sloan Fellowship, an NSF CAREER award, and a Turing Centenary Fellowship.


About this article


Cite this article

Braverman, M., Garg, A., Pankratov, D. et al. Information Lower Bounds via Self-Reducibility. Theory Comput Syst 59, 377–396 (2016). https://doi.org/10.1007/s00224-015-9655-z
