Abstract
Does the information complexity of a function equal its communication complexity? We examine whether any currently known techniques can be used to separate the two notions. Ganor et al. [2014] recently provided such a separation in the distributional setting for a specific input distribution. We show that in the non-distributional setting, the relative discrepancy bound is smaller than the information complexity; hence, it cannot separate information and communication complexity. In addition, in the distributional case, we provide a linear program formulation for the relative discrepancy bound and relate it to variants of the partition bound, also resolving an open question regarding the relation between the partition bound and information complexity. Finally, we prove the equivalence of the adaptive relative discrepancy bound and the public-coin partition bound, which implies that the logarithm of the adaptive relative discrepancy bound is quadratically tight with respect to communication complexity.
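For context on the linear programs discussed above: the partition bound of Jain and Klauck [2010], of which the relative discrepancy formulation is a variant, can be sketched as the following LP (notation ours; R ranges over combinatorial rectangles, z over output values, and ε is the allowed error):

```latex
\begin{align*}
\mathrm{prt}_{\varepsilon}(f) \;=\; \min_{w \ge 0} \;
  & \sum_{(R,z)} w_{R,z} \\
\text{s.t.}\;
  & \sum_{R \,\ni\, (x,y)} w_{R,f(x,y)} \;\ge\; 1-\varepsilon
    && \forall (x,y) \in \mathrm{dom}(f), \\
  & \sum_{(R,z)\,:\; R \,\ni\, (x,y)} w_{R,z} \;=\; 1
    && \forall (x,y).
\end{align*}
```

The logarithm of the optimal value lower-bounds the randomized communication complexity R_ε(f); the relative discrepancy LP of the present paper modifies the constraints relative to an input distribution.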
References
- Ziv Bar-Yossef, T. S. Jayram, Ravi Kumar, and D. Sivakumar. 2004. An information statistics approach to data stream and communication complexity. J. Comput. System Sci. 68, 4 (2004), 702--732.
- Boaz Barak, Mark Braverman, Xi Chen, and Anup Rao. 2010. How to compress interactive communication. In Proceedings of the ACM Symposium on the Theory of Computing.
- Mark Braverman. 2012. Interactive information complexity. In Proceedings of the 44th Symposium on Theory of Computing. 505--524.
- Mark Braverman and Anup Rao. 2014. Information equals amortized communication. IEEE Transactions on Information Theory 60, 10 (2014), 6058--6069. Early version appeared in FOCS 2011, 748--757.
- Amit Chakrabarti, Anthony Wirth, Andrew Yao, and Yaoyun Shi. 2001. Informational complexity and the direct sum problem for simultaneous message complexity. In Proceedings of the IEEE Symposium on Foundations of Computer Science. 270--278.
- Thomas M. Cover and Joy A. Thomas. 1991. Elements of Information Theory. Wiley-Interscience, New York.
- R. M. Fano. 1949. The Transmission of Information. Technical Report 65. Research Laboratory of Electronics, MIT, Cambridge, MA.
- Anat Ganor, Gillat Kol, and Ran Raz. 2014a. Exponential separation of information and communication. Electronic Colloquium on Computational Complexity 21 (2014), 49. Retrieved from http://eccc.hpi-web.de/report/2014/049.
- Anat Ganor, Gillat Kol, and Ran Raz. 2014b. Exponential separation of information and communication for boolean functions. Electronic Colloquium on Computational Complexity, Vol. 113. Retrieved from http://eccc.hpi-web.de/report/2014/113/.
- Anat Ganor, Gillat Kol, and Ran Raz. 2016. Exponential separation of communication and external information. In Proceedings of the 48th Annual ACM SIGACT Symposium on Theory of Computing. ACM, 977--986.
- Dmitry Gavinsky and Shachar Lovett. 2014. En route to the log-rank conjecture: New reductions and equivalent formulations. In Proceedings of the 41st International Colloquium on Automata, Languages, and Programming. 514--524.
- Prahladh Harsha, Rahul Jain, David A. McAllester, and Jaikumar Radhakrishnan. 2007. The communication complexity of correlation. In Proceedings of the 22nd Annual IEEE Conference on Computational Complexity. 10--23.
- Rahul Jain and Hartmut Klauck. 2010. The partition bound for classical communication complexity and query complexity. In Proceedings of the IEEE Conference on Computational Complexity. 1--28.
- Rahul Jain, Troy Lee, and Nisheeth K. Vishnoi. 2014. A quadratically tight partition bound for classical communication complexity and query complexity. CoRR abs/1401.4512 (2014). http://arxiv.org/abs/1401.4512.
- Rahul Jain, Jaikumar Radhakrishnan, and Pranab Sen. 2003. A direct sum theorem in communication complexity via message compression. In Proceedings of the 30th International Colloquium on Automata, Languages and Programming. 300--315.
- Rahul Jain, Jaikumar Radhakrishnan, and Pranab Sen. 2008. Optimal direct sum and privacy trade-off results for quantum and classical communication complexity. CoRR abs/0807.1267 (2008), 285--296. http://arxiv.org/abs/0807.1267.
- Iordanis Kerenidis, Sophie Laplante, Virginie Lerays, Jérémie Roland, and David Xiao. 2012. Lower bounds on information complexity via zero-communication protocols and applications. In Proceedings of the 53rd Annual IEEE Symposium on Foundations of Computer Science. 500--509.
- Claude E. Shannon. 1948. A mathematical theory of communication. The Bell System Technical Journal 27 (1948), 379--423, 623--656.
- Andrew Chi-Chih Yao. 1979. Some complexity questions related to distributive computing (preliminary report). In Proceedings of the 11th Annual ACM Symposium on Theory of Computing. 209--213.
- Andrew Chi-Chih Yao. 1983. Lower bounds by probabilistic arguments. In 24th Annual Symposium on Foundations of Computer Science. 420--428.