Abstract:
We present an information-theoretic approach to lower bound the oracle complexity of nonsmooth black-box convex optimization, unifying previous lower bounding techniques by identifying a combinatorial problem, namely string guessing, as a single source of hardness. As a measure of complexity, we use distributional oracle complexity, which subsumes randomized oracle complexity as well as worst-case oracle complexity. We obtain strong lower bounds on distributional oracle complexity for the box [-1,1]^{n}, as well as for the L^{p}-ball for p \geq 1 (in both the low-scale and large-scale regimes), matching worst-case upper bounds, and hence we close the gap between distributional complexity (and, in particular, randomized complexity) and worst-case complexity. Furthermore, the bounds remain essentially the same for high-probability and bounded-error oracle complexity, and even for the combination of the two, i.e., bounded-error high-probability oracle complexity. This considerably extends the applicability of known bounds.
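As a brief illustrative sketch of why distributional lower bounds subsume the other two measures (this chain of inequalities is standard and is not spelled out in the abstract; the notation compl_{\mu}, compl_{rand}, compl_{wc} is ours, not the paper's): for any distribution \mu over problem instances,

\[
  \operatorname{compl}_{\mu} \;\le\; \operatorname{compl}_{\mathrm{rand}} \;\le\; \operatorname{compl}_{\mathrm{wc}},
\]

i.e., the expected cost of the best deterministic algorithm on inputs drawn from \mu is at most the worst-case expected cost of the best randomized algorithm, which in turn is at most the worst-case cost of the best deterministic algorithm. Hence any lower bound on the distributional (left-hand) quantity transfers to randomized and worst-case oracle complexity.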
Published in: IEEE Transactions on Information Theory (Volume: 63, Issue: 7, July 2017)