DOI: 10.1145/3188745.3188752

The adaptive complexity of maximizing a submodular function

Published: 20 June 2018

Abstract

In this paper we study the adaptive complexity of submodular optimization. Informally, the adaptive complexity of a problem is the minimal number of sequential rounds required to achieve a constant factor approximation when polynomially-many queries can be executed in parallel at each round. Adaptivity is a fundamental concept that is heavily studied in computer science, largely due to the need for parallelizing computation. Somewhat surprisingly, very little is known about adaptivity in submodular optimization. For the canonical problem of maximizing a monotone submodular function under a cardinality constraint, to the best of our knowledge, all that is known to date is that the adaptive complexity is between 1 and Ω(n).
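
To make the notion of an adaptive round concrete, here is a minimal Python sketch; it is not taken from the paper, and the coverage function f, the helper names, and the example data are assumptions made only for illustration. Queries issued within one round may not depend on each other's answers, so the classic greedy below needs k adaptive rounds, while the top-k-singletons baseline issues all n of its value queries in a single non-adaptive round.

    # Toy monotone submodular function (illustrative only): coverage of a set family.
    def f(S, sets):
        return len(set().union(*(sets[i] for i in S)))

    def greedy(sets, k):
        # k adaptive rounds: each round's value queries depend on the current solution S.
        S = set()
        for _ in range(k):
            gains = {i: f(S | {i}, sets) - f(S, sets)
                     for i in range(len(sets)) if i not in S}  # these queries can run in parallel
            S.add(max(gains, key=gains.get))
        return S

    def top_k_singletons(sets, k):
        # A single non-adaptive round: all n singleton queries are issued at once.
        vals = {i: f({i}, sets) for i in range(len(sets))}
        return set(sorted(vals, key=vals.get, reverse=True)[:k])

    sets = [{1, 2}, {2, 3}, {4}, {1, 4, 5}]
    print(greedy(sets, 2), top_k_singletons(sets, 2))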
Our main result in this paper is a tight characterization showing that the adaptive complexity of maximizing a monotone submodular function under a cardinality constraint is Θ(log n):
• We describe an algorithm which requires O(log n) sequential rounds and achieves an approximation that is arbitrarily close to 1/3;
• We show that no algorithm can achieve an approximation better than O(1 / log n) with fewer than O(log n / log log n) rounds.
Thus, when allowing for parallelization, our algorithm achieves a constant factor approximation exponentially faster than any existing algorithm for submodular maximization.
Importantly, the approximation algorithm is achieved via adaptive sampling and complements a recent line of work on optimization of functions learned from data. In many cases we do not know the functions we optimize and must learn them from labeled samples. Recent results show that no algorithm can obtain a constant factor approximation guarantee using polynomially-many labeled samples drawn from any distribution, as in the PAC and PMAC models. Since learning with non-adaptive samples over any distribution results in a sharp impossibility, we consider learning with adaptive samples, where the learner obtains poly(n) samples drawn from a distribution of her choice in every round. Our result implies that in the realizable case, where there is a true underlying function generating the data, Θ(log n) batches of adaptive samples are necessary and sufficient to approximately “learn to optimize” a monotone submodular function under a cardinality constraint.
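
The abstract credits the O(log n)-round algorithm to adaptive sampling without detailing it here, so the sketch below only illustrates, under placeholder assumptions, the kind of round structure such an approach suggests: in every round, draw many random feasible blocks from the surviving ground set (all of these value queries can be evaluated in parallel), filter out elements whose average marginal contribution is low, and commit to a block of high-value elements. The sample size, threshold, and selection rule are placeholders, not the paper's procedure, and the sketch carries no approximation guarantee.

    import random

    def adaptive_sampling_sketch(f, N, k, num_rounds=10, samples=50):
        # X: surviving candidate elements, S: current solution (placeholder round structure).
        X, S = set(N), set()
        block = max(1, k // num_rounds)  # elements committed per round (arbitrary choice)
        for _ in range(num_rounds):
            if not X or len(S) >= k:
                break
            # One adaptive round: every query below depends only on (S, X) as fixed at the
            # start of the round, so the samples * |X| queries could be evaluated in parallel.
            R = [set(random.sample(sorted(X), min(block, len(X)))) for _ in range(samples)]
            gain = {a: sum(f(S | r | {a}) - f(S | r) for r in R) / samples for a in X}
            threshold = sum(gain.values()) / len(gain)     # placeholder filtering rule
            X = {a for a in X if gain[a] >= threshold}     # discard low-value elements
            add = set(sorted(X, key=gain.get, reverse=True)[:min(block, k - len(S))])
            S |= add
            X -= add
        return S

    # Example use with the toy coverage function from the previous sketch:
    # adaptive_sampling_sketch(lambda T: f(T, sets), range(len(sets)), k=2)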

Supplementary Material

MP4 File (7c-5.mp4)




    Published In

    STOC 2018: Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing
    June 2018
    1332 pages
    ISBN: 9781450355599
    DOI: 10.1145/3188745


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 June 2018


    Author Tags

    1. Adaptivity
    2. adaptive sampling
    3. parallel algorithms
    4. submodular optimization

    Qualifiers

    • Research-article

    Funding Sources

    • BSF grant
    • NSF CAREER
    • NSF grant

    Conference

    STOC '18: Symposium on Theory of Computing
    June 25 - 29, 2018
    Los Angeles, CA, USA

    Acceptance Rates

    Overall Acceptance Rate 1,469 of 4,586 submissions, 32%



