Algorithms for Cardinality-Constrained Monotone DR-Submodular Maximization with Low Adaptivity and Query Complexity

Journal of Optimization Theory and Applications

Abstract

Submodular maximization is an NP-hard combinatorial optimization problem that arises regularly in machine learning and data mining over large-scale data sets. Query complexity and adaptive complexity are two important measures of the running time of approximation algorithms: the adaptivity of an algorithm is the number of sequential rounds it makes, where each round may execute polynomially many function evaluations in parallel. Together, these two concepts quantify the efficiency and practicality of parallel computation. In this paper, we consider the problem of maximizing a nonnegative monotone DR-submodular function over a bounded integer lattice subject to a cardinality constraint in the value oracle model. Prior to our work, Soma and Yoshida (Math Program 172:539–563, 2018) studied this problem and presented an approximation algorithm with an almost optimal approximation ratio, whose adaptivity and query complexity are of the same order. We develop two novel algorithms, the low query algorithm and the low adaptivity algorithm, which match the approximation ratio of Soma and Yoshida (2018) while setting new records for query complexity and adaptivity. In terms of techniques, compared with the threshold greedy of Soma and Yoshida (2018), the low query algorithm reduces the number of candidate thresholds and thereby improves both the adaptivity and the query complexity. The low adaptivity algorithm further integrates a vector sequencing technique that reduces the adaptive complexity exponentially at the cost of only a logarithmic increase in oracle queries.
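
To illustrate the kind of decreasing-threshold greedy the abstract refers to, the following is a minimal sketch for monotone DR-submodular maximization over a bounded integer lattice with a cardinality constraint. The function name threshold_greedy, its parameters, and the binary-search step rule are illustrative assumptions; this is not the paper's exact low query or low adaptivity algorithm.

```python
import math

def threshold_greedy(f, n, c, r, eps=0.1):
    """Decreasing-threshold greedy sketch for a monotone DR-submodular f
    over the integer lattice {0, ..., c[e]}^n with ||x||_1 <= r.

    f   : callable taking a length-n tuple, returning a nonnegative value.
    c   : per-coordinate upper bounds.
    r   : cardinality budget.
    eps : accuracy parameter controlling the threshold schedule.
    """
    x = [0] * n
    base = f(tuple(x))
    # The largest single-unit marginal gain fixes the starting threshold.
    d = max(f(tuple(1 if i == e else 0 for i in range(n))) - base for e in range(n))
    theta = d
    while theta >= (eps / r) * d and sum(x) < r:
        for e in range(n):
            budget = min(c[e] - x[e], r - sum(x))
            if budget <= 0:
                continue
            fx = f(tuple(x))
            # DR-submodularity makes the average gain (f(x + k*e_e) - f(x)) / k
            # non-increasing in k, so binary search finds the largest step
            # whose average gain still meets the current threshold.
            lo, hi, step = 1, budget, 0
            while lo <= hi:
                mid = (lo + hi) // 2
                y = list(x)
                y[e] += mid
                if f(tuple(y)) - fx >= mid * theta:
                    step, lo = mid, mid + 1
                else:
                    hi = mid - 1
            x[e] += step
        theta *= (1 - eps)  # geometrically decrease the threshold
    return tuple(x)

# Example with a separable concave (hence DR-submodular) objective.
if __name__ == "__main__":
    f = lambda x: sum(math.sqrt(v) for v in x)
    print(threshold_greedy(f, n=3, c=[4, 4, 4], r=6))
```

Per the abstract, the paper's low query algorithm trims the set of candidate thresholds relative to such a scheme, and the low adaptivity algorithm adds a vector sequencing step so that many such updates can be resolved within a single adaptive round.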

Data availability

Not applicable.

References

  1. Agarwal, A., Agarwal, S., Assadi, S., Khanna, S.: Learning with limited rounds of adaptivity: coin tossing, multi-armed bandits, and ranking from pairwise comparisons. Conf. Learn. Theory 65, 39–75 (2017)

  2. Alaei, S., Makhdoumi, A., Malekian, A.: Maximizing sequence-submodular functions and its application to online advertising. Manag. Sci. 67(10), 6030–6054 (2021)

  3. Amiridi, M., Kargas, N., Sidiropoulos, N.D.: Information-theoretic feature selection via tensor decomposition and submodularity. IEEE Trans. Signal Process. 69, 6195–6205 (2021)

  4. Anari, N., Goel, G., Nikzad, A.: Mechanism design for crowdsourcing: an optimal \(1-1/e\) competitive budget-feasible mechanism for large markets. In: 2014 IEEE 55th Annual Symposium on Foundations of Computer Science (FOCS), pp. 266–275 (2014)

  5. Asadpour, A., Niazadeh, R., Saberi, A., Shameli, A.: Sequential submodular maximization and applications to ranking an assortment of products. Chicago Booth Research Paper 20-26 (2020)

  6. Badanidiyuru, A., Mirzasoleiman, B., Karbasi, A., Krause, A.: Streaming submodular maximization: massive data summarization on the fly. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 671–680 (2014)

  7. Badanidiyuru, A., Vondrák, J.: Fast algorithms for maximizing submodular functions. In: Proceedings of the 2014 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 1497–1514 (2014)

  8. Balkanski, E., Breuer, A., Singer, Y.: Non-monotone submodular maximization in exponentially fewer iterations. Adv. Neural. Inf. Process. Syst. 31, 2353–2364 (2018)

  9. Balkanski, E., Rubinstein, A., Singer, Y.: An exponential speedup in parallel running time for submodular maximization without loss in approximation. In: Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 283–302 (2019)

  10. Balkanski, E., Rubinstein, A., Singer, Y.: An optimal approximation for submodular maximization under a matroid constraint in the adaptive complexity model. In: Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing (STOC), pp. 66–77 (2019)

  11. Balkanski, E., Singer, Y.: The adaptive complexity of maximizing a submodular function. In: Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing (STOC), pp. 1138–1151 (2018)

  12. Barbosa, R., Ene, A., Nguyen, H.L., Ward, J.: The power of randomization: Distributed submodular maximization on massive datasets. In: Proceedings of the 32nd International Conference on Machine Learning, vol. 37, pp. 1236–1244 (2015)

  13. Braverman, M., Mao, J., Weinberg, S.M.: Parallel algorithms for select and partition with noisy comparisons. In: Proceedings of the 48th Annual ACM Symposium on Theory of Computing, pp. 851–862 (2016)

  14. Breuer, A., Balkanski, E., Singer, Y.: The FAST algorithm for submodular maximization. Int. Conf. Mach. Learn. (PMLR) 119, 1134–1143 (2020)

  15. Chekuri, C., Quanrud, K.: Submodular function maximization in parallel via the multilinear relaxation. In: Proceedings of the 2019 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 303–322 (2019)

  16. Chen, X., Servedio, R.A., Tan, L.Y., Waingarten, E., Xie, J.: Settling the query complexity of non-adaptive junta testing. J. ACM 65, 1–18 (2018)

  17. Das, A., Kempe, D.: Algorithms for subset selection in linear regression. In: Proceedings of the 40th Annual ACM Symposium on Theory of Computing, pp. 45–54 (2008)

  18. Dean, J., Ghemawat, S.: Mapreduce: simplified data processing on large clusters. Commun. ACM 51, 107–113 (2008)

  19. El-Arini, K., Guestrin, C.: Beyond keyword search: discovering relevant scientific literature. In: Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 439–447 (2011)

  20. Ene, A., Nguyen, H.L.: A nearly-linear time algorithm for submodular maximization with a knapsack constraint. In: 46th International Colloquium on Automata, Languages, and Programming (ICALP), vol. 132, pp. 53:1–53:12 (2019)

  21. Ene, A., Nguyen, H.L.: A reduction for optimizing lattice submodular functions with diminishing returns. arXiv:1606.08362 (2016)

  22. Ene, A., Nguyen, H.L.: Submodular maximization with nearly-optimal approximation and adaptivity in nearly-linear time. In: Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 274–282 (2019)

  23. Epasto, A., Mirrokni, V., Zadimoghaddam, M.: Bicriteria distributed submodular maximization in a few rounds. In: Proceedings of the 29th ACM Symposium on Parallelism in Algorithms and Architectures, pp. 25–33 (2017)

  24. Fahrbach, M., Mirrokni, V., Zadimoghaddam, M.: Submodular maximization with nearly optimal approximation, adaptivity and query complexity. In: Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 255–273 (2019)

  25. Feige, U., Mirrokni, V.S., Vondrák, J.: Maximizing non-monotone submodular functions. SIAM J. Comput. 40, 1133–1153 (2011)

  26. Ghadimi, A., Beigy, H.: Deep submodular network: an application to multi-document summarization. Expert Syst. Appl. 152, 113392 (2020)

  27. Gong, S., Nong, Q., Bao, S., Fang, Q., Du, D.-Z.: A fast and deterministic algorithm for knapsack-constrained monotone DR-submodular maximization over an integer lattice. J. Glob. Optim. 85, 15–38 (2023)

  28. Indyk, P., Price, E., Woodruff, D.P.: On the power of adaptivity in sparse recovery. In: 2011 IEEE 52nd Annual Symposium on Foundations of Computer Science (FOCS), pp. 285–294 (2011)

  29. Kapralov, M., Post, I., Vondrák, J.: Online submodular welfare maximization: Greedy is optimal. In: Proceedings of the 2013 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 1216–1225 (2013)

  30. Kazemi, E., Zadimoghaddam, M., Karbasi, A.: Scalable deletion-robust submodular maximization: data summarization with privacy and fairness constraints. Int. Conf. Mach. Learn. 80, 2549–2558 (2018)

  31. Kempe, D., Kleinberg, J., Tardos, É.: Maximizing the spread of influence through a social network. In: Proceedings of the 9th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 137–146 (2003)

  32. Kothawade, S., Beck, N., Killamsetty, K., Iyer, R.: Similar: submodular information measures based active learning in realistic scenarios. Adv. Neural Inf. Process. Syst. 34, 18685–18697 (2021)

  33. Krause, A., Guestrin, C.: Near-optimal nonmyopic value of information in graphical models. arXiv:1207.1394 (2012)

  34. Krause, A., Singh, A.P., Guestrin, C.: Near-optimal sensor placements in Gaussian processes: theory, efficient algorithms and empirical studies. J. Mach. Learn. Res. 9, 235–284 (2008)

  35. Kumar, R., Moseley, B., Vassilvitskii, S., Vattani, A.: Fast greedy algorithms in mapreduce and streaming. ACM Trans. Parallel Comput. 2, 1–22 (2015)

  36. Lin, H., Bilmes, J.: A class of submodular functions for document summarization. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (HLT), vol. 1, pp. 510–520 (2011)

  37. Malings, C., Pozzi, M.: Submodularity issues in value-of-information-based sensor placement. Reliab. Eng. Syst. Saf. 183, 93–103 (2019)

  38. Minoux, M.: Accelerated greedy algorithms for maximizing submodular set functions. Optim. Tech. 7, 234–243 (1978)

  39. Mirzasoleiman, B., Badanidiyuru, A., Karbasi, A., Vondrák, J., Krause, A.: Lazier than lazy greedy. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 29, pp. 1812–1818 (2015)

  40. Mirzasoleiman, B., Karbasi, A., Sarkar, R., Krause, A.: Distributed submodular maximization: identifying representative elements in massive data. Adv. Neural Inf. Process. Syst. 26, 2049–2057 (2013)

  41. Mirzasoleiman, B., Karbasi, A., Krause, A.: Deletion-robust submodular maximization: data summarization with the right to be forgotten. Int. Conf. Mach. Learn. 70, 2449–2458 (2017)

  42. Mitrovic, S., Bogunovic, I., Norouzi-Fard, A., Tarnawski, J.M., Cevher, V.: Streaming robust submodular maximization: a partitioned thresholding approach. Adv. Neural Inf. Process. Syst. 30, 4557–4566 (2017)

  43. Namkoong, H., Sinha, A., Yadlowsky, S., Duchi, J.C.: Adaptive sampling probabilities for non-smooth optimization. Int. Conf. Mach. Learn. 70, 2574–2583 (2017)

  44. Nemhauser, G.L., Wolsey, L.A.: Best algorithms for approximating the maximum of a submodular set function. Math. Oper. Res. 3, 177–188 (1978)

  45. Nemhauser, G.L., Wolsey, L.A., Fisher, M.L.: An analysis of approximations for maximizing submodular set functions-I. Math. Program. 14, 265–294 (1978)

  46. Nisan, N., Wigderson, A.: Rounds in communication complexity revisited. SIAM J. Comput. 22(1), 211–219 (1993)

  47. Salha, G., Tziortziotis, N., Vazirgiannis, M.: Adaptive submodular influence maximization with myopic feedback. In: 2018 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM), pp. 455–462 (2018)

  48. Soma, T., Kakimura, N., Inaba, K., Kawarabayashi, K.: Optimal budget allocation: theoretical guarantee and efficient algorithm. In: Proceedings of the 31st International Conference on Machine Learning (PMLR), vol. 32, pp. 351–359 (2014)

  49. Soma, T., Yoshida, Y.: A generalization of submodular cover via the diminishing return property on the integer lattice. Adv. Neural Inf. Process. Syst. 28, 847–855 (2015)

  50. Soma, T., Yoshida, Y.: Maximizing monotone submodular functions over the integer lattice. Math. Program. 172, 539–563 (2018)

  51. Sviridenko, M.: A note on maximizing a submodular set function subject to a knapsack constraint. Oper. Res. Lett. 32, 41–43 (2004)

Acknowledgements

This research was supported in part by the National Natural Science Foundation of China under Grant Nos. 12171444 and 11871442 and was also supported in part by the Natural Science Foundation of Shandong Province under Grant No. ZR2019MA052 and the Fundamental Research Funds for the Central Universities.

Author information

Corresponding author

Correspondence to Qingqin Nong.

Additional information

Communicated by Antoine Deza.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Gong, S., Nong, Q., Fang, J. et al. Algorithms for Cardinality-Constrained Monotone DR-Submodular Maximization with Low Adaptivity and Query Complexity. J Optim Theory Appl 200, 194–214 (2024). https://doi.org/10.1007/s10957-023-02353-7
