
A new bounded degree hierarchy with SOCP relaxations for global polynomial optimization and conic convex semi-algebraic programs


Abstract

In this paper, we propose a bounded degree hierarchy of both primal and dual conic programming relaxations involving semi-definite and second-order cone constraints for solving a nonconvex polynomial optimization problem with a bounded feasible set. This hierarchy makes use of some key aspects of the convergent linear programming relaxations of polynomial optimization problems (Lasserre in Moments, positive polynomials and their applications, World Scientific, Singapore, 2010) associated with Krivine–Stengle’s certificate of positivity in real algebraic geometry and some advantages of the scaled diagonally dominant sum of squares (SDSOS) polynomials (Ahmadi and Hall in Math Oper Res, 2019. https://doi.org/10.1287/moor.2018.0962; Ahmadi and Majumdar in SIAM J Appl Algebra Geom 3:193–230, 2019). We show that the values of both primal and dual relaxations converge to the global optimal value of the original polynomial optimization problem under some technical assumptions. Our hierarchy, which extends the so-called bounded degree Lasserre hierarchy (Lasserre et al. in Eur J Comput Optim 5:87–117, 2017), has the useful feature that the size and the number of the semi-definite and second-order cone constraints of the relaxations are fixed and independent of the step or level of the approximation in the hierarchy. As a special case, we provide a convergent bounded degree second-order cone programming (SOCP) hierarchy for solving polynomial optimization problems. We then present finite convergence at step one of the SOCP hierarchy for classes of polynomial optimization problems. This includes one-step convergence for a new class of first-order SDSOS-convex polynomial programs. In this case, we also show how a global solution is recovered from the level one SOCP relaxation. We finally derive a corresponding convergent conic linear programming hierarchy for conic-convex semi-algebraic programs. Whenever the semi-algebraic set of the conic-convex program is described by concave polynomial inequalities, we show further that the values of the relaxation problems converge to the common value of the convex program and its Lagrangian dual under a constraint qualification.
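
For orientation, the two ingredients just mentioned admit the following standard descriptions (a schematic only; the exact certificate behind the proposed hierarchy, which combines truncated representations of this kind with the SDSOS cone, is given in the body of the paper). A polynomial is SDSOS when it is a sum of squares of binomials, a membership test that reduces to a second-order cone program (Ahmadi and Majumdar [2]), while the Krivine–Stengle representation expresses a polynomial that is positive on \(K=\{x:0\le g_j(x)\le 1,\ j=1,\ldots ,m\}\) as a nonnegative combination of products of the \(g_j\) and \(1-g_j\):

$$\begin{aligned} p\in \mathrm {SDSOS}\ &\Longleftrightarrow \ p(x)=\sum _{i}\bigl (a_i\,x^{\gamma _i}+b_i\,x^{\delta _i}\bigr )^2,\quad a_i,b_i\in {\mathbb {R}},\\ f>0\ \text {on}\ K\ &\Longrightarrow \ f=\sum _{(\alpha ,\beta )}\lambda _{\alpha \beta }\prod _{j=1}^{m}g_j^{\alpha _j}\,(1-g_j)^{\beta _j},\quad \lambda _{\alpha \beta }\ge 0, \end{aligned}$$

where the second implication holds provided \(K\) is compact, \(0\le g_j\le 1\) on \(K\) and the \(g_j\) generate the algebra of polynomials; truncating the exponents at \(|\alpha |+|\beta |\le d\) yields the level-\(d\) certificates underlying bounded degree hierarchies such as [27, 30].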


Notes

  1. This was kindly pointed out to us by one of the referees.

  2. It is worth noting that [2, Proposition 3.15] provides a further class of examples of convex quadratic functions that are not SDSOS.
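
     For instance, \(q(x)=(x_1+x_2+x_3)^2\) is convex (indeed a sum of squares) but not SDSOS: its unique Gram matrix in the basis \((x_1,x_2,x_3)\) is the all-ones matrix, which is not scaled diagonally dominant because the conditions \(d_i\ge d_j+d_k\) cannot hold for all \(i\) with positive scalings \(d_1,d_2,d_3\).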

References

  1. Ahmadi, A.A., Hall, G.: On the construction of converging hierarchies for polynomial optimization based on certificates of global positivity. Math. Oper. Res. (2019). https://doi.org/10.1287/moor.2018.0962

  2. Ahmadi, A.A., Majumdar, A.: DSOS and SDSOS optimization: more tractable alternatives to sum of squares and semidefinite optimization. SIAM J. Appl. Algebra Geom. 3, 193–230 (2019)

  3. Ahmadi, A.A., Parrilo, P.A.: A complete characterization of the gap between convexity and SOS-convexity. SIAM J. Optim. 23(2), 811–833 (2013)

  4. Belousov, E.G., Klatte, D.: A Frank–Wolfe type theorem for convex polynomial programs. Comput. Optim. Appl. 22(1), 37–48 (2002)

  5. Bertsimas, D., Freund, R.M., Sun, X.A.: An accelerated first-order method for solving SOS relaxations of unconstrained polynomial optimization problems. Optim. Methods Softw. 28, 424–441 (2013)

  6. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)

  7. Chuong, T.D., Jeyakumar, V.: Convergent conic linear programming relaxations for cone convex polynomial programs. Oper. Res. Lett. 45(3), 220–226 (2017)

  8. Chuong, T.D., Jeyakumar, V.: Generalized Lagrangian duality for nonconvex polynomial programs with polynomial multipliers. J. Global Optim. 72(4), 655–678 (2018)

  9. D’Angelo, P., Putinar, M.: Polynomial Optimization on Odd-Dimensional Spheres, in Emerging Applications of Algebraic Geometry. Springer, New York (2008)

  10. Fidalgo, C., Kovacec, A.: Positive semidefinite diagonal minus tail forms are sums of squares. Math. Z. 269, 629–645 (2011)

  11. Floudas, C.A., Pardalos, P.M., Adjiman, C.S., Esposito, W.R., Gumus, Z.H., Harding, S.T., Klepeis, J.L., Meyer, C.A., Schweiger, C.A.: Handbook of Test Problems in Local and Global Optimization. Kluwer Academic Publishers, Dordrecht (1999)

  12. Ghaddar, B., Vera, J.C., Anjos, M.F.: A dynamic inequality generation scheme for polynomial programming. Math. Program. 156, 21–57 (2016)

  13. Ghasemi, M., Marshall, M.: Lower bounds for polynomials using geometric programming. SIAM J. Optim. 22(2), 460–473 (2012)

  14. Henrion, D., Lasserre, J.B., Loefberg, J.: GloptiPoly 3: moments, optimization and semidefinite programming. Optim. Methods Softw. 24, 761–779 (2009)

  15. Horn, R., Johnson, C.R.: Matrix Analysis, 2nd edn, p. xviii+643. Cambridge University Press, Cambridge (2013)

  16. Hu, S., Li, G., Qi, L.: A tensor analogy of Yuan’s theorem of the alternative and polynomial optimization with sign structure. J. Optim. Theory Appl. 168(2), 446–474 (2016)

  17. Helton, J.W., Nie, J.W.: Semidefinite representation of convex sets. Math. Program. 122, 21–64 (2010)

  18. Jeyakumar, V.: Constraint qualifications characterizing Lagrangian duality in convex optimization. J. Optim. Theory Appl. 136(1), 31–41 (2008)

  19. Jeyakumar, V., Lee, G.M., Li, G.: Alternative theorems for quadratic inequality systems and global quadratic optimization. SIAM J. Optim. 2, 667–690 (2009)

  20. Jeyakumar, V., Li, G.: Exact conic programming relaxations for a class of convex polynomial cone programs. J. Optim. Theory Appl. 172(1), 156–178 (2017)

  21. Jeyakumar, V., Kim, S., Lee, G.M., Li, G.: Solving global optimization problems with sparse polynomials and unbounded semialgebraic feasible sets. J. Global Optim. 65, 175–190 (2016)

  22. Josz, C., Molzahn, D.K.: Lasserre hierarchy for large scale polynomial optimization in real and complex variables. SIAM J. Optim. 28, 1017–1048 (2018)

  23. Krivine, J.L.: Anneaux préordonnés. J. Anal. Math. 12, 307–326 (1964)

  24. Kim, S., Kojima, M.: Exact solutions of some nonconvex quadratic optimization problems via SDP and SOCP relaxations. Comput. Optim. Appl. 26(2), 143–154 (2003)

  25. Kuang, X., Ghaddar, B., Naoum-Sawaya, J., Zuluaga, L.F.: Alternative SDP and SOCP approximations for polynomial optimization. Eur. J. Comput. Optim. 7, 153–175 (2019)

  26. Lasserre, J.B.: A Lagrangian relaxation view of linear and semidefinite hierarchies. SIAM J. Optim. 23(3), 1742–1756 (2013)

  27. Lasserre, J.B.: Moments, Positive Polynomials and Their Applications. World Scientific, Singapore (2010)

  28. Lasserre, J.B.: Representation of nonnegative convex polynomials. Arch. Math. 91, 126–130 (2008)

  29. Laurent, M.: Sums of squares, moment matrices and optimization over polynomials. In: Putinar, M., Sullivant, S. (eds.) Emerging Applications of Algebraic Geometry, IMA Volumes in Mathematics and its Applications, vol. 149, pp. 157–270. Springer, Berlin (2009)

  30. Lasserre, J.B., Toh, K.C., Yang, S.: A bounded degree SOS hierarchy for polynomial optimization. Eur. J. Comput. Optim. 5, 87–117 (2017)

  31. Mordukhovich, B.S., Nam, N.M.: An Easy Path to Convex Analysis and Applications, Synthesis Lectures on Mathematics and Statistics, 14. Morgan & Claypool Publishers, Williston (2014)

  32. Megretski, A.: SPOT (Systems polynomial optimization tools) Manual, 2010, http://web.mit.edu/ameg/www/images/spot_manual.pdf

  33. Nie, J.W.: Polynomial matrix inequality and semidefinite representation. Math. Oper. Res. 36, 398–415 (2011)

  34. Nie, J.W., Wang, L.: Regularization methods for SDP relaxations in large-scale polynomial optimization. SIAM J. Optim. 22, 408–428 (2012)

  35. Parrilo, P.A.: Semidefinite programming relaxations for semialgebraic problems. Math. Program. 96, 293–320 (2003)

  36. Shapiro, A.: First and second order analysis of nonlinear semidefinite programs. Math. Program. 77, 301–320 (1997)

  37. Waki, H., Kim, S., Kojima, M., Muramatsu, M.: Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity. SIAM J. Optim. 17, 218–242 (2006)

  38. Weisser, T., Lasserre, J., Toh, K.: Sparse-BSOS: a bounded degree SOS hierarchy for large scale polynomial optimization with sparsity. Math. Program. Comput. 5, 1–32 (2017)

Acknowledgements

The authors would like to thank the referees for their valuable comments and suggestions which greatly improved the original version of the paper.

Author information

Corresponding author

Correspondence to V. Jeyakumar.

Additional information

Research was supported by a grant from the Australian Research Council.

Appendix: Test problems

  • TP1 ([11, Test problem 3, page 24]): Consider the following nonconvex polynomial problem:

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{6}}\ \ & -25(x_1-2)^2-(x_2-2)^2-(x_3-1)^2-(x_4-4)^2-(x_5-1)^2-(x_6-4)^2\\
    \text {s.t.}\ \ & (x_3-3)^2+x_4\ge 4,\quad (x_5-3)^2+x_6\ge 4,\\
    & x_1-3x_2\le 2,\quad -x_1+x_2\le 2,\quad x_1+x_2\le 6,\quad x_1+x_2\ge 2,\\
    & x_1\ge 0,\ x_2\ge 0,\ 1\le x_3\le 5,\ 0\le x_4\le 6,\ 1\le x_5\le 5,\ 0\le x_6\le 10. \end{aligned}$$

    In [11, Test problem 3, page 24], it is shown that the optimal value is \(-310\) and is attained at \(x^*=(5,1,5,0,5,10)^T\).
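
    As an independent sanity check (a local-solver sketch assuming SciPy is available, not the SOCP relaxation studied in the paper), the reported optimizer can be verified as follows: evaluating the objective at \(x^*\) gives \(-310\), and running SLSQP from \(x^*\) confirms feasibility and stationarity there.

      # Sketch: local check of TP1 with SciPy's SLSQP (not the paper's SOCP hierarchy).
      import numpy as np
      from scipy.optimize import minimize

      def f(x):  # objective of TP1
          return (-25*(x[0]-2)**2 - (x[1]-2)**2 - (x[2]-1)**2
                  - (x[3]-4)**2 - (x[4]-1)**2 - (x[5]-4)**2)

      cons = [  # inequality constraints rewritten as g(x) >= 0
          {'type': 'ineq', 'fun': lambda x: (x[2]-3)**2 + x[3] - 4},
          {'type': 'ineq', 'fun': lambda x: (x[4]-3)**2 + x[5] - 4},
          {'type': 'ineq', 'fun': lambda x: 2 - x[0] + 3*x[1]},
          {'type': 'ineq', 'fun': lambda x: 2 + x[0] - x[1]},
          {'type': 'ineq', 'fun': lambda x: 6 - x[0] - x[1]},
          {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 2},
      ]
      bnds = [(0, None), (0, None), (1, 5), (0, 6), (1, 5), (0, 10)]

      x_star = np.array([5.0, 1.0, 5.0, 0.0, 5.0, 10.0])
      print(f(x_star))  # -310.0, the value reported in [11]
      res = minimize(f, x_star, bounds=bnds, constraints=cons, method='SLSQP')
      print(res.x, res.fun)  # should remain at (or very near) x_star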

  • TP2 ([11, Test problem 3, page 7]): Consider the following nonconvex polynomial problem:

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{13}}\ \ & 50x_1+50x_2+50x_3+50x_4-\sum _{i=1}^{4}50x_i^2-\sum _{i=5}^{13}x_i\\
    \text {s.t.}\ \ & 2x_1+2x_2+x_{10}+x_{11}\le 10,\quad 2x_1+2x_3+x_{10}+x_{12}\le 10,\quad 2x_2+2x_3+x_{11}+x_{12}\le 10,\\
    & -8x_1+x_{10}\le 0,\quad -8x_2+x_{11}\le 0,\quad -8x_3+x_{12}\le 0,\\
    & -2x_4-x_5+x_{10}\le 0,\quad -2x_6-x_7+x_{11}\le 0,\quad -2x_8-x_9+x_{12}\le 0,\\
    & 0\le x_i\le 3,\ i=10,11,12,\quad 0\le x_i\le 1,\ i=1,\ldots ,9,13. \end{aligned}$$

    In [11, Test problem 3, page 7], it is shown that the optimal value is \(-15\) and is attained at \(x^*=(1,1,1, 1,1,1,1,1,1,3,3,3,1)^T\).
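
    As a quick arithmetic check of the reported value: at \(x^*\) the first four coordinates equal 1, so each term \(50x_i-50x_i^2\) vanishes, while \(\sum _{i=5}^{13}x_i^*=5\cdot 1+3\cdot 3+1=15\); hence the objective value is

    $$\begin{aligned} 50\cdot 4-50\cdot 4-15=-15. \end{aligned}$$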

  • TP3 ([30, Example P4-4, page 109]): Consider the following nonconvex polynomial problem:

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{4}}\ \ & x_1^4-x_2^4+x_3^4-x_4^4\\
    \text {s.t.}\ \ & 0\le 2x_1^4+3x_2^2+2x_1x_2+2x_3^4+3x_4^2+2x_3x_4\le 1,\\
    & 0\le 3x_1^2+2x_2^2-4x_1x_2+3x_3^2+2x_4^2-4x_3x_4\le 1,\\
    & 0\le x_1^2+6x_2^2-4x_1x_2+x_3^2+6x_4^2-4x_3x_4\le 1,\\
    & 0\le x_1^2+4x_2^4-3x_1x_2+x_3^2+4x_4^4-3x_3x_4\le 1,\\
    & 0\le 2x_1^2+5x_2^2+3x_1x_2+2x_3^2+5x_4^2+3x_3x_4\le 1,\\
    & x\in [0,1]^4. \end{aligned}$$

    In [30, Example P4-4, page 109], it is shown that the optimal value is \(-0.033539\).
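
    This problem and TP4–TP6 below share the same structure: a box constraint and five two-sided polynomial inequalities. The following sketch (SciPy-based, a heuristic multistart local search that only produces an upper bound on the global value, and not the relaxation studied in the paper) can be used to cross-check the reported value; TP4–TP6 differ only in the number of variable pairs and in some exponents.

      # Sketch: multistart local search for TP3 (heuristic cross-check only).
      import numpy as np
      from scipy.optimize import minimize

      def obj(x):
          return x[0]**4 - x[1]**4 + x[2]**4 - x[3]**4

      # each g below must satisfy 0 <= g(x) <= 1
      gs = [
          lambda x: 2*x[0]**4 + 3*x[1]**2 + 2*x[0]*x[1] + 2*x[2]**4 + 3*x[3]**2 + 2*x[2]*x[3],
          lambda x: 3*x[0]**2 + 2*x[1]**2 - 4*x[0]*x[1] + 3*x[2]**2 + 2*x[3]**2 - 4*x[2]*x[3],
          lambda x: x[0]**2 + 6*x[1]**2 - 4*x[0]*x[1] + x[2]**2 + 6*x[3]**2 - 4*x[2]*x[3],
          lambda x: x[0]**2 + 4*x[1]**4 - 3*x[0]*x[1] + x[2]**2 + 4*x[3]**4 - 3*x[2]*x[3],
          lambda x: 2*x[0]**2 + 5*x[1]**2 + 3*x[0]*x[1] + 2*x[2]**2 + 5*x[3]**2 + 3*x[2]*x[3],
      ]
      cons = []
      for g in gs:
          cons.append({'type': 'ineq', 'fun': lambda x, g=g: g(x)})        # g(x) >= 0
          cons.append({'type': 'ineq', 'fun': lambda x, g=g: 1.0 - g(x)})  # g(x) <= 1

      rng = np.random.default_rng(0)
      results = [minimize(obj, rng.uniform(0.0, 1.0, 4), bounds=[(0.0, 1.0)]*4,
                          constraints=cons, method='SLSQP') for _ in range(50)]
      best = min([r for r in results if r.success], key=lambda r: r.fun)
      print(best.fun)  # compare with the reported optimal value -0.033539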

  • TP4 ([30, Example P8-4, page 111]): Consider the following nonconvex polynomial problem:

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{8}}\ \ & x_1^4-x_2^4+x_3^4-x_4^4+x_5^4-x_6^4+x_7^4-x_8^4+x_1-x_2\\
    \text {s.t.}\ \ & 0\le 2x_1^4+3x_2^2+2x_1x_2+2x_3^4+3x_4^2+2x_3x_4+2x_5^4+3x_6^2+2x_5x_6+2x_7^4+3x_8^2+2x_7x_8\le 1,\\
    & 0\le 3x_1^2+2x_2^2-4x_1x_2+3x_3^2+2x_4^2-4x_3x_4+3x_5^2+2x_6^2-4x_5x_6+3x_7^2+2x_8^2-4x_7x_8\le 1,\\
    & 0\le x_1^2+6x_2^2-4x_1x_2+x_3^2+6x_4^2-4x_3x_4+x_5^2+6x_6^2-4x_5x_6+x_7^2+6x_8^2-4x_7x_8\le 1,\\
    & 0\le x_1^2+4x_2^4-3x_1x_2+x_3^2+4x_4^4-3x_3x_4+x_5^2+4x_6^4-3x_5x_6+x_7^2+4x_8^4-3x_7x_8\le 1,\\
    & 0\le 2x_1^2+5x_2^2+3x_1x_2+2x_3^2+5x_4^2+3x_3x_4+2x_5^2+5x_6^2+3x_5x_6+2x_7^2+5x_8^2+3x_7x_8\le 1,\\
    & x\in [0,1]^8. \end{aligned}$$

    In [30, Example P8-4, page 111], it is shown that the optimal value is \(-0.43603\).

  • TP5 ([30, Example P20-4, page 113]): Consider the following nonconvex polynomial problem, written here compactly over the ten consecutive variable pairs \((x_{2i-1},x_{2i})\), \(i=1,\ldots ,10\):

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{20}}\ \ & x_1^4-x_2^4+\sum _{i=2}^{10}\bigl (x_{2i-1}^2-x_{2i}^2\bigr )+x_1-x_2\\
    \text {s.t.}\ \ & 0\le \sum _{i=1}^{10}\bigl (2x_{2i-1}^2+3x_{2i}^2+2x_{2i-1}x_{2i}\bigr )\le 1,\\
    & 0\le \sum _{i=1}^{10}\bigl (3x_{2i-1}^2+2x_{2i}^2-4x_{2i-1}x_{2i}\bigr )\le 1,\\
    & 0\le \sum _{i=1}^{10}\bigl (x_{2i-1}^2+6x_{2i}^2-4x_{2i-1}x_{2i}\bigr )\le 1,\\
    & 0\le \sum _{i=1}^{10}\bigl (x_{2i-1}^2+4x_{2i}^2-3x_{2i-1}x_{2i}\bigr )\le 1,\\
    & 0\le \sum _{i=1}^{10}\bigl (2x_{2i-1}^2+5x_{2i}^2+3x_{2i-1}x_{2i}\bigr )\le 1,\\
    & x\in [0,1]^{20}. \end{aligned}$$

    In [30, Example P20-4, page 113], it is shown that the optimal value is \(-0.43603\).

  • TP6 ([30, Example P6-6, page 110]): Consider the following nonconvex polynomial problem:

    $$\begin{aligned} \min _{x\in {\mathbb {R}}^{6}}\ \ & x_1^6-x_2^6+x_3^6-x_4^6+x_5^6-x_6^6+x_1-x_2\\
    \text {s.t.}\ \ & 0\le 2x_1^6+3x_2^2+2x_1x_2+2x_3^6+3x_4^2+2x_3x_4+2x_5^6+3x_6^2+2x_5x_6\le 1,\\
    & 0\le 3x_1^2+2x_2^2-4x_1x_2+3x_3^2+2x_4^2-4x_3x_4+3x_5^2+2x_6^2-4x_5x_6\le 1,\\
    & 0\le x_1^2+6x_2^2-4x_1x_2+x_3^2+6x_4^2-4x_3x_4+x_5^2+6x_6^2-4x_5x_6\le 1,\\
    & 0\le x_1^2+4x_2^6-3x_1x_2+x_3^2+4x_4^6-3x_3x_4+x_5^2+4x_6^6-3x_5x_6\le 1,\\
    & 0\le 2x_1^2+5x_2^2+3x_1x_2+2x_3^2+5x_4^2+3x_3x_4+2x_5^2+5x_6^2+3x_5x_6\le 1,\\
    & x\in [0,1]^6. \end{aligned}$$

    In [30, Example P6-6, page 110], it is shown that the optimal value is \(-0.41288\).


Cite this article

Chuong, T.D., Jeyakumar, V. & Li, G. A new bounded degree hierarchy with SOCP relaxations for global polynomial optimization and conic convex semi-algebraic programs. J Glob Optim 75, 885–919 (2019). https://doi.org/10.1007/s10898-019-00831-9

