Polynomials and tensors are intimately related. An order-d symmetric tensor over an n-dimensional vector space may be viewed as a homogeneous polynomial of degree d in n variables. More generally, an order-d tensor over an n-dimensional vector space is equivalent to a homogeneous polynomial of degree d in n noncommutative variables. This special issue was created with the intention of exploring polynomial optimization, tensor approximations, and their interconnections. Targeted topics include semidefinite programming relaxations, nonnegative polynomials, truncated moments, sparse polynomials, and various algorithms and complexity issues related to tensor problems. In the following we present a digest of the ten excellent articles included in this special issue.
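To make this identification concrete (a standard correspondence, stated here in generic notation rather than that of any particular article): a symmetric tensor \(A = (a_{i_1 \cdots i_d})\) over \(\mathbb{R}^n\) is identified with the homogeneous polynomial
\[
p_A(x) \;=\; \sum_{i_1, \dots, i_d = 1}^{n} a_{i_1 \cdots i_d}\, x_{i_1} x_{i_2} \cdots x_{i_d}, \qquad x = (x_1, \dots, x_n),
\]
and dropping the symmetry requirement while treating \(x_1, \dots, x_n\) as noncommuting variables yields the analogous identification for general order-d tensors.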
Doubly nonnegative (DNN) relaxations for binary and box-constrained problems are common in polynomial optimization. Both dense and sparse DNN relaxations can be reduced to conic optimization problems, which can in turn be solved efficiently with bisection-projection-type algorithms.
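For concreteness, recall the standard definition (not specific to the article in question): the doubly nonnegative cone consists of symmetric matrices that are simultaneously positive semidefinite and entrywise nonnegative,
\[
\mathcal{DNN}^n \;=\; \{\, X \in \mathbb{S}^n \;:\; X \succeq 0, \ X_{ij} \ge 0 \ \text{for all } i, j \,\},
\]
and a DNN relaxation replaces an intractable conic constraint, such as complete positivity arising from binary or box constraints, by membership in this larger but tractable cone.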
When the feasible set is the unit sphere, one may prove convergent upper bounds for the minimum value, obtained from moment relaxations whose probability distributions have sum-of-squares densities. It is important to estimate the convergence rate of such relaxation hierarchies, as it sheds light on the convergence rates of various relaxation methods for generalized moment problems.
For polynomial optimization over more general compact sets, one may similarly bound the minimum value using moment relaxations with probability measures that are products of sum-of-squares densities and reference measures supported on the given sets. By bounding the degree of the sum-of-squares polynomials, one obtains a hierarchy of upper bounds for the minimum value. When the feasible set is the hypercube, one may establish a convergence rate of \(O(1/r^2)\), with \(r\) the degree parameter, a result that also extends to certain other feasible sets.
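To fix notation (a generic formulation of such upper bounds, not necessarily that used in the articles): for a polynomial \(f\), a compact set \(K\), and a reference measure \(\mu\) on \(K\), the order-\(r\) upper bound is
\[
\overline{f}_r \;=\; \min_{\sigma \in \Sigma[x]_{2r}} \ \int_K f \, \sigma \, d\mu \quad \text{subject to} \quad \int_K \sigma \, d\mu = 1,
\]
where \(\Sigma[x]_{2r}\) denotes sums of squares of degree at most \(2r\). Each \(\overline{f}_r\) is an upper bound on the minimum of \(f\) over \(K\), and the convergence rates mentioned above quantify how quickly \(\overline{f}_r\) approaches this minimum as \(r\) grows.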
Exploiting sparsity is important for efficient relaxation methods. Sparse moment-SOS relaxations can be obtained for noncommutative polynomial optimization, analogous to those for commutative polynomial optimization.
Quadratically constrained quadratic programs (QCQPs) form a specific class of polynomial optimization problems that may be solved approximately with standard semidefinite programming relaxations, i.e., moment-SOS relaxations of degree two. It is of interest to identify which classes of QCQPs are solved exactly by this procedure, and this can be done using local stability analysis.
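As a reminder of the construction involved (the standard degree-two relaxation, written in generic notation): a QCQP
\[
\min_{x \in \mathbb{R}^n} \ x^{\mathsf T} A_0 x + 2 b_0^{\mathsf T} x + c_0 \quad \text{s.t.} \quad x^{\mathsf T} A_i x + 2 b_i^{\mathsf T} x + c_i \le 0, \quad i = 1, \dots, m,
\]
is relaxed by introducing a matrix variable \(X\) in place of \(x x^{\mathsf T}\), yielding the semidefinite program
\[
\min_{x, X} \ \langle A_0, X \rangle + 2 b_0^{\mathsf T} x + c_0 \quad \text{s.t.} \quad \langle A_i, X \rangle + 2 b_i^{\mathsf T} x + c_i \le 0, \quad
\begin{pmatrix} 1 & x^{\mathsf T} \\ x & X \end{pmatrix} \succeq 0.
\]
The relaxation is exact precisely when this program has an optimal solution with \(X = x x^{\mathsf T}\).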
Quantum states can be represented by positive semidefinite Hermitian matrices of unit trace. It is an important problem to decide whether a quantum state is separable or entangled. It turns out that this can be formulated as a truncated moment problem. One may then investigate necessary and sufficient conditions for membership in truncated moment cones. These conditions are expressed as linear matrix inequalities that can be derived by examining the dual cone of nonnegative polynomials.
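For context, recall the standard definition (not a result of the article): a state \(\rho\) on \(\mathbb{C}^{m} \otimes \mathbb{C}^{n}\) is separable if it can be written as
\[
\rho \;=\; \sum_{i} p_i \, \rho_i^{A} \otimes \rho_i^{B}, \qquad p_i \ge 0, \quad \sum_i p_i = 1,
\]
where each \(\rho_i^{A}\) and \(\rho_i^{B}\) is a state of the corresponding subsystem, and entangled otherwise; deciding whether such a decomposition exists is what is recast as a truncated moment problem.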
Polynomials can be multiplied, evaluated, differentiated, and integrated; they have moment-SOS hierarchies and satisfy various Positivstellensätze. These are concrete techniques that can be brought to bear on tensors when we represent them as polynomials. Indeed, many tensor problems related to decomposition, approximation, or completion may be formulated as nonconvex polynomial optimization problems and solved using sum-of-squares relaxations. Nevertheless, for large-scale tensor problems, such rigorous algorithms may be intractable, and local search methods such as stochastic gradient descent may be more practical. In such cases, it is important to understand the mathematics behind these problems, as naive application of local optimization methods to NP-hard problems invariably leads to poor results. This is where one brings in modern methodologies such as optimization landscape analysis, an important development in machine learning, or smoothed analysis, a powerful technique for overcoming worst-case intractability.
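A prototypical example of such a formulation (a generic low-rank approximation problem, stated only for illustration): given an order-three tensor \(T\) and a target rank \(r\), one seeks
\[
\min_{\lambda_i, \, u_i, \, v_i, \, w_i} \ \Bigl\| \, T - \sum_{i=1}^{r} \lambda_i \, u_i \otimes v_i \otimes w_i \, \Bigr\|^2,
\]
a nonconvex polynomial optimization problem to which either sum-of-squares relaxations or, at larger scales, local search methods with the caveats just described may be applied.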
We would like to thank the referees of these articles for their services on our behalf.