Polynomial SOS

{{about|representing a non-negative polynomial as sum of squares of polynomials|representing a polynomial as a sum of squares of rational functions|Hilbert's seventeenth problem|the sum of squares of consecutive integers|square pyramidal number|representing an integer as a sum of squares of integers|Lagrange's four-square theorem}}

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree m such that

h(x) = \sum_{i=1}^k g_i(x)^2 .
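
For instance (a simple illustration not taken from the cited sources), the form of degree 2 in two variables

x_1^2 - 2x_1x_2 + 2x_2^2 = \left(x_1-x_2\right)^2 + x_2^2

is SOS with g_1(x)=x_1-x_2 and g_2(x)=x_2.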

Every form that is SOS is also non-negative, and although the converse is not always true, Hilbert proved that for n = 2, for 2m = 2, and for n = 3 and 2m = 4, a form is SOS if and only if it is non-negative.{{cite journal|last1=Hilbert|first1=David|title=Ueber die Darstellung definiter Formen als Summe von Formenquadraten|journal=Mathematische Annalen|date=September 1888|volume=32|issue=3|pages=342–350|doi=10.1007/bf01443605|s2cid=177804714 |url=https://zenodo.org/record/1428214}} The same is also valid for the analog problem on positive symmetric forms.{{cite journal|last1=Choi|first1=M. D.|last2=Lam|first2=T. Y.|title=An old question of Hilbert|journal=Queen's Papers in Pure and Applied Mathematics|date=1977|volume=46|pages=385–405}}{{cite journal|last1=Goel|first1=Charu|last2=Kuhlmann|first2=Salma|last3=Reznick|first3=Bruce|title=On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms|journal=Linear Algebra and Its Applications|date=May 2016|volume=496|pages=114–120|doi=10.1016/j.laa.2016.01.024|arxiv=1505.08145|s2cid=17579200 |author3-link=Bruce Reznick}}

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.{{cite journal|last1=Lasserre|first1=Jean B.|title=Sufficient conditions for a real polynomial to be a sum of squares| journal=Archiv der Mathematik|volume=89|issue=5|pages=390–398|doi=10.1007/s00013-007-2251-y|url=http://www.optimization-online.org/DB_HTML/2007/02/1587.html|arxiv=math/0612358|year=2007|citeseerx=10.1.1.240.4438|s2cid=9319455 }}{{cite journal| last1=Powers|first1=Victoria|author1-link=Victoria Powers|last2=Wörmann|first2=Thorsten|title=An algorithm for sums of squares of real polynomials|journal=Journal of Pure and Applied Algebra|date=1998|volume=127|issue=1|pages=99–104| doi=10.1016/S0022-4049(97)83827-3| url=http://www.mathcs.emory.edu/~vicki/pub/sos.pdf|doi-access=free}} Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms \{f_\epsilon\} that are SOS.{{cite journal|last1=Lasserre|first1=Jean B.|title=A Sum of Squares Approximation of Nonnegative Polynomials|journal=SIAM Review|date=2007|volume=49|issue=4|pages=651–669|doi=10.1137/070693709|arxiv=math/0412398|bibcode=2007SIAMR..49..651L}}

Square matricial representation (SMR)

Establishing whether a form {{math|h(x)}} is SOS amounts to solving a convex optimization problem. Indeed, any {{math|h(x)}} can be written as

h(x) = x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}}

where x^{\{m\}} is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying

h(x) = x^{\left\{m\right\}'}Hx^{\{m\}}

and L(\alpha) is a linear parameterization of the linear space

\mathcal{L} = \left\{L=L':~x^{\{m\}'} L x^{\{m\}}=0\right\}.

The dimension of the vector x^{\{m\}} is given by

\sigma(n,m) = \binom{n+m-1}{m},

whereas the dimension of the vector \alpha is given by

\omega(n,2m) = \frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).
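
These dimensions can be computed directly; the following is a small illustrative snippet (using only Python's standard library), with function names sigma and omega chosen to mirror the notation above:

<syntaxhighlight lang="python">
from math import comb

def sigma(n, m):
    """Number of monomials of degree m in n variables."""
    return comb(n + m - 1, m)

def omega(n, two_m):
    """Dimension of the parameter vector alpha in the scalar SMR."""
    m = two_m // 2
    s = sigma(n, m)
    return s * (1 + s) // 2 - sigma(n, two_m)

print(sigma(2, 2), omega(2, 4))  # 3 1 -> first example below has one parameter alpha_1
print(sigma(3, 2), omega(3, 4))  # 6 6 -> second example below has alpha_1,...,alpha_6
</syntaxhighlight>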

Then, {{math|h(x)}} is SOS if and only if there exists a vector \alpha such that

H + L(\alpha) \ge 0,

meaning that the matrix H + L(\alpha) is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}} was introduced in {{cite conference |url=https://ieeexplore.ieee.org/document/7099515 |title=On convexification of some minimum distance problems |last1=Chesi |first1=G. |last2=Tesi |first2=A. |last3=Vicino |first3=A. |last4=Genesio |first4=R. |date=1999 |publisher=IEEE |book-title=Proceedings of the 5th European Control Conference |pages=1446–1451 |location=Karlsruhe, Germany}} with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.{{cite conference |title=Sums of squares of real polynomials |last1=Choi |first1=M. |last2=Lam |first2=T. |last3=Reznick |first3=B. |date=1995 |book-title=Proceedings of Symposia in Pure Mathematics |pages=103–125 |url=https://www.researchgate.net/publication/240268385}}
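
As an illustration, here is a minimal sketch of this LMI feasibility test in Python, applied to the first example below; it assumes the cvxpy package with an SDP-capable solver (such as the bundled SCS) is available:

<syntaxhighlight lang="python">
import cvxpy as cp

# SOS test for h(x) = x1^4 - x1^2*x2^2 + x2^4 with basis x^{m} = (x1^2, x1*x2, x2^2)'.
# Here omega(2, 4) = 1, so L(alpha) has the single free parameter alpha_1.
a = cp.Variable()
M = cp.bmat([[1,         0, -a],
             [0, 2 * a - 1,  0],
             [-a,        0,  1]])                 # H + L(alpha)
prob = cp.Problem(cp.Minimize(0), [M >> 0])       # LMI feasibility posed as an SDP
prob.solve()
print(prob.status, a.value)  # status 'optimal' certifies that h(x) is SOS
</syntaxhighlight>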

= Examples =

  • Consider the form of degree 4 in two variables h(x)=x_1^4-x_1^2x_2^2+x_2^4. We have m = 2,~x^{\{m\}} = \begin{pmatrix} x_1^2\\x_1x_2\\x_2^2\end{pmatrix}\!,~H+L(\alpha) = \begin{pmatrix} 1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1 \end{pmatrix}\!. Since there exists α such that H+L(\alpha)\ge 0, namely \alpha_1=1, it follows that h(x) is SOS; an explicit decomposition is given after this list.

  • Consider the form of degree 4 in three variables h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4. We have m=2,~x^{\{m\}}=\begin{pmatrix}x_1^2\\x_1x_2\\x_1x_3\\x_2^2\\x_2x_3\\x_3^2\end{pmatrix},~H+L(\alpha) = \begin{pmatrix} 2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\ -1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\ 0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\ -\alpha_1&0&\alpha_4&5&0&-\alpha_6\\ -\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\ -\alpha_3&-\alpha_5&-1&-\alpha_6&0&1 \end{pmatrix}. Since H+L(\alpha)\ge 0 for \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57), it follows that {{math|h(x)}} is SOS; a numerical check is sketched after this list.
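
In the first example, factorizing the positive-semidefinite matrix H+L(\alpha) at \alpha_1=1 (for instance via an eigendecomposition) yields the explicit decomposition

h(x) = \left(x_1^2-x_2^2\right)^2 + \left(x_1x_2\right)^2,

which can be verified by expanding the squares.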
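For the second example, the claimed feasibility can be checked numerically; the following is a minimal sketch assuming NumPy is available, with the matrix entries taken from the example above:

<syntaxhighlight lang="python">
import numpy as np

# H + L(alpha) from the second example, evaluated at the given alpha.
a1, a2, a3, a4, a5, a6 = 1.18, -0.43, 0.73, 1.13, -0.37, 0.57
M = np.array([
    [2,    -1.25,     0,        -a1, -a2,    -a3],
    [-1.25, 2 * a1,   0.5 + a2,  0,  -a4,    -a5],
    [0,     0.5 + a2, 2 * a3,    a4,  a5,    -1],
    [-a1,   0,        a4,        5,   0,     -a6],
    [-a2,  -a4,       a5,        0,   2 * a6, 0],
    [-a3,  -a5,      -1,        -a6,  0,      1],
])
eig = np.linalg.eigvalsh(M)
print(eig)                   # eigenvalues, expected nonnegative up to round-off
print(eig.min() >= -1e-9)    # True -> H + L(alpha) >= 0, hence h(x) is SOS
</syntaxhighlight>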

Generalizations

= Matrix SOS =

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms G_1(x),\ldots,G_k(x) of degree m such that

F(x)=\sum_{i=1}^k G_i(x)'G_i(x) .
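
For instance (an illustrative case not taken from the references), for r = 2, n = 2 and m = 1, choosing k = 1 and G_1(x)=\begin{pmatrix}x_1&x_2\\0&x_2\end{pmatrix} yields the SOS matrix form

F(x) = G_1(x)'G_1(x) = \begin{pmatrix}x_1^2&x_1x_2\\x_1x_2&2x_2^2\end{pmatrix}.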

== Matrix SMR ==

Establishing whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as

F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)

where \otimes is the Kronecker product of matrices, H is any symmetric matrix satisfying

F(x) = \left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)

and L(\alpha) is a linear parameterization of the linear space

\mathcal{L}=\left\{L=L':~\left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.

The dimension of the vector \alpha is given by

\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).

Then, {{math|F(x)}} is SOS if and only if there exists a vector \alpha such that the following LMI holds:

H+L(\alpha) \ge 0.

The expression F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) was introduced in {{cite conference |title=Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions |last1=Chesi |first1=G. |last2=Garulli |first2=A. |last3=Tesi |first3=A. |last4=Vicino |first4=A. |date=2003 |publisher=IEEE |book-title=Proceedings of the 42nd IEEE Conference on Decision and Control |pages=4670–4675 |location=Maui, Hawaii | doi=10.1109/CDC.2003.1272307 }} in order to establish whether a matrix form is SOS via an LMI.
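
This Kronecker-product structure can be illustrated numerically; the following is a minimal NumPy sketch for r = 2, n = 2 and m = 1, with all data chosen for illustration only (an SOS matrix form is built by construction from a factor G):

<syntaxhighlight lang="python">
import numpy as np

# Matrix SMR sketch for r = 2, n = 2, m = 1, so x^{m} = (x1, x2)' and
# (x^{m} kron I_r) has shape (4, 2).
rng = np.random.default_rng(0)
G = rng.standard_normal((4, 4))
H = G.T @ G                               # H >= 0, so F(x) is SOS by construction
x = np.array([0.7, -1.3])                 # a sample point (x1, x2)
B = np.kron(x.reshape(-1, 1), np.eye(2))  # x^{m} (kron) I_r
F = B.T @ H @ B                           # F(x) evaluated at the sample point
print(np.linalg.eigvalsh(F))              # nonnegative: F(x) >= 0 at this point
</syntaxhighlight>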

= Noncommutative polynomial SOS =

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn.

By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form {{math|1=f = fT}}. A symmetric noncommutative polynomial f is said to be matrix-positive if, whenever it is evaluated at any tuple of real symmetric matrices of any dimension r × r, the result is a positive semidefinite matrix.

A noncommutative polynomial is SOS if there exist noncommutative polynomials h_1,\ldots,h_k such that

f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.{{cite journal|last1=Helton|first1=J. William|title="Positive" Noncommutative Polynomials Are Sums of Squares|journal=The Annals of Mathematics|date=September 2002|volume=156|issue=2|pages=675–694|doi=10.2307/3597203|jstor=3597203}} Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.{{cite journal | last1=Burgdorf|first1=Sabine|last2=Cafuta|first2=Kristijan|last3=Klep|first3=Igor|last4=Povh|first4=Janez|title=Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials|journal=Computational Optimization and Applications|date=25 October 2012|volume=55|issue=1|pages=137–153|doi=10.1007/s10589-012-9513-8|citeseerx=10.1.1.416.543|s2cid=254416733 }}
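
The "SOS implies matrix-positive" direction can be illustrated numerically; the following is a minimal sketch assuming NumPy, with a hypothetical choice of SOS polynomial f = h1T h1 + h2T h2 where h1 = X1 and h2 = X1X2:

<syntaxhighlight lang="python">
import numpy as np

# Evaluate f = h1^T h1 + h2^T h2 with h1 = X1 and h2 = X1 X2 at random
# symmetric 3x3 matrices.  The involution T fixes each letter and reverses
# words, e.g. (X1 X2)^T = X2 X1, which matches the matrix transpose when
# the letters are evaluated at symmetric matrices.
rng = np.random.default_rng(1)
A1 = rng.standard_normal((3, 3))
A1 = (A1 + A1.T) / 2                          # symmetric evaluation of X1
A2 = rng.standard_normal((3, 3))
A2 = (A2 + A2.T) / 2                          # symmetric evaluation of X2
f_val = A1.T @ A1 + (A1 @ A2).T @ (A1 @ A2)   # f evaluated at (A1, A2)
print(np.linalg.eigvalsh(f_val))              # nonnegative eigenvalues: PSD
</syntaxhighlight>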

References

{{Reflist}}
