Polynomial SOS
{{about|representing a non-negative polynomial as sum of squares of polynomials|representing a polynomial as a sum of squares of rational functions|Hilbert's seventeenth problem|the sum of squares of consecutive integers|square pyramidal number|representing an integer as a sum of squares of integers|Lagrange's four-square theorem}}
In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms <math>g_1(x), \ldots, g_k(x)</math> of degree m such that
<math>h(x) = \sum_{i=1}^{k} g_i(x)^2.</math>
Every form that is SOS is also a non-negative polynomial, and although the converse is not always true, Hilbert proved that for n = 2, 2m = 2, or n = 3 and 2m = 4 a form is SOS if and only if it is non-negative.{{cite journal|last1=Hilbert|first1=David|title=Ueber die Darstellung definiter Formen als Summe von Formenquadraten|journal=Mathematische Annalen|date=September 1888|volume=32|issue=3|pages=342–350|doi=10.1007/bf01443605|s2cid=177804714 |url=https://zenodo.org/record/1428214}} The same is also valid for the analogous problem on positive symmetric forms.{{cite journal|last1=Choi|first1=M. D.|last2=Lam|first2=T. Y.|title=An old question of Hilbert|journal=Queen's Papers in Pure and Applied Mathematics|date=1977|volume=46|pages=385–405}}{{cite journal|last1=Goel|first1=Charu|last2=Kuhlmann|first2=Salma|last3=Reznick|first3=Bruce|title=On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms|journal=Linear Algebra and Its Applications|date=May 2016|volume=496|pages=114–120|doi=10.1016/j.laa.2016.01.024|arxiv=1505.08145|s2cid=17579200 |author3-link=Bruce Reznick}}
Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.{{cite journal|last1=Lasserre|first1=Jean B.|title=Sufficient conditions for a real polynomial to be a sum of squares| journal=Archiv der Mathematik|volume=89|issue=5|pages=390–398|doi=10.1007/s00013-007-2251-y|url=http://www.optimization-online.org/DB_HTML/2007/02/1587.html|arxiv=math/0612358|year=2007|citeseerx=10.1.1.240.4438|s2cid=9319455 }}{{cite journal| last1=Powers|first1=Victoria|author1-link=Victoria Powers|last2=Wörmann|first2=Thorsten|title=An algorithm for sums of squares of real polynomials|journal=Journal of Pure and Applied Algebra|date=1998|volume=127|issue=1|pages=99–104| doi=10.1016/S0022-4049(97)83827-3| url=http://www.mathcs.emory.edu/~vicki/pub/sos.pdf|doi-access=free}} Moreover, every real nonnegative form can be approximated as closely as desired (in the <math>l_1</math>-norm of its coefficient vector) by a sequence of forms that are SOS.{{cite journal|last1=Lasserre|first1=Jean B.|title=A Sum of Squares Approximation of Nonnegative Polynomials|journal=SIAM Review|date=2007|volume=49|issue=4|pages=651–669|doi=10.1137/070693709|arxiv=math/0412398|bibcode=2007SIAMR..49..651L}}
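As an illustration, the binary quartic <math>x_1^4 - x_1^2 x_2^2 + x_2^4</math> admits the explicit decomposition <math>(x_1^2 - x_2^2)^2 + (x_1 x_2)^2</math>, i.e. an SOS representation with k = 2 and m = 2. A minimal symbolic check of this identity, assuming the SymPy library is available:

<syntaxhighlight lang="python">
# Hedged sketch: symbolically verify the SOS identity
# x1^4 - x1^2*x2^2 + x2^4 == (x1^2 - x2^2)^2 + (x1*x2)^2
# using SymPy (assumed available).
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

h = x1**4 - x1**2 * x2**2 + x2**4   # the form, degree 2m = 4
g1 = x1**2 - x2**2                  # forms of degree m = 2
g2 = x1 * x2

# expand() reduces both sides to canonical polynomial form
assert sp.expand(g1**2 + g2**2 - h) == 0
print("SOS decomposition verified")
</syntaxhighlight>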
Square matricial representation (SMR)
To establish whether a form {{math|h(x)}} is SOS amounts to solving a convex optimization problem. Indeed, any {{math|h(x)}} can be written as
<math>h(x) = x^{\{m\}\prime}\left(H + L(\alpha)\right)x^{\{m\}}</math>
where <math>x^{\{m\}}</math> is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying
<math>h(x) = x^{\{m\}\prime} H x^{\{m\}}</math>
and <math>L(\alpha)</math> is a linear parameterization of the linear space
<math>\mathcal{L} = \left\{ L = L' :\ x^{\{m\}\prime} L\, x^{\{m\}} = 0 \right\}.</math>
The dimension of the vector <math>x^{\{m\}}</math> is given by
<math>\sigma(n,m) = \binom{n+m-1}{m},</math>
whereas the dimension of the vector <math>\alpha</math> is given by
<math>\omega(n,2m) = \frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right) - \sigma(n,2m).</math>
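As a concrete instance of these formulas (the setting of the first example below), for n = 2 and 2m = 4 one gets
<math>\sigma(2,2) = \binom{3}{2} = 3, \qquad \omega(2,4) = \frac{1}{2}\cdot 3\cdot(1+3) - \binom{5}{4} = 6 - 5 = 1,</math>
so <math>x^{\{m\}}</math> contains the three monomials <math>x_1^2, x_1 x_2, x_2^2</math> and <math>L(\alpha)</math> depends on a single scalar parameter.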
Then, {{math|h(x)}} is SOS if and only if there exists a vector <math>\alpha</math> such that
<math>H + L(\alpha) \ge 0,</math>
meaning that the matrix <math>H + L(\alpha)</math> is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression <math>h(x) = x^{\{m\}\prime}\left(H + L(\alpha)\right)x^{\{m\}}</math> was introduced in {{cite conference |url=https://ieeexplore.ieee.org/document/7099515 |title=On convexification of some minimum distance problems |last1=Chesi |first1=G. |last2=Tesi |first2=A. |last3=Vicino |first3=A. |last4=Genesio |first4=R. |date=1999 |publisher=IEEE |book-title=Proceedings of the 5th European Control Conference |pages=1446–1451 |location=Karlsruhe, Germany}} with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as Gram matrix.{{cite conference |title=Sums of squares of real polynomials |last1=Choi |first1=M. |last2=Lam |first2=T. |last3=Reznick |first3=B. |date=1995 |book-title=Proceedings of Symposia in Pure Mathematics |pages=103–125 |url=https://www.researchgate.net/publication/240268385}}
= Examples =
- Consider the form of degree 4 in two variables <math>h(x) = x_1^4 - x_1^2 x_2^2 + x_2^4</math>. We have
<math>m = 2,\quad x^{\{m\}} = \begin{pmatrix} x_1^2 \\ x_1 x_2 \\ x_2^2 \end{pmatrix}\!,\quad H + L(\alpha) = \begin{pmatrix}
1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1
\end{pmatrix}\!.</math>
Since there exists α such that <math>H + L(\alpha) \ge 0</math>, for instance <math>\alpha_1 = 1</math> (indeed any <math>\alpha_1 \in [1/2,\,1]</math> works), it follows that h(x) is SOS.
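Numerically, the feasibility of this LMI at <math>\alpha_1 = 1</math> can be confirmed by checking the eigenvalues of the resulting constant matrix. A minimal sketch, assuming NumPy is available:

<syntaxhighlight lang="python">
# Hedged sketch: verify that H + L(alpha) is positive semidefinite
# at alpha_1 = 1 for the first example, using NumPy (assumed available).
import numpy as np

def smr_matrix(a1: float) -> np.ndarray:
    """H + L(alpha) for h(x) = x1^4 - x1^2 x2^2 + x2^4."""
    return np.array([
        [1.0,           0.0, -a1],
        [0.0, -1.0 + 2 * a1, 0.0],
        [-a1,           0.0, 1.0],
    ])

eigs = np.linalg.eigvalsh(smr_matrix(1.0))  # symmetric eigensolver
print(eigs)                                  # [0., 1., 2.] up to rounding
assert np.all(eigs >= -1e-9)                 # positive semidefinite
</syntaxhighlight>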
- Consider the form of degree 4 in three variables <math>h(x) = 2x_1^4 - 2.5x_1^3 x_2 + x_1^2 x_2 x_3 - 2x_1 x_3^3 + 5x_2^4 + x_3^4</math>. We have
<math>m = 2,\quad x^{\{m\}} = \begin{pmatrix} x_1^2 \\ x_1 x_2 \\ x_1 x_3 \\ x_2^2 \\ x_2 x_3 \\ x_3^2 \end{pmatrix}\!,\quad H + L(\alpha) = \begin{pmatrix}
2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\
-1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\
0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\
-\alpha_1&0&\alpha_4&5&0&-\alpha_6\\
-\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\
-\alpha_3&-\alpha_5&-1&-\alpha_6&0&1
\end{pmatrix}\!.</math>
Since there exists α such that <math>H + L(\alpha) \ge 0</math> (a feasible value can be found numerically, as sketched below), it follows that {{math|h(x)}} is SOS.
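A feasible α for this larger LMI can be searched for numerically. The following sketch, assuming the CVXPY library with an SDP-capable solver (e.g. SCS) is installed, builds <math>H + L(\alpha)</math> by reading the constant part and the α-basis matrices off the matrix displayed above, then runs the feasibility test:

<syntaxhighlight lang="python">
# Hedged sketch: LMI feasibility test for the second example,
# using CVXPY (assumed available, with an SDP-capable solver).
import cvxpy as cp
import numpy as np

# Constant part H: the matrix above evaluated at alpha = 0.
H = np.array([
    [ 2.00, -1.25,  0.0, 0.0, 0.0,  0.0],
    [-1.25,  0.00,  0.5, 0.0, 0.0,  0.0],
    [ 0.00,  0.50,  0.0, 0.0, 0.0, -1.0],
    [ 0.00,  0.00,  0.0, 5.0, 0.0,  0.0],
    [ 0.00,  0.00,  0.0, 0.0, 0.0,  0.0],
    [ 0.00,  0.00, -1.0, 0.0, 0.0,  1.0],
])

def sym(i, j, v):
    """Symmetric 6x6 matrix with entries (i,j) and (j,i) set to v."""
    L = np.zeros((6, 6))
    L[i, j] = L[j, i] = v
    return L

# Basis matrices L_i of L(alpha), read off the alpha entries above
# (each satisfies x{m}' L_i x{m} = 0 by construction).
L_basis = [
    sym(0, 3, -1) + sym(1, 1, 2),   # alpha_1: entries -a1 and 2*a1
    sym(0, 4, -1) + sym(1, 2, 1),   # alpha_2
    sym(0, 5, -1) + sym(2, 2, 2),   # alpha_3
    sym(1, 4, -1) + sym(2, 3, 1),   # alpha_4
    sym(1, 5, -1) + sym(2, 4, 1),   # alpha_5
    sym(3, 5, -1) + sym(4, 4, 2),   # alpha_6
]

a = cp.Variable(6)                  # alpha_1 ... alpha_6
M = H + sum(a[i] * L_basis[i] for i in range(6))
prob = cp.Problem(cp.Minimize(0), [M >> 0])   # pure feasibility LMI
prob.solve()
print(prob.status, a.value)
</syntaxhighlight>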
Generalizations
= Matrix SOS =
A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms <math>G_1(x), \ldots, G_k(x)</math> of degree m such that
<math>F(x) = \sum_{i=1}^{k} G_i(x)' G_i(x).</math>
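For example, with n = 2, r = 2 and m = 1, the matrix form
<math>F(x) = \begin{pmatrix} x_1^2 & x_1 x_2 \\ x_1 x_2 & x_2^2 \end{pmatrix} = G_1(x)'\,G_1(x), \qquad G_1(x) = \begin{pmatrix} x_1 & x_2 \end{pmatrix}\!,</math>
is SOS with a single matrix form of degree 1.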
== Matrix SMR ==
To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as
<math>F(x) = \left(x^{\{m\}} \otimes I_r\right)' \left(H + L(\alpha)\right) \left(x^{\{m\}} \otimes I_r\right)</math>
where <math>\otimes</math> is the Kronecker product of matrices, H is any symmetric matrix satisfying
<math>F(x) = \left(x^{\{m\}} \otimes I_r\right)' H \left(x^{\{m\}} \otimes I_r\right)</math>
and <math>L(\alpha)</math> is a linear parameterization of the linear space
<math>\mathcal{L} = \left\{ L = L' :\ \left(x^{\{m\}} \otimes I_r\right)' L \left(x^{\{m\}} \otimes I_r\right) = 0 \right\}.</math>
The dimension of the vector <math>\alpha</math> is given by
<math>\omega(n,2m,r) = \frac{1}{2} r \left( \sigma(n,m)\left(r\,\sigma(n,m) + 1\right) - (r+1)\,\sigma(n,2m) \right).</math>
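As a quick check of this formula under the definitions above, for r = 2, n = 2 and 2m = 2 one gets
<math>\sigma(2,1) = 2, \qquad \sigma(2,2) = 3, \qquad \omega(2,2,2) = \frac{1}{2}\cdot 2\left(2\,(2\cdot 2 + 1) - 3\cdot 3\right) = 1,</math>
consistent with counting directly: the symmetric 4×4 matrices form a 10-dimensional space, and the constraint above imposes <math>3 \cdot 3 = 9</math> independent linear equations, leaving one free parameter in <math>L(\alpha)</math>.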
Then, {{math|F(x)}} is SOS if and only if there exists a vector <math>\alpha</math> such that the following LMI holds:
<math>H + L(\alpha) \ge 0.</math>
The expression <math>F(x) = \left(x^{\{m\}} \otimes I_r\right)' \left(H + L(\alpha)\right) \left(x^{\{m\}} \otimes I_r\right)</math> was introduced in {{cite conference |title=Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions |last1=Chesi |first1=G. |last2=Garulli |first2=A. |last3=Tesi |first3=A. |last4=Vicino |first4=A. |date=2003 |publisher=IEEE |book-title=Proceedings of the 42nd IEEE Conference on Decision and Control |pages=4670–4675 |location=Maui, Hawaii | doi=10.1109/CDC.2003.1272307 }} in order to establish whether a matrix form is SOS via an LMI.
= Noncommutative polynomial SOS =
Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn.
By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form {{math|1=f = f<sup>T</sup>}}. If evaluating a symmetric noncommutative polynomial f at any n-tuple of real matrices of any dimension r × r always results in a positive semi-definite matrix, f is said to be matrix-positive.
A noncommutative polynomial f is SOS if there exist noncommutative polynomials <math>h_1(X), \ldots, h_k(X)</math> such that
<math>f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).</math>
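For instance, since T fixes the letters and reverses words,
<math>X_1^2 + X_1 X_2 + X_2 X_1 + X_2^2 = (X_1 + X_2)^T (X_1 + X_2)</math>
is SOS with k = 1, whereas the commutative-looking rearrangement <math>X_1^2 + 2X_1 X_2 + X_2^2</math> is not even symmetric, because <math>(X_1 X_2)^T = X_2 X_1</math>.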
Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.{{cite journal|last1=Helton|first1=J. William|title="Positive" Noncommutative Polynomials Are Sums of Squares|journal=The Annals of Mathematics|date=September 2002|volume=156|issue=2|pages=675–694|doi=10.2307/3597203|jstor=3597203}} Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.{{cite journal | last1=Burgdorf|first1=Sabine|last2=Cafuta|first2=Kristijan|last3=Klep|first3=Igor|last4=Povh|first4=Janez|title=Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials|journal=Computational Optimization and Applications|date=25 October 2012|volume=55|issue=1|pages=137–153|doi=10.1007/s10589-012-9513-8|citeseerx=10.1.1.416.543|s2cid=254416733 }}
References
{{Reflist}}