Hessenberg matrix

{{Short description|Kind of square matrix in linear algebra}}

In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal ({{harvtxt|Horn|Johnson|1985}}, page 28; {{harvtxt|Stoer|Bulirsch|2002}}, page 251). They are named after Karl Hessenberg (Biswa Nath Datta (2010), Numerical Linear Algebra and Applications, 2nd ed., Society for Industrial and Applied Mathematics (SIAM), {{ISBN|978-0-89871-685-6}}, p. 307).

A Hessenberg decomposition is a matrix decomposition of a matrix A into a unitary matrix P and a Hessenberg matrix H such that PHP^*=A where P^* denotes the conjugate transpose.

Definitions

=Upper Hessenberg matrix=

A square n \times n matrix A is said to be in upper Hessenberg form or to be an upper Hessenberg matrix if a_{i,j}=0 for all i,j with i > j+1.

An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if a_{i+1,i} \neq 0 for all i \in \{ 1,\ldots,n-1 \}.{{harvnb|Horn|Johnson|1985|p=35}}

=Lower Hessenberg matrix=

A square n \times n matrix A is said to be in lower Hessenberg form or to be a lower Hessenberg matrix if its transpose is an upper Hessenberg matrix or equivalently if a_{i,j}=0 for all i,j with j > i+1.

A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if a_{i,i+1} \neq 0 for all i \in \{ 1,\ldots,n-1 \}.

Examples

Consider the following matrices.

A=\begin{bmatrix}
1 & 4 & 2 & 3 \\
3 & 4 & 1 & 7 \\
0 & 2 & 3 & 4 \\
0 & 0 & 1 & 3
\end{bmatrix}

B=\begin{bmatrix}
1 & 2 & 0 & 0 \\
5 & 2 & 3 & 0 \\
3 & 4 & 3 & 7 \\
5 & 6 & 1 & 1
\end{bmatrix}

C=\begin{bmatrix}
1 & 2 & 0 & 0 \\
5 & 2 & 0 & 0 \\
3 & 4 & 3 & 7 \\
5 & 6 & 1 & 1
\end{bmatrix}

The matrix A is an unreduced upper Hessenberg matrix, B is an unreduced lower Hessenberg matrix, and C is a lower Hessenberg matrix that is not unreduced.
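The zero-pattern conditions in the definitions above can be checked mechanically. The following Python sketch (illustrative only; the function names are not standard) tests the three example matrices against them.

```python
def is_upper_hessenberg(M):
    # zero below the first subdiagonal: M[i][j] == 0 whenever i > j + 1
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

def is_lower_hessenberg(M):
    # zero above the first superdiagonal: M[i][j] == 0 whenever j > i + 1
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if j > i + 1)

def is_unreduced_lower_hessenberg(M):
    # lower Hessenberg with every superdiagonal entry nonzero
    n = len(M)
    return is_lower_hessenberg(M) and all(M[i][i + 1] != 0 for i in range(n - 1))

A = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]
B = [[1, 2, 0, 0], [5, 2, 3, 0], [3, 4, 3, 7], [5, 6, 1, 1]]
C = [[1, 2, 0, 0], [5, 2, 0, 0], [3, 4, 3, 7], [5, 6, 1, 1]]
```

Here A passes the upper Hessenberg test, B passes the unreduced lower Hessenberg test, and C is lower Hessenberg but fails the unreduced test because its superdiagonal contains a zero.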

Computer programming

Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps (for example, through a sequence of Householder unitary similarity transformations). Subsequent reduction of a Hessenberg matrix to a triangular matrix requires iterative procedures: in eigenvalue algorithms, this is done with shifted QR factorization combined with deflation steps. Reducing a general matrix to a Hessenberg matrix and then reducing further to a triangular matrix, instead of directly reducing a general matrix to a triangular matrix, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
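As a small illustration of why Hessenberg form helps here: one QR iteration H → RQ of an upper Hessenberg matrix needs only n − 1 Givens rotations and O(n²) work, and the result is again upper Hessenberg, so the structure is preserved across iterations. The following Python sketch (an unshifted single step on a real matrix, for illustration only; the function name is not standard) demonstrates this invariance.

```python
import math

def qr_step_hessenberg(H):
    """One unshifted QR iteration H -> RQ for a real upper Hessenberg H,
    using n-1 Givens rotations. Sketch for small dense matrices."""
    n = len(H)
    R = [row[:] for row in H]
    rotations = []
    # H = QR: zero each subdiagonal entry with a Givens rotation on the left
    for k in range(n - 1):
        a, b = R[k][k], R[k + 1][k]
        r = math.hypot(a, b)
        c, s = (1.0, 0.0) if r == 0 else (a / r, b / r)
        rotations.append((c, s))
        for j in range(k, n):
            rk, rk1 = R[k][j], R[k + 1][j]
            R[k][j] = c * rk + s * rk1
            R[k + 1][j] = -s * rk + c * rk1
    # RQ: apply the transposed rotations on the right, in the same order
    for k, (c, s) in enumerate(rotations):
        for i in range(n):
            rk, rk1 = R[i][k], R[i][k + 1]
            R[i][k] = c * rk + s * rk1
            R[i][k + 1] = -s * rk + c * rk1
    return R
```

Each rotation touches only two rows or columns, so a full step costs O(n²) instead of the O(n³) of a dense QR factorization, which is the main saving exploited by the QR algorithm.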

Reduction to Hessenberg matrix

=Householder transformations=

Any n \times n matrix can be transformed into a Hessenberg matrix by a similarity transformation using Householder transformations. The following procedure for such a transformation is adapted from A Second Course In Linear Algebra by Garcia & Horn.{{cite book |last1=Ramon Garcia |first1=Stephan |last2=Horn |first2=Roger |title=A Second Course In Linear Algebra |date=2017 |publisher=Cambridge University Press |isbn=9781107103818}}

Let A be any real or complex n \times n matrix, let A^\prime be the (n - 1) \times n submatrix of A constructed by removing the first row of A, and let \mathbf{a}^\prime_1 be the first column of A'. Construct the (n-1) \times (n-1) Householder matrix V_1 = I_{(n-1)} - 2\frac{ww^*}{\|w\|^2} where

w = \begin{cases}
\|\mathbf{a}^\prime_1\|_2\mathbf{e}_1 - \mathbf{a}^\prime_1, & a^\prime_{11} = 0 \\
\|\mathbf{a}^\prime_1\|_2\mathbf{e}_1 + \frac{\overline{a^\prime_{11}}}{\left|a^\prime_{11}\right|}\mathbf{a}^\prime_1, & a^\prime_{11} \neq 0
\end{cases}

This Householder matrix will map \mathbf{a}^\prime_1 to \|\mathbf{a}^\prime_1\| \mathbf{e}_1 and, as such, the block matrix U_1 = \begin{bmatrix}1 & \mathbf{0} \\ \mathbf{0} & V_1 \end{bmatrix} will map the matrix A to the matrix U_1A, which has only zeros below the second entry of the first column. Now construct the (n-2) \times (n-2) Householder matrix V_2 in a similar manner to V_1, such that V_2 maps the first column of A^{\prime\prime} to \|\mathbf{a}^{\prime\prime}_1\| \mathbf{e}_1, where A^{\prime\prime} is the submatrix of A^{\prime} constructed by removing the first row and the first column of A^{\prime}. Then let U_2 = \begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & V_2\end{bmatrix}, which maps U_1A to the matrix U_2U_1A, which has zeros below the subdiagonal in its first two columns. Now construct V_3 and then U_3 in a similar manner, but for the matrix A^{\prime\prime\prime} constructed by removing the first row and first column of A^{\prime\prime}, and proceed as in the previous steps. Continue like this for a total of n-2 steps.

By construction of U_k, the first k columns of any n \times n matrix are invariant under multiplication by U_k^* from the right. Hence, any matrix can be transformed to an upper Hessenberg matrix by a similarity transformation of the form U_{(n-2)}( \dots (U_2(U_1 A U_1^*)U_2^*) \dots )U_{(n-2)}^* = U_{(n-2)} \dots U_2U_1A(U_{(n-2)} \dots U_2U_1)^* = UAU^*.
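The procedure above can be sketched in code. The following Python implementation (real matrices only, for illustration; the complex case additionally needs the unimodular factor \overline{a^\prime_{11}}/|a^\prime_{11}| in the reflector) accumulates the Householder similarity transformations in place.

```python
import math

def hessenberg(A):
    """Reduce a real square matrix to upper Hessenberg form by Householder
    similarity transformations. Illustrative sketch for small dense matrices."""
    n = len(A)
    H = [row[:] for row in A]
    for k in range(n - 2):
        # Householder vector for the part of column k below the subdiagonal
        x = [H[i][k] for i in range(k + 1, n)]
        norm_x = math.sqrt(sum(t * t for t in x))
        if norm_x == 0.0:
            continue  # column already has the required zeros
        # choose the sign of alpha to avoid cancellation in v[0]
        alpha = -norm_x if x[0] >= 0 else norm_x
        v = x[:]
        v[0] -= alpha
        vnorm2 = sum(t * t for t in v)
        # H <- P H P with P = I - 2 v v^T / (v^T v) acting on rows/cols k+1..n-1
        for j in range(n):  # apply P from the left (rows k+1..n-1)
            dot = sum(v[i] * H[k + 1 + i][j] for i in range(n - k - 1))
            f = 2.0 * dot / vnorm2
            for i in range(n - k - 1):
                H[k + 1 + i][j] -= f * v[i]
        for i in range(n):  # apply P from the right (columns k+1..n-1)
            dot = sum(v[j] * H[i][k + 1 + j] for j in range(n - k - 1))
            f = 2.0 * dot / vnorm2
            for j in range(n - k - 1):
                H[i][k + 1 + j] -= f * v[j]
    return H
```

Because each step is an orthogonal similarity, the result has the same eigenvalues (and hence the same trace and Frobenius norm) as the input; the right multiplication never touches the columns already zeroed, which is why the zeros survive all n − 2 steps.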

=Jacobi (Givens) rotations=

A Jacobi rotation (also called Givens rotation) is an orthogonal matrix transformation in the form

:

A\to A'=J(p,q,\theta)^TAJ(p,q,\theta) \;,

where J(p,q,\theta), p < q, is the Jacobi rotation matrix with all matrix elements equal to zero except for

::\left\{\begin{align}
J(p,q,\theta)_{ii} &{}= 1 \; \forall i \ne p,q \\
J(p,q,\theta)_{pp} &{}= \cos(\theta) \\
J(p,q,\theta)_{qq} &{}= \cos(\theta) \\
J(p,q,\theta)_{pq} &{}= \sin(\theta) \\
J(p,q,\theta)_{qp} &{}= -\sin(\theta) \;.
\end{align}\right.

One can zero the matrix element A'_{p-1,q} by choosing the rotation angle \theta to satisfy the equation

:

A_{p-1,p}\sin\theta+A_{p-1,q}\cos\theta=0 \;.

Now, the sequence of such Jacobi rotations with the following (p,q)

:

(p,q)=(2,3),(2,4),\dots,(2,n),(3,4),\dots,(3,n),\dots,(n-1,n)

reduces the matrix A to the lower Hessenberg form.{{cite journal | arxiv=1501.07812 | doi=10.1016/j.laa.2015.08.026 | title=Quasiseparable Hessenberg reduction of real diagonal plus low rank matrices and applications | date=2016 | last1=Bini | first1=Dario A. | last2=Robol | first2=Leonardo | journal=Linear Algebra and Its Applications | volume=502 | pages=186–213 }}
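The rotation sequence can be sketched as follows in Python (real matrices, 0-based indices, for illustration only; the function name is not standard). Choosing \theta = \operatorname{atan2}(-A_{p-1,q}, A_{p-1,p}) satisfies the equation above, and each rotation leaves the previously created zeros intact.

```python
import math

def jacobi_reduce_lower_hessenberg(A):
    """Reduce a real square matrix to lower Hessenberg form with the
    rotation sequence (p,q) = (2,3),...,(n-1,n) (0-based here). Sketch."""
    n = len(A)
    H = [row[:] for row in A]
    for p in range(1, n - 1):
        for q in range(p + 1, n):
            # theta solving H[p-1][p]*sin(t) + H[p-1][q]*cos(t) = 0
            theta = math.atan2(-H[p - 1][q], H[p - 1][p])
            c, s = math.cos(theta), math.sin(theta)
            # right multiplication by J(p,q,theta): mixes columns p and q
            for i in range(n):
                hp, hq = H[i][p], H[i][q]
                H[i][p], H[i][q] = c * hp - s * hq, s * hp + c * hq
            # left multiplication by J(p,q,theta)^T: mixes rows p and q
            for j in range(n):
                hp, hq = H[p][j], H[q][j]
                H[p][j], H[q][j] = c * hp - s * hq, s * hp + c * hq
    return H
```

Since every step is an orthogonal similarity, the eigenvalues of the input are preserved; only the zero pattern changes.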

Properties

For n \in \{1, 2\}, it is vacuously true that every n \times n matrix is both upper Hessenberg and lower Hessenberg.[https://www.cs.cornell.edu/~bindel/class/cs6210-f16/lec/2016-10-21.pdf Lecture Notes. Notes for 2016-10-21] Cornell University

The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if A is upper Hessenberg and T is upper triangular, then AT and TA are upper Hessenberg.
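This closure follows from the zero patterns: (TA)_{ij} = \sum_k T_{ik}A_{kj} has nonzero terms only for i \le k \le j+1, which is empty when i > j+1, and similarly for AT. A small Python check (illustrative; the matrices below are hypothetical examples) confirms it numerically:

```python
def matmul(X, Y):
    # naive square matrix product, sufficient for this check
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_upper_hessenberg(M):
    # zero below the first subdiagonal
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

A = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]  # upper Hessenberg
T = [[2, 1, 0, 5], [0, 3, 1, 2], [0, 0, 4, 1], [0, 0, 0, 6]]  # upper triangular
```

Both products AT and TA pass the upper Hessenberg test, while a product of two Hessenberg matrices generally does not.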

A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which the Jacobi matrix is an important example. This class includes the symmetric and Hermitian Hessenberg matrices. A Hermitian matrix can be unitarily reduced to a real symmetric tridiagonal matrix.{{Cite web|title=Computational Routines (eigenvalues) in LAPACK | url=http://sites.science.oregonstate.edu/~landaur/nacphy/lapack/eigen.html | website=sites.science.oregonstate.edu | access-date=2020-05-24}}

Hessenberg operator

The Hessenberg operator is an infinite-dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for the space of square-integrable holomorphic functions over some domain (that is, a Bergman space). In this case, the Hessenberg operator is the right-shift operator S, given by

[Sf](z) = z f(z).

The eigenvalues of each principal submatrix of the Hessenberg operator are the zeros of the characteristic polynomial of that submatrix. These polynomials are called the Bergman polynomials, and they provide an orthogonal polynomial basis for the Bergman space.

References

  • {{Citation | last1=Horn | first1=Roger A. | last2=Johnson | first2=Charles R. | title=Matrix Analysis | publisher=Cambridge University Press | isbn=978-0-521-38632-6 | year=1985}}.
  • {{Citation | last1=Stoer | first1=Josef | last2=Bulirsch | first2=Roland | title=Introduction to Numerical Analysis | publisher=Springer-Verlag | location=Berlin, New York | edition=3rd | isbn=978-0-387-95452-3 | year=2002}}.
  • {{Citation | last1=Press | first1=WH | last2=Teukolsky | first2=SA | last3=Vetterling | first3=WT | last4=Flannery | first4=BP | year=2007 | title=Numerical Recipes: The Art of Scientific Computing | edition=3rd | publisher=Cambridge University Press | publication-place=New York | isbn=978-0-521-88068-8 | chapter=Section 11.6.2. Reduction to Hessenberg Form | chapter-url=http://apps.nrbook.com/empanel/index.html#pg=594 | access-date=2011-08-13 | archive-date=2011-08-11 | archive-url=https://web.archive.org/web/20110811154417/http://apps.nrbook.com/empanel/index.html#pg=594 | url-status=dead }}