Polynomial matrix spectral factorization

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by NRRyder (talk | contribs) at 00:28, 23 May 2017 (First Draft for Polynomial Matrix Spectral Factorization: Included introduction, statement of theorem, and the outline of the newest elementary proof.). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Polynomial matrices are widely studied in control theory and have seen other uses relating to stable polynomials. They often arise in contexts where, viewed as functions from the reals to the scalar matrices, they take only positive definite values. Polynomial matrix spectral factorization provides a polynomial matrix decomposition for such polynomial matrices, and is meant to be the polynomial analogue of the Cholesky decomposition for scalar matrices. This result was originally proven by Wiener[1] in a more general context concerned with integrable matrix-valued functions that also have integrable log determinant. Because applications often only require the polynomial case, simpler proofs of this specific case have since appeared. This article follows the recent proof method of Lasha Ephremidze,[2] which relies only on elementary linear algebra and complex analysis.

Statement

Let $P(t)$ be an $m \times m$ polynomial matrix where each entry is a complex coefficient polynomial of degree at most $2N$. Suppose that for all real $t$ the matrix $P(t)$ is a positive definite Hermitian matrix. Then there exists an $m \times m$ polynomial matrix $Q(t)$ such that $P(t) = Q(t)Q(t)^*$ for all real $t$, where $Q(t)^*$ denotes the conjugate transpose. We can furthermore find $Q(t)$ which is nonsingular on the lower half plane.
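As a concrete numerical illustration of the statement (the matrices below are a hypothetical example chosen for this sketch, not taken from the article or its sources):

```python
import numpy as np

# Illustrative example: P(t) = [[t^2 + 1, t], [t, 1]] is positive definite
# Hermitian for every real t, and Q(t) = [[t, 1], [1, 0]] satisfies
# P(t) = Q(t) Q(t)^* on the real line, with det Q(t) = -1 (so Q is
# nonsingular everywhere, in particular on the lower half plane).
def P(t):
    return np.array([[t**2 + 1, t], [t, 1]], dtype=complex)

def Q(t):
    return np.array([[t, 1], [1, 0]], dtype=complex)

for t in np.linspace(-3, 3, 7):
    # the factorization holds at each sampled real t
    assert np.allclose(P(t), Q(t) @ Q(t).conj().T)
    # positive definiteness: both eigenvalues strictly positive
    assert np.all(np.linalg.eigvalsh(P(t)) > 0)
```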

Extension to Complex Inputs

Note that if $p(t) = \sum_k a_k t^k$ then $\overline{p(\bar{t})} = \sum_k \overline{a_k}\, t^k$. When $p$ is a complex coefficient polynomial or complex coefficient rational function, $\overline{p(\bar{t})}$ is also a polynomial or rational function in $t$, respectively. For real $t$ we have $Q(t)^* = \overline{Q(\bar{t})}^T$, so that $P(t) = Q(t)\overline{Q(\bar{t})}^T$ on the real line. Since the entries of $P(t)$ and $Q(t)\overline{Q(\bar{t})}^T$ are complex polynomials which agree on the real line, they are in fact the same polynomials. We can conclude that the identity $P(t) = Q(t)\overline{Q(\bar{t})}^T$ in fact holds for all complex inputs $t$.
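The first observation can be checked numerically; the polynomial below is an arbitrary illustrative choice:

```python
import numpy as np

# For a complex coefficient polynomial p, the function t -> conj(p(conj(t)))
# is itself a polynomial: the one whose coefficients are the conjugates of p's.
coeffs = np.array([1 + 2j, -3j, 0.5, 2 - 1j])      # p, highest degree first
p = lambda t: np.polyval(coeffs, t)
p_tilde = lambda t: np.polyval(coeffs.conj(), t)   # coefficients conjugated

# the identity conj(p(conj(t))) == p_tilde(t) holds at every complex t
for t in [0.3, 1 - 2j, -4 + 1j]:
    assert np.isclose(np.conj(p(np.conj(t))), p_tilde(t))

# on the real line, p_tilde is just the pointwise conjugate of p
for t in [-1.0, 0.0, 2.5]:
    assert np.isclose(p_tilde(t), np.conj(p(t)))
```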

Cholesky Decomposition

The inspiration for this result is a factorization which characterizes positive definite matrices.

Decomposition for Scalar Matrices

Given any positive definite scalar matrix $A$, Cholesky decomposition guarantees that $A = LL^*$ where $L$ is a lower triangular matrix. If we do not restrict to lower triangular matrices, we can consider all factorizations of the form $A = MM^*$. It is not hard to check that all such factorizations are achieved by looking at the orbit of $L$ under right multiplication by a unitary matrix, $M = LU$.

To obtain the lower triangular decomposition we induct by splitting off the first row and first column:

$$\begin{pmatrix} a_{11} & \mathbf{a}^* \\ \mathbf{a} & \tilde{A} \end{pmatrix} = \begin{pmatrix} \ell_{11} & 0 \\ \boldsymbol{\ell} & \tilde{L} \end{pmatrix} \begin{pmatrix} \overline{\ell_{11}} & \boldsymbol{\ell}^* \\ 0 & \tilde{L}^* \end{pmatrix}$$

Solving these equations in terms of $a_{11}$, $\mathbf{a}$, $\tilde{A}$ we get

$$\ell_{11} = \sqrt{a_{11}}, \qquad \boldsymbol{\ell} = \frac{1}{\sqrt{a_{11}}}\,\mathbf{a}, \qquad \tilde{L}\tilde{L}^* = \tilde{A} - \frac{1}{a_{11}}\,\mathbf{a}\mathbf{a}^*$$

Since $A$ is positive definite, $a_{11}$ is a positive real number, so it has a square root. The last condition follows by induction, since the right hand side is the Schur complement of $a_{11}$, which is itself positive definite.
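The induction above translates directly into a recursive algorithm. The following is a sketch (the helper name is ours, not from the article):

```python
import numpy as np

def cholesky_by_schur(A):
    """Lower triangular L with A = L L*, by the first-row/column induction:
    l11 = sqrt(a11), l = a / sqrt(a11), then recurse on the Schur complement."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    if n == 1:
        return np.array([[np.sqrt(A[0, 0].real)]], dtype=complex)
    a11 = A[0, 0].real                 # positive since A is positive definite
    a = A[1:, 0]                       # first column below the diagonal
    l11 = np.sqrt(a11)
    schur = A[1:, 1:] - np.outer(a, a.conj()) / a11   # Schur complement of a11
    L = np.zeros_like(A)
    L[0, 0] = l11
    L[1:, 0] = a / l11
    L[1:, 1:] = cholesky_by_schur(schur)              # induction step
    return L

# sanity check on a random positive definite Hermitian matrix
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M @ M.conj().T + 4 * np.eye(4)
L = cholesky_by_schur(A)
assert np.allclose(L, np.tril(L))          # L is lower triangular
assert np.allclose(L @ L.conj().T, A)      # A = L L*
```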

Decomposition for Rational Matrices

Now consider $P(t) = \big(p_{ij}(t)\big)$ where the $p_{ij}$ are complex rational functions and $P(t)$ is positive definite Hermitian for almost all real $t$. Then by the symmetric Gaussian elimination we performed above, all we need to show is that there exists a rational $q(t)$ such that $q(t)\overline{q(t)} = p_{11}(t)$ for real $t$. Once we have that, we can solve for the rest of the first column of $Q(t)$. Since the Schur complement of $p_{11}(t)$ is positive definite for real $t$ away from the poles and the Schur complement is a rational matrix, we can induct to find the remaining entries of $Q(t)$.

Square-Roots of Positive Rational Functions

To prove that there exists a rational $q$ with $q(t)\overline{q(t)} = r(t)$ for real $t$, we write $r(t) = c\,\frac{\prod_i (t - \alpha_i)}{\prod_j (t - \beta_j)}$ where the $\alpha_i$, $\beta_j$ are the roots and poles of $r$. Since $P(t)$ is positive definite Hermitian for almost all real $t$, we have that $r(t)$ is positive real for almost all real $t$. Letting $t \to \infty$, we can conclude that $c$ is real and positive. The numerator and denominator have distinct sets of roots, so all real roots which show up in either must have even multiplicity (otherwise $r$ would change sign at such a root). We can divide out these real roots to reduce to the case where $r$ has only complex roots and poles. By hypothesis we have $\overline{r(\bar{t})} = r(t)$. Since all of the $\alpha_i$, $\beta_j$ are complex (and hence not fixed points of conjugation), they both come in conjugate pairs. Choosing one representative from each conjugate pair, say those in the upper half plane, and taking $q(t) = \sqrt{c}\,\frac{\prod_i (t - \alpha_i)}{\prod_j (t - \beta_j)}$ over those representatives then gives $q(t)\overline{q(t)} = r(t)$ for real $t$.
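The root-splitting construction can be sketched numerically for the purely polynomial case; the example polynomial here is an illustrative choice, not from the article:

```python
import numpy as np

# p(t) = t^4 + 3 t^2 + 2 = (t^2 + 1)(t^2 + 2) is positive for all real t and
# has no real roots, so its roots come in conjugate pairs. Taking one root
# from each pair -- here, those in the upper half plane -- gives a q with
# q(t) conj(q(t)) = p(t) on the real line.
p = np.array([1, 0, 3, 0, 2], dtype=complex)    # coefficients, highest first
roots = np.roots(p)
upper = roots[roots.imag > 0]                   # one representative per pair
q = np.poly(upper) * np.sqrt(p[0].real)         # scale by sqrt of leading coeff

for t in np.linspace(-2, 2, 9):
    qt = np.polyval(q, t)
    assert np.isclose(qt * np.conj(qt), np.polyval(p, t))
```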

Extension to Polynomial Decompositions

To prove the existence of polynomial matrix spectral factorization, we begin with the rational matrix Cholesky decomposition above and modify it to remove the poles. Namely, given $P(t)$ with each entry a complex coefficient polynomial, we have a rational matrix $Q(t)$ with $P(t) = Q(t)Q(t)^*$ for real $t$. Given any rational matrix $U(t)$ which is unitary valued for real $t$, there exists another decomposition $P(t) = \big(Q(t)U(t)\big)\big(Q(t)U(t)\big)^*$.

Removing Lower Half Plane Poles

If any entry of $Q(t)$ has a pole $\alpha$ in the lower half plane, we multiply the corresponding column by $\frac{t - \alpha}{t - \bar{\alpha}}$. For real $t$ we have $\left|\frac{t - \alpha}{t - \bar{\alpha}}\right| = 1$, since the numerator and denominator are complex conjugates of each other. This cancels the pole at $\alpha$ while only introducing a pole at $\bar{\alpha}$ in the upper half plane; repeating removes all lower half plane poles.
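A quick numerical check of why this works, on a toy scalar entry (not from the article):

```python
import numpy as np

alpha = 2 - 3j                          # a pole in the lower half plane
q = lambda t: 1 / (t - alpha)           # toy entry with a pole at alpha
blaschke = lambda t: (t - alpha) / (t - np.conj(alpha))
q_fixed = lambda t: q(t) * blaschke(t)  # = 1/(t - conj(alpha)): pole now at 2+3j

for t in np.linspace(-5, 5, 11):
    # on the real line, numerator and denominator are conjugates: |factor| = 1
    assert np.isclose(abs(blaschke(t)), 1.0)
    # so absolute values (and hence Q Q* contributions) are preserved
    assert np.isclose(abs(q_fixed(t)), abs(q(t)))
```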

Removing Lower Half Plane Singularities

If $\det Q(a) = 0$ for some $a$ in the lower half plane, then since $Q(a)$ is singular there exists a unit vector $v$ with $Q(a)v = 0$; taking $U$ to be a scalar unitary matrix with first column $v$, the first column of $Q(t)U$ vanishes at $a$. To remove the singularity at $a$ we multiply $Q(t)U$ on the right by $\operatorname{diag}\left(\frac{t - \bar{a}}{t - a}, 1, \ldots, 1\right)$. The result has determinant with one less zero (by multiplicity) at $a$, without introducing any poles in the lower half plane in any of the entries, since the first column already vanishes at $a$.
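A minimal numerical sketch of this step, with a hypothetical matrix whose first column already vanishes at the singularity (so the unitary $U$ is the identity):

```python
import numpy as np

# Q(t) = [[t + 1j, 0], [0, 1]] has det Q(a) = 0 at a = -1j in the lower half
# plane. Multiplying the first column by (t - conj(a))/(t - a) moves that zero
# of the determinant to conj(a) = +1j while leaving Q(t) Q(t)^* unchanged
# for real t.
a = -1j

def Q(t):
    return np.array([[t - a, 0], [0, 1]], dtype=complex)

def Q_fixed(t):
    B = Q(t).copy()
    B[:, 0] *= (t - np.conj(a)) / (t - a)   # first column: (t - a) -> (t - conj(a))
    return B

for t in np.linspace(-3, 3, 7):
    # the product Q Q* is unchanged on the real line
    assert np.allclose(Q(t) @ Q(t).conj().T, Q_fixed(t) @ Q_fixed(t).conj().T)

# the determinant's zero has moved from a = -1j to conj(a) = +1j
assert np.isclose(np.linalg.det(Q_fixed(np.conj(a))), 0)
```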

Extend Analyticity to all of $\mathbb{C}$

After these modifications, the decomposition $P(t) = Q(t)Q(t)^*$ satisfies the property that $Q(t)$ is analytic and invertible on the lower half plane. From our observation in Extension to Complex Inputs, we have

$$P(t) = Q(t)\,\overline{Q(\bar{t})}^T$$

for all complex numbers $t$. This implies $\overline{Q(\bar{t})}^T = Q(t)^{-1}P(t)$. Since $\det Q(t)$ has no zeroes in the lower half plane, by the adjugate matrix formula $Q(t)^{-1} = \operatorname{adj}(Q(t))/\det Q(t)$ is analytic in the lower half plane. Thus the right hand side is analytic on the lower half plane, which implies $Q(t)$ is analytic on the upper half plane. Finally, if $Q(t)$ has a pole at a point of the real line then $\overline{Q(\bar{t})}^T$ has the same pole there, and the most singular Laurent terms cannot cancel in the product; this contradicts the fact that $P(t)$ has no poles on the real line (it is analytic everywhere by hypothesis).

The above shows that if $Q(t)$ is analytic and invertible on the lower half plane, then $Q(t)$ is in fact analytic everywhere, and hence a polynomial matrix.

Uniqueness

Given two polynomial matrix decompositions $P(t) = Q_1(t)Q_1(t)^* = Q_2(t)Q_2(t)^*$ which are invertible on the lower half plane, we rearrange to $Q_2(t)^{-1}Q_1(t) = Q_2(t)^*\big(Q_1(t)^*\big)^{-1}$. Since $Q_1(t)$ is analytic on the lower half plane and nonsingular there, $U(t) = Q_2(t)^{-1}Q_1(t)$ is a rational matrix which is analytic and invertible on the lower half plane. Then by the same argument as above, $U(t)$ is in fact a polynomial matrix which is unitary for all real $t$. This means that if $u_{ij}(t)$ are the entries of the $i$th row of $U(t)$, then $\sum_j u_{ij}(t)\overline{u_{ij}(t)} = 1$. For real $t$ this is a sum of non-negative polynomials which sums to a constant, implying each of the summands is in fact a constant polynomial. Then $Q_1(t) = Q_2(t)U$ where $U$ is a scalar unitary matrix.

References

  1. ^ Wiener, N.; Masani, P. "The prediction theory of multivariate stochastic processes". Acta Mathematica. 98: 111–150.
  2. ^ Ephremidze, Lasha. "An Elementary Proof of the Polynomial Matrix Spectral Factorization Theorem". Proceedings of the Royal Society of Edinburgh, Section A. 144: 747–751.