Matrix theory

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Skeptical scientist (talk | contribs) at 13:14, 31 August 2008 (Applications: fixed link). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Matrix theory is a branch of mathematics that focuses on the study of matrices. Initially a sub-branch of linear algebra, it has grown to cover subjects related to graph theory, algebra, combinatorics, and statistics as well.

History

The term matrix was coined in 1848 by J. J. Sylvester as a name for an array of numbers. In 1855, Arthur Cayley introduced the matrix as a representation of a linear transformation. This period is considered the beginning of linear algebra and matrix theory. The motivation for linear algebra, and the first use of matrices, was the study of systems of linear equations. Related concepts such as the determinant and Gaussian elimination, which existed long before the introduction of matrices, are now part of matrix theory.

Applications

The study of vector spaces over finite fields, a branch of linear algebra that is useful in coding theory, naturally leads to the study and use of matrices over finite fields in coding theory.
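As a small sketch of how such matrices are used in coding theory, the example below encodes a message with a generator matrix for the (7,4) Hamming code, doing all arithmetic modulo 2 (i.e., over the finite field GF(2)). The particular matrix and message are illustrative choices, not taken from the article:

```python
import numpy as np

# A generator matrix for the (7,4) Hamming code over GF(2); the exact
# parity columns chosen here are one common convention among several.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def encode(message):
    """Encode a 4-bit message as a 7-bit codeword: row vector times G, mod 2."""
    return (np.array(message) @ G) % 2

codeword = encode([1, 0, 1, 1])
```

The matrix product is ordinary integer multiplication followed by reduction mod 2, which is exactly multiplication over GF(2) for 0/1 matrices.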

Modules are generalizations of vector spaces: they are similar to vector spaces but are defined over rings rather than fields. This leads to the study of matrices over rings. Matrix theory in this area is not often considered a branch of linear algebra. Among the results listed under Useful theorems, the Cayley–Hamilton theorem is valid whenever the underlying ring is commutative, and the Smith normal form is valid whenever the underlying ring is a principal ideal domain, but others are valid only for matrices over the real or complex numbers.
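The Cayley–Hamilton theorem states that a square matrix satisfies its own characteristic polynomial, p(A) = 0. A minimal numerical check for a 2×2 integer matrix, using the fact that its characteristic polynomial is t² − trace(A)·t + det(A):

```python
import numpy as np

# Verify Cayley-Hamilton for a small integer matrix.
A = np.array([[1, 2],
              [3, 4]])

# For a 2x2 matrix, p(t) = t^2 - trace(A) t + det(A).
trace = np.trace(A)                  # 5
det = round(np.linalg.det(A))        # -2
p_of_A = A @ A - trace * A + det * np.eye(2, dtype=int)
# p_of_A is the zero matrix, as the theorem predicts.
```

The same identity holds with entries in any commutative ring, since the computation uses only addition and multiplication of entries.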

Magic squares and Latin squares, two ancient branches of recreational mathematics, are now reformulated using the language of matrices. The link between Latin squares and coding theory demonstrates that this is not merely a coincidence.

With the advance of computer technology, it is now possible to solve large systems of linear equations in practice, not just in theory. John von Neumann and Herman Goldstine introduced condition numbers in their 1947 analysis of round-off errors. Later, various techniques for the calculation, multiplication, and factorization of matrices were invented, such as the fast Fourier transform.
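A condition number measures how sensitive the solution of a linear system is to small perturbations in the data. The sketch below (with an illustrative, nearly singular matrix) shows a large condition number and the correspondingly large change in the solution from a tiny change in the right-hand side:

```python
import numpy as np

# A nearly singular matrix has a large condition number: small
# perturbations of the right-hand side produce large solution changes.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
kappa = np.linalg.cond(A)   # ratio of largest to smallest singular value

b = np.array([2.0, 2.0001])
x = np.linalg.solve(A, b)                              # exact solution [1, 1]
x_pert = np.linalg.solve(A, b + np.array([0.0, 0.0001]))  # jumps to [0, 2]
```

Here a perturbation of size 1e-4 in b moves the solution by order 1, consistent with a condition number of roughly 4×10⁴.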

The payoff matrix in game theory, also introduced by John von Neumann, may have been the first application of matrices to economics.

The simplex algorithm, a technique involving operations on very large matrices, is used to solve problems in operations research, a field strongly related to economics. Flow network problems, part of both graph theory and linear programming, can be solved with the simplex algorithm, although more efficient methods exist. Matrices appear elsewhere in graph theory as well; for example, in the adjacency matrix representation of a directed or undirected graph. Important matrices in combinatorics include permutation matrices, which represent permutations, and Hadamard matrices.
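The adjacency matrix representation can be sketched on a small, hypothetical directed graph. Entry (i, j) of the matrix A is 1 when there is an edge from vertex i to vertex j, and entry (i, j) of the power Aᵏ counts the walks of length k from i to j:

```python
import numpy as np

# Adjacency matrix of a 3-vertex directed graph (illustrative edges).
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1        # edge from vertex i to vertex j

# Entry (i, j) of A^2 counts directed walks of length 2 from i to j.
walks_of_length_2 = np.linalg.matrix_power(A, 2)
```

For instance, the single length-2 walk from vertex 0 to vertex 2 is 0 → 1 → 2, and it appears as the entry (0, 2) of A².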

Both adjacency matrices of graphs and permutation matrices are examples of nonnegative matrices, which also include stochastic and doubly stochastic matrices. Stochastic matrices are useful in the study of stochastic processes, in probability theory and in statistics. Computing the dominant eigenvector of an enormous stochastic matrix is the central idea behind the PageRank algorithm used by Google. Every doubly stochastic matrix is a convex combination of permutation matrices (the Birkhoff–von Neumann theorem).
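A minimal sketch of the eigenvector idea behind PageRank (the link matrix and damping value below are illustrative, not Google's actual data or implementation): repeatedly multiplying a rank vector by a stochastic matrix drives it toward the dominant eigenvector, the stationary distribution of ranks.

```python
import numpy as np

# Column-stochastic link matrix of a toy 3-page web: column j spreads
# page j's rank evenly over the pages it links to.
A = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

d, n = 0.85, 3                              # damping factor (illustrative)
G = d * A + (1 - d) / n * np.ones((n, n))   # still column-stochastic

rank = np.full(n, 1.0 / n)   # start from a uniform rank vector
for _ in range(100):         # power iteration
    rank = G @ rank          # converges to the dominant eigenvector
```

Because G is column-stochastic, each multiplication preserves the total rank, so the vector remains a probability distribution throughout.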

Another important tool in statistics is the correlation matrix.

For optimization problems involving multivariable real-valued functions, positive-definite matrices occur in the search for maxima and minima.
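Concretely, the second-derivative test classifies a critical point by the definiteness of the Hessian matrix: positive definite means a local minimum. A sketch for the function f(x, y) = x² + 3y² (an illustrative choice with a constant Hessian), checking definiteness via eigenvalues:

```python
import numpy as np

# Hessian of f(x, y) = x^2 + 3y^2: the matrix of second partial derivatives.
H = np.array([[2.0, 0.0],
              [0.0, 6.0]])

# A symmetric matrix is positive definite iff all eigenvalues are positive,
# so the critical point at the origin is a local minimum.
eigenvalues = np.linalg.eigvalsh(H)
is_positive_definite = bool(np.all(eigenvalues > 0))
```

An equivalent check, often cheaper in practice, is whether a Cholesky factorization of the matrix succeeds.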

There are also practical uses for matrices over arbitrary rings (see Matrix ring). In particular, matrices over polynomial rings are used in control theory.

On the pure mathematics side, matrix rings can provide a rich source of counterexamples to mathematical conjectures, among other uses. Square matrices also play a special role, because the n×n matrices for a fixed n have many closure properties.

Useful theorems

See also

References