In [[mathematics]], a '''block matrix''' or a '''partitioned matrix''' is a [[Matrix (mathematics)|matrix]] that is interpreted as having been broken into sections called blocks, or submatrices.
== Example ==
A partitioned matrix can be viewed as a matrix of matrices. For example, take the 4×4 matrix ''P'':
:<math>P = \begin{bmatrix}
1 & 2 & 3 & 2\\
1 & 2 & 7 & 5\\
4 & 9 & 2 & 6\\
6 & 1 & 5 & 8\end{bmatrix}</math>
We can partition it into a 2×2 block matrix by grouping its entries into four 2×2 blocks:
:<math>P_{11} = \begin{bmatrix}
1 & 2 \\
1 & 2 \end{bmatrix}, P_{12} = \begin{bmatrix}
3 & 2\\
7 & 5\end{bmatrix}, P_{21} = \begin{bmatrix}
4 & 9 \\
6 & 1 \end{bmatrix}, P_{22} = \begin{bmatrix}
2 & 6\\
5 & 8\end{bmatrix}</math>
:<math>P_{\mathrm{partitioned}} = \begin{bmatrix}
P_{11} & P_{12}\\
P_{21} & P_{22}\end{bmatrix}</math>
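The partition above can be sketched in code. A minimal example using NumPy (the library choice and variable names are illustrative, not part of the article): the blocks are taken by slicing, and <code>numpy.block</code> reassembles them into the original matrix.

```python
import numpy as np

# The 4x4 matrix P from the example above
P = np.array([[1, 2, 3, 2],
              [1, 2, 7, 5],
              [4, 9, 2, 6],
              [6, 1, 5, 8]])

# Slice out the four 2x2 blocks P11, P12, P21, P22
P11, P12 = P[:2, :2], P[:2, 2:]
P21, P22 = P[2:, :2], P[2:, 2:]

# np.block stitches the blocks back into the partitioned matrix
P_partitioned = np.block([[P11, P12],
                          [P21, P22]])
assert (P_partitioned == P).all()
```

Reassembling the blocks recovers ''P'' exactly, which is the sense in which the partition loses no information.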
==Application==
In [[linear algebra]] terms, the use of a block matrix corresponds to thinking of a [[linear mapping]] in terms of corresponding 'bunches' of [[basis vector]]s. That in turn matches the idea of having distinguished [[direct sum]] decompositions of the [[domain (mathematics)|domain]] and [[range (mathematics)|range]]. It is particularly significant when a block is the zero matrix; that carries the information that one summand maps into a sub-sum.
Given the interpretation ''via'' linear mappings and direct sums, there is a special type of block matrix that occurs for square matrices (the case ''m'' = ''n''). For those we can assume an interpretation as an [[endomorphism]] of an ''n''-dimensional space ''V''; the block structure in which the bunching of rows and columns is the same is of importance because it corresponds to having a single direct sum decomposition on ''V'' (rather than two). In that case, for example, the [[diagonal]] blocks in the obvious sense are all square. This type of structure is required to describe the [[Jordan normal form]].
Block partitioning is used to cut down matrix calculations, for instance via column-row expansions, and appears in many [[computer science]] applications, including [[VLSI]] chip design. An example is the [[Strassen algorithm]] for fast [[matrix multiplication]], which recursively splits its operands into blocks.
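The key fact behind such algorithms is that matrix multiplication works blockwise: each block of the product follows the usual row-by-column rule, applied to blocks instead of scalars. A minimal sketch using NumPy (random test matrices and the 2×2 partition are illustrative assumptions; Strassen's actual trick of saving one block multiplication is not shown here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 10, (4, 4))
B = rng.integers(0, 10, (4, 4))

def split(M):
    # Partition a 2n x 2n matrix into four n x n blocks
    n = M.shape[0] // 2
    return M[:n, :n], M[:n, n:], M[n:, :n], M[n:, n:]

A11, A12, A21, A22 = split(A)
B11, B12, B21, B22 = split(B)

# Blockwise product: same formula as for 2x2 scalar matrices,
# with scalar products replaced by block products
C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])

assert (C == A @ B).all()  # agrees with the ordinary product
```

The final assertion checks that the blockwise result equals the ordinary product <code>A @ B</code>; this identity is what lets divide-and-conquer algorithms operate on blocks.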
[[Category:Mathematics]]