(This article attempts to provide a transition from basic vector algebra to tensor algebra.)
In geometry and linear algebra, a Cartesian tensor is a tensor in Euclidean space represented in the standard basis. In 3d, Cartesian coordinates are employed (hence the name), and the generalization to higher dimensions is straightforward. Dyadic tensors were historically the first approach to formulating second-order tensors; Cartesian tensors are more general.
First order Cartesian tensors in 3d
Basis vectors in 3d
In 3d the standard basis is (ex, ey, ez) = (e1, e2, e3). Each basis vector points along the x, y, and z axes, and the vectors are all unit vectors (or "normalized"), so the basis is orthonormal.
Dot product and Kronecker delta
For the dot product,
<math>\mathbf{e}_x\cdot\mathbf{e}_x=\mathbf{e}_y\cdot\mathbf{e}_y=\mathbf{e}_z\cdot\mathbf{e}_z=1</math>
while
<math>\mathbf{e}_x\cdot\mathbf{e}_y=\mathbf{e}_y\cdot\mathbf{e}_z=\mathbf{e}_z\cdot\mathbf{e}_x=0</math>
which can be summarized by
<math>\mathbf{e}_i\cdot\mathbf{e}_j=\delta_{ij}=\begin{cases}1 & i=j\\ 0 & i\neq j\end{cases}</math>
where i and j are placeholders for x, y, z. This relation is so important and useful it is given a name: the Kronecker delta.
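The Kronecker delta can be checked directly in code; the following sketch (assuming NumPy is available; the variable names are illustrative) tabulates the dot products of the standard basis:

```python
import numpy as np

# The standard orthonormal basis of R^3, one basis vector per row.
e = np.eye(3)

# Tabulating every dot product e_i . e_j gives exactly the
# Kronecker delta, i.e. the entries of the identity matrix.
delta = np.array([[np.dot(e[i], e[j]) for j in range(3)]
                  for i in range(3)])

print(np.array_equal(delta, np.eye(3)))  # True
```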
Cross product and the Levi-Civita symbol
For the cross product it is (almost) the other way round:
<math>\mathbf{e}_x\times\mathbf{e}_y=\mathbf{e}_z,\quad \mathbf{e}_y\times\mathbf{e}_z=\mathbf{e}_x,\quad \mathbf{e}_z\times\mathbf{e}_x=\mathbf{e}_y</math>
while
<math>\mathbf{e}_x\times\mathbf{e}_x=\mathbf{e}_y\times\mathbf{e}_y=\mathbf{e}_z\times\mathbf{e}_z=\mathbf{0}</math>
which can be summarized by
<math>\mathbf{e}_i\times\mathbf{e}_j=\begin{cases}+\mathbf{e}_k & \text{cyclic permutations of }i,j\in\left\{x,y,z\right\}:i\neq j\\ -\mathbf{e}_k & \text{anticyclic permutations of }i,j\in\left\{x,y,z\right\}:i\neq j\\ \mathbf{0} & i=j\end{cases}</math>
where again i and j are placeholders for x, y, z, and k denotes the remaining index. The cyclic permutations of <math>i,j\in\left\{x,y,z\right\}:i\neq j</math> are (x, y), (y, z), (z, x), and the anticyclic permutations are (y, x), (z, y), (x, z).
Similarly, for the scalar triple products of the basis vectors,
<math>\mathbf{e}_i\cdot(\mathbf{e}_j\times\mathbf{e}_k)=\varepsilon_{ijk}=\begin{cases}+1 & \text{cyclic permutations of }i,j,k\in\left\{x,y,z\right\}\\ -1 & \text{anticyclic permutations of }i,j,k\in\left\{x,y,z\right\}\\ 0 & \text{otherwise}\end{cases}</math>
Again this is so important and useful it is given a name: the Levi-Civita symbol. The cyclic permutations of <math>i,j,k\in\left\{x,y,z\right\}</math> are (x, y, z), (y, z, x), (z, x, y), and the anticyclic permutations are (z, y, x), (y, x, z), (x, z, y).
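The cross-product table can be verified numerically. The following sketch (assuming NumPy is available) builds the Levi-Civita symbol as a 3×3×3 array and checks the summary relation for every pair of basis vectors:

```python
import numpy as np

# Levi-Civita symbol as a 3x3x3 array: +1 on cyclic index triples,
# -1 on anticyclic triples, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = +1.0   # cyclic permutations
    eps[k, j, i] = -1.0   # anticyclic permutations

e = np.eye(3)  # standard basis e_1, e_2, e_3

# Check e_i x e_j = eps_ijk e_k for every pair of basis vectors.
for i in range(3):
    for j in range(3):
        lhs = np.cross(e[i], e[j])
        rhs = np.einsum('k,kl->l', eps[i, j], e)
        assert np.allclose(lhs, rhs)

print("e_i x e_j = eps_ijk e_k verified")
```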
A scalar is a quantity which does not change from one coordinate system to another; it is invariant under a change of coordinates.
The standard example is the rotation of a vector. The position vector x should take the same form in any coordinate system. In one coordinate system it has components <math>x_i</math> and basis <math>\mathbf{e}_i</math>, so
<math>\mathbf{x}=x_i\mathbf{e}_i</math>
In another coordinate system it has components <math>\bar{x}_i</math> and basis <math>\bar{\mathbf{e}}_i</math>, so
<math>\mathbf{x}=\bar{x}_i\bar{\mathbf{e}}_i</math>
The new coordinates depend on the old,
<math>\bar{x}_j=\bar{x}_j(x_i)</math>
and the new basis depends on the old,
<math>\bar{\mathbf{e}}_j=\bar{\mathbf{e}}_j(\mathbf{e}_i)</math>
for each i, j.
The components can be linearly transformed by
<math>\bar{x}_i=L_{ij}x_j</math>
where <math>L_{ij}</math> represents the transformation matrix, and the basis by
<math>\bar{\mathbf{e}}_j=\left(L_{ij}\right)^{-1}\mathbf{e}_i</math>
where <math>\left(L_{ij}\right)^{-1}</math> denotes the entries of the inverse matrix of <math>L_{ij}</math>, so that
<math>\mathbf{x}=\bar{x}_j\bar{\mathbf{e}}_j=x_i\mathbf{e}_i</math>
Exactly the same is necessarily true for any vector <math>\mathbf{a}=a_i\mathbf{e}_i</math> to be invariant under a change of coordinate system. If a does not transform according to this rule, it is not a vector.
The linear transformation can be interpreted as a matrix: <math>L_{ij}</math> are the entries of the matrix (the row number is i and the column number is j).
If L is an orthogonal transformation (orthogonal matrix), there are considerable simplifications: the matrix transpose is the inverse (by definition),
<math>\mathbf{L}^{-1}=\mathbf{L}^{\mathrm{T}}</math>
and moreover det(L) = ±1: (+1) for rotations and (−1) for reflections. To illustrate all possible symmetries in the indices:
<math>L_{ij}=\left(\mathbf{L}^{\mathrm{T}}\right)_{ji}=\left(\mathbf{L}^{-1}\right)_{ji}</math>
The inverse transformation of coordinates is simply
<math>x_i=L_{ji}\bar{x}_j</math>
and the inverse transformation of the basis is:
<math>\mathbf{e}_i=L_{ji}\bar{\mathbf{e}}_j</math>
The components of L are partial derivatives of the new coordinates with respect to the old. Differentiating <math>\bar{x}_i</math> with respect to <math>x_k</math>,
<math>\frac{\partial\bar{x}_i}{\partial x_k}=L_{ij}\frac{\partial x_j}{\partial x_k}=L_{ij}\delta_{jk}=L_{ik}</math>
so
<math>L_{ij}=\frac{\partial\bar{x}_i}{\partial x_j}</math>
is an element of the Jacobian matrix
<math>\mathbf{L}=\frac{\partial(\bar{x}_1,\bar{x}_2,\bar{x}_3)}{\partial(x_1,x_2,x_3)}</math>
(the first index i refers to the top of the derivative, the second j to the bottom). Many sources state transformations in terms of contractions with this partial derivative.
Conversely, differentiating <math>x_i</math> with respect to <math>\bar{x}_j</math>:
<math>\frac{\partial x_i}{\partial\bar{x}_j}=\left(L^{-1}\right)_{ik}\frac{\partial\bar{x}_k}{\partial\bar{x}_j}=\left(L^{-1}\right)_{ik}\delta_{kj}=\left(L^{-1}\right)_{ij}</math>
so
<math>\left(L^{-1}\right)_{ij}=\frac{\partial x_i}{\partial\bar{x}_j}</math>
is an element of the inverse Jacobian matrix
<math>\mathbf{L}^{-1}=\frac{\partial(x_1,x_2,x_3)}{\partial(\bar{x}_1,\bar{x}_2,\bar{x}_3)}</math>
Since L is orthogonal:
<math>\left(L^{-1}\right)_{ij}=L_{ji}=\frac{\partial\bar{x}_j}{\partial x_i}</math>
(note that, via the transpose, the first index i now appears on the bottom of the derivative and the second index j on the top).
Contracting them gives the Kronecker delta:
<math>\frac{\partial\bar{x}_i}{\partial x_k}\frac{\partial x_k}{\partial\bar{x}_j}=L_{ik}\left(L^{-1}\right)_{kj}=\delta_{ij}</math>
also
<math>\frac{\partial x_i}{\partial\bar{x}_k}\frac{\partial\bar{x}_k}{\partial x_j}=\left(L^{-1}\right)_{ik}L_{kj}=\delta_{ij}</math>
which parallels the matrix multiplication of the Jacobian and its inverse:
<math>\mathbf{L}\mathbf{L}^{-1}=\mathbf{L}^{-1}\mathbf{L}=\mathbf{I}</math>
As a special case,
<math>\frac{\partial x_i}{\partial x_j}=\delta_{ij}</math>
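These orthogonality and contraction identities can be checked with a concrete rotation matrix; the following sketch assumes NumPy, and the angle and axis are arbitrary illustrative choices:

```python
import numpy as np

# A concrete orthogonal transformation: rotation by t about the z axis.
# (The angle and axis are illustrative choices.)
t = 0.3
L = np.array([[ np.cos(t), np.sin(t), 0.0],
              [-np.sin(t), np.cos(t), 0.0],
              [ 0.0,       0.0,       1.0]])

# Orthogonality: the transpose is the inverse.
assert np.allclose(L.T @ L, np.eye(3))
assert np.allclose(L @ L.T, np.eye(3))

# det(L) = +1 for a rotation; a reflection would give -1.
assert np.isclose(np.linalg.det(L), 1.0)

# Contracting the Jacobian with its inverse gives the Kronecker delta,
# mirroring L_ik (L^-1)_kj = delta_ij.
assert np.allclose(L @ np.linalg.inv(L), np.eye(3))

print("L is orthogonal with det(L) = +1")
```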
As with all linear transformations, L depends on the basis chosen. Since <math>\mathbf{e}_i\cdot\mathbf{e}_j=\delta_{ij}</math> and <math>\bar{\mathbf{e}}_i\cdot\bar{\mathbf{e}}_j=\delta_{ij}</math>, for two orthonormal bases:
- projecting x onto the <math>\bar{x}</math> axes: <math>\bar{x}_i=\bar{\mathbf{e}}_i\cdot\mathbf{x}=\left(\bar{\mathbf{e}}_i\cdot\mathbf{e}_j\right)x_j</math>,
- projecting x onto the x axes: <math>x_i=\mathbf{e}_i\cdot\mathbf{x}=\left(\mathbf{e}_i\cdot\bar{\mathbf{e}}_j\right)\bar{x}_j</math>.
Hence the components reduce to direction cosines between the <math>\bar{x}_i</math> and <math>x_j</math> axes:
<math>L_{ij}=\bar{\mathbf{e}}_i\cdot\mathbf{e}_j=\cos\theta_{ij}</math>
<math>\left(L^{-1}\right)_{ij}=L_{ji}=\mathbf{e}_i\cdot\bar{\mathbf{e}}_j=\cos\theta_{ji}</math>
where <math>\theta_{ij}</math> is the angle between the <math>\bar{x}_i</math> and <math>x_j</math> axes.
NB: in general <math>\theta_{ij}\neq\theta_{ji}</math>, and hence <math>L_{ij}\neq L_{ji}</math>, unless the basis vectors are identical: <math>\mathbf{e}_i=\bar{\mathbf{e}}_i</math>.
Accumulating all the results obtained, the transformation components are
<math>L_{ij}=\frac{\partial\bar{x}_i}{\partial x_j}=\bar{\mathbf{e}}_i\cdot\mathbf{e}_j=\cos\theta_{ij}</math>
in matrix form
<math>\mathbf{L}=\begin{pmatrix}\frac{\partial\bar{x}_1}{\partial x_1} & \frac{\partial\bar{x}_1}{\partial x_2} & \frac{\partial\bar{x}_1}{\partial x_3}\\ \frac{\partial\bar{x}_2}{\partial x_1} & \frac{\partial\bar{x}_2}{\partial x_2} & \frac{\partial\bar{x}_2}{\partial x_3}\\ \frac{\partial\bar{x}_3}{\partial x_1} & \frac{\partial\bar{x}_3}{\partial x_2} & \frac{\partial\bar{x}_3}{\partial x_3}\end{pmatrix}=\begin{pmatrix}\bar{\mathbf{e}}_1\cdot\mathbf{e}_1 & \bar{\mathbf{e}}_1\cdot\mathbf{e}_2 & \bar{\mathbf{e}}_1\cdot\mathbf{e}_3\\ \bar{\mathbf{e}}_2\cdot\mathbf{e}_1 & \bar{\mathbf{e}}_2\cdot\mathbf{e}_2 & \bar{\mathbf{e}}_2\cdot\mathbf{e}_3\\ \bar{\mathbf{e}}_3\cdot\mathbf{e}_1 & \bar{\mathbf{e}}_3\cdot\mathbf{e}_2 & \bar{\mathbf{e}}_3\cdot\mathbf{e}_3\end{pmatrix}=\begin{pmatrix}\cos\theta_{11} & \cos\theta_{12} & \cos\theta_{13}\\ \cos\theta_{21} & \cos\theta_{22} & \cos\theta_{23}\\ \cos\theta_{31} & \cos\theta_{32} & \cos\theta_{33}\end{pmatrix},\quad\mathbf{L}^{-1}=\mathbf{L}^{\mathrm{T}}</math>
The transformation of components can be fully written:
<math>\bar{x}_i=L_{ij}x_j=\frac{\partial\bar{x}_i}{\partial x_j}x_j=\left(\bar{\mathbf{e}}_i\cdot\mathbf{e}_j\right)x_j=\cos\theta_{ij}\,x_j</math>
similarly for the inverse:
<math>x_i=L_{ji}\bar{x}_j=\frac{\partial\bar{x}_j}{\partial x_i}\bar{x}_j=\left(\bar{\mathbf{e}}_j\cdot\mathbf{e}_i\right)\bar{x}_j=\cos\theta_{ji}\,\bar{x}_j</math>
The geometric interpretation is that each component <math>\bar{x}_i</math> is the sum of the projections of the <math>x_j</math> components onto the <math>\bar{x}_i</math> axis.
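The direction-cosine construction can be illustrated numerically; in the following NumPy sketch the two bases and the test vector are illustrative choices:

```python
import numpy as np

t = np.pi / 6
# Old basis: the standard one.  New basis: rotated by t about the z axis.
# (The specific bases are illustrative choices.)
e = np.eye(3)
ebar = np.array([[ np.cos(t), np.sin(t), 0.0],
                 [-np.sin(t), np.cos(t), 0.0],
                 [ 0.0,       0.0,       1.0]])

# Direction cosines L_ij = ebar_i . e_j between the axes.
L = np.einsum('ik,jk->ij', ebar, e)

# A fixed vector x: its components in each basis are its projections
# onto that basis, and they are related by xbar_i = L_ij x_j.
x = np.array([1.0, 2.0, 3.0])   # components in the old basis
xbar = ebar @ x                 # projections onto the new axes

assert np.allclose(xbar, L @ x)
print("xbar_i = L_ij x_j verified")
```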
Tensors are defined as quantities which transform in a certain way under linear transformations of coordinates.
Let <math>\mathbf{a}=a_i\mathbf{e}_i</math> and <math>\mathbf{b}=b_i\mathbf{e}_i</math> be two vectors, so that they transform according to <math>\bar{a}_i=L_{ij}a_j</math> and <math>\bar{b}_i=L_{ij}b_j</math>.
Taking the tensor product gives:
<math>\mathbf{a}\otimes\mathbf{b}=a_i b_j\,\mathbf{e}_i\otimes\mathbf{e}_j</math>
then applying the transformation to the components
<math>\bar{a}_i\bar{b}_j=L_{ik}L_{jl}\,a_k b_l</math>
and to the bases
<math>\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j=L_{ik}L_{jl}\,\mathbf{e}_k\otimes\mathbf{e}_l</math>
gives the transformation law of an order-2 tensor. The tensor <math>\mathbf{a}\otimes\mathbf{b}</math> is invariant under this transformation:
<math>\bar{a}_i\bar{b}_j\,\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j=a_k b_l\,\mathbf{e}_k\otimes\mathbf{e}_l=\mathbf{a}\otimes\mathbf{b}</math>
More generally, for any order-2 tensor
<math>\mathbf{R}=R_{ij}\,\mathbf{e}_i\otimes\mathbf{e}_j</math>
the components transform according to
<math>\bar{R}_{ij}=L_{ik}L_{jl}R_{kl}</math>
and the basis transforms by:
<math>\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j=L_{ik}L_{jl}\,\mathbf{e}_k\otimes\mathbf{e}_l</math>
If R does not transform according to this rule, then whatever quantity R may be, it is not an order-2 tensor.
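The order-2 law can be checked by comparing two routes: transform the vectors first and then form the dyadic, or transform the dyadic's components directly. A NumPy sketch with illustrative vectors and rotation angle:

```python
import numpy as np

t = 0.7
L = np.array([[ np.cos(t), np.sin(t), 0.0],
              [-np.sin(t), np.cos(t), 0.0],
              [ 0.0,       0.0,       1.0]])  # an orthogonal change of basis

# Two arbitrary vectors (illustrative values).
a = np.array([1.0, -2.0, 0.5])
b = np.array([0.3,  4.0, -1.0])

# Components of the dyadic R = a (x) b in the old basis.
R = np.outer(a, b)

# Transform the two vectors first and then form the dyadic ...
R_bar_vectors = np.outer(L @ a, L @ b)

# ... or transform the dyadic's components directly:
# Rbar_ij = L_ik L_jl R_kl.
R_bar_direct = np.einsum('ik,jl,kl->ij', L, L, R)

assert np.allclose(R_bar_vectors, R_bar_direct)
print("order-2 transformation law verified")
```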
Now suppose we have an additional vector <math>\mathbf{c}=c_i\mathbf{e}_i</math> which transforms according to <math>\bar{c}_i=L_{ij}c_j</math>. Taking the tensor product with the other two vectors a and b above gives:
<math>\mathbf{a}\otimes\mathbf{b}\otimes\mathbf{c}=a_i b_j c_k\,\mathbf{e}_i\otimes\mathbf{e}_j\otimes\mathbf{e}_k</math>
then applying the transformation to the components
<math>\bar{a}_i\bar{b}_j\bar{c}_k=L_{il}L_{jm}L_{kn}\,a_l b_m c_n</math>
and to the bases
<math>\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j\otimes\bar{\mathbf{e}}_k=L_{il}L_{jm}L_{kn}\,\mathbf{e}_l\otimes\mathbf{e}_m\otimes\mathbf{e}_n</math>
gives the transformation law of an order-3 tensor. The tensor a ⊗ b ⊗ c is invariant under this transformation:
<math>\bar{a}_i\bar{b}_j\bar{c}_k\,\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j\otimes\bar{\mathbf{e}}_k=a_l b_m c_n\,\mathbf{e}_l\otimes\mathbf{e}_m\otimes\mathbf{e}_n=\mathbf{a}\otimes\mathbf{b}\otimes\mathbf{c}</math>
For any order-3 tensor
<math>\mathbf{S}=S_{ijk}\,\mathbf{e}_i\otimes\mathbf{e}_j\otimes\mathbf{e}_k</math>
the components transform according to
<math>\bar{S}_{ijk}=L_{il}L_{jm}L_{kn}S_{lmn}</math>
and the basis transforms by:
<math>\bar{\mathbf{e}}_i\otimes\bar{\mathbf{e}}_j\otimes\bar{\mathbf{e}}_k=L_{il}L_{jm}L_{kn}\,\mathbf{e}_l\otimes\mathbf{e}_m\otimes\mathbf{e}_n</math>
If S does not transform according to this rule, then whatever quantity S may be, it is not an order-3 tensor.
More generally, for any order-p tensor
<math>\mathbf{T}=T_{i_1 i_2\cdots i_p}\,\mathbf{e}_{i_1}\otimes\mathbf{e}_{i_2}\otimes\cdots\otimes\mathbf{e}_{i_p}</math>
the components transform according to
<math>\bar{T}_{j_1 j_2\cdots j_p}=L_{j_1 i_1}L_{j_2 i_2}\cdots L_{j_p i_p}T_{i_1 i_2\cdots i_p}</math>
and the basis transforms by:
<math>\bar{\mathbf{e}}_{j_1}\otimes\bar{\mathbf{e}}_{j_2}\otimes\cdots\otimes\bar{\mathbf{e}}_{j_p}=L_{j_1 i_1}L_{j_2 i_2}\cdots L_{j_p i_p}\,\mathbf{e}_{i_1}\otimes\mathbf{e}_{i_2}\otimes\cdots\otimes\mathbf{e}_{i_p}</math>
If T does not transform according to this rule, then whatever quantity T may be, it is not an order-p tensor.
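The order-p law can be exercised for p = 3 with an Einstein-summation routine; a NumPy sketch with illustrative, randomly generated components:

```python
import numpy as np

t = 1.1
L = np.array([[ np.cos(t), 0.0, np.sin(t)],
              [ 0.0,       1.0, 0.0      ],
              [-np.sin(t), 0.0, np.cos(t)]])  # rotation about the y axis

# An arbitrary order-3 array of components (illustrative values).
rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3, 3))

# Order-3 instance of the order-p law:
# Sbar_{j1 j2 j3} = L_{j1 i1} L_{j2 i2} L_{j3 i3} S_{i1 i2 i3}.
S_bar = np.einsum('ai,bj,ck,ijk->abc', L, L, L, S)

# Transforming back with the inverse (the transpose, since L is
# orthogonal) recovers the original components.
S_back = np.einsum('ia,jb,kc,ijk->abc', L, L, L, S_bar)

assert np.allclose(S_back, S)
print("order-3 transformation law verified")
```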
Second order Cartesian tensors in 3d
Exterior product of Cartesian basis in 3d
For the exterior product (designated by the symbol ∧), the wedge of a basis vector with itself vanishes:
<math>\mathbf{e}_x\wedge\mathbf{e}_x=\mathbf{e}_y\wedge\mathbf{e}_y=\mathbf{e}_z\wedge\mathbf{e}_z=0</math>
while for distinct basis vectors it is antisymmetric:
<math>\mathbf{e}_x\wedge\mathbf{e}_y=-\,\mathbf{e}_y\wedge\mathbf{e}_x,\quad\mathbf{e}_y\wedge\mathbf{e}_z=-\,\mathbf{e}_z\wedge\mathbf{e}_y,\quad\mathbf{e}_z\wedge\mathbf{e}_x=-\,\mathbf{e}_x\wedge\mathbf{e}_z</math>
and in summary
<math>\mathbf{e}_i\wedge\mathbf{e}_j=-\,\mathbf{e}_j\wedge\mathbf{e}_i</math>
Tensor product of Cartesian basis in 3d
For tensors of order 1, a Cartesian vector a can be written algebraically as a linear combination of the basis vectors, or equivalently as a column vector of the coordinates with respect to the basis:
<math>\mathbf{a}=a_x\mathbf{e}_x+a_y\mathbf{e}_y+a_z\mathbf{e}_z=\begin{pmatrix}a_x\\ a_y\\ a_z\end{pmatrix}</math>
A dyadic tensor T is an order-2 tensor formed by the tensor product (designated by the symbol ⊗) of two Cartesian vectors a and b; again it can be written as a linear combination of the tensor basis <math>\mathbf{e}_x\otimes\mathbf{e}_x,\ \mathbf{e}_x\otimes\mathbf{e}_y,\ \ldots,\ \mathbf{e}_z\otimes\mathbf{e}_z</math>, or more systematically as a matrix:
<math>\mathbf{T}=\mathbf{a}\otimes\mathbf{b}=a_i b_j\,\mathbf{e}_i\otimes\mathbf{e}_j=\begin{pmatrix}a_x b_x & a_x b_y & a_x b_z\\ a_y b_x & a_y b_y & a_y b_z\\ a_z b_x & a_z b_y & a_z b_z\end{pmatrix}</math>
See matrix multiplication for the notational correspondence between matrices and the dot and tensor products.
Cartesian dyadic tensors of second order are reducible, which means they can be re-expressed in terms of the two vectors as follows:
<math>T_{ij}=a_i b_j=T_{ij}^{(0)}+T_{ij}^{(1)}+T_{ij}^{(2)}</math>
<math>T_{ij}^{(0)}=\frac{1}{3}a_k b_k\,\delta_{ij},\quad T_{ij}^{(1)}=\frac{1}{2}\left[a_i b_j+a_j b_i\right]-\frac{1}{3}a_k b_k\,\delta_{ij}</math>
<math>T_{ij}^{(2)}=\frac{1}{2}\left[a_i b_j-a_j b_i\right]=a_{[i}b_{j]}</math>
where δij is the Kronecker delta, the components of the identity matrix. These three terms are irreducible representations, which means they cannot be decomposed further and still be tensors satisfying the defining transformation laws, under which they must remain invariant. Each of the irreducible representations transforms like angular momentum according to its number of independent components.
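The decomposition can be verified numerically; a NumPy sketch with illustrative vectors, checking that the three pieces reconstruct the dyadic and have the claimed symmetries:

```python
import numpy as np

# Two arbitrary vectors (illustrative values) and their dyadic.
a = np.array([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.5, 4.0])
T = np.outer(a, b)  # T_ij = a_i b_j

delta = np.eye(3)

# Isotropic part: (1/3) a_k b_k delta_ij  (1 independent component).
T0 = (np.dot(a, b) / 3.0) * delta
# Symmetric traceless part (5 independent components).
T1 = 0.5 * (T + T.T) - T0
# Antisymmetric part a_[i b_j]  (3 independent components).
T2 = 0.5 * (T - T.T)

# The three irreducible pieces reconstruct the dyadic: 1 + 5 + 3 = 9.
assert np.allclose(T0 + T1 + T2, T)
assert np.isclose(np.trace(T1), 0.0)
assert np.allclose(T2, -T2.T)

print("dyadic decomposed into 1 + 5 + 3 independent components")
```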