11 SPECIAL CLASSES OF MATRICES

  • Square Matrices
    • Identity and diagonal matrices
    • Triangular matrices
    • Symmetric matrices
    • Orthogonal matrices
  • Dyads

11.1 Some special square matrices

Square matrices are matrices that have the same number of rows as columns. The following are important instances of square matrices.

Identity matrix

The n \times n identity matrix (often denoted I_{n}, or simply I if the context allows) has ones on its diagonal and zeros elsewhere. It is square, diagonal, and symmetric. This matrix satisfies AI_{n}=A for every matrix A with n columns, and I_{n}B = B for every matrix B with n rows.

Example 1: Identity matrix
The 3 \times 3 identity matrix, denoted I_3, is given by:

    \[I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.\]

This matrix has ones on its diagonal and zeros elsewhere. When multiplied by any 3 \times 3 matrix A, the product AI_3 remains A, and similarly, I_3B = B for any matrix B of size 3 \times 3.
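
As a quick numerical check, the sketch below (a NumPy example assumed purely for illustration; the sample matrix is arbitrary and not from the text) verifies both identities:

    import numpy as np

    I3 = np.eye(3)                      # the 3 x 3 identity matrix
    A = np.array([[4.0, 3.0, 2.0],
                  [1.0, 2.0, 5.0],
                  [2.0, 0.0, 2.0]])     # an arbitrary 3 x 3 matrix (assumed values)

    print(np.allclose(A @ I3, A))       # True: A I_3 = A
    print(np.allclose(I3 @ A, A))       # True: I_3 A = A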

Diagonal matrices

Diagonal matrices are square matrices A with A_{ij}=0 when i \neq j. A diagonal n \times n matrix A can be denoted as A = \diag(a), with a \in \mathbb{R}^{n} the vector containing the elements on the diagonal. We can also write

    \[A = \begin{pmatrix} a_{1} & & \\ & \ddots & \\ & & a_{n} \end{pmatrix},\]

where by convention the zeros outside the diagonal are not written.
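
As an illustration, the following NumPy sketch (assumed, not part of the text) builds a diagonal matrix from the vector of its diagonal entries:

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])        # diagonal entries (assumed values)
    A = np.diag(a)                       # 3 x 3 matrix with a on the diagonal, zeros elsewhere
    print(A)
    print(np.allclose(np.diag(A), a))    # True: applied to a matrix, np.diag extracts the diagonal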

Symmetric matrices

Symmetric matrices are square matrices that satisfy A_{ij}=A_{ji} for every pair (i, j). An entire section is devoted to symmetric matrices.

Example 2: A symmetric matrix
The matrix

    \[ A = \begin{pmatrix} 4 & 3/2 & 2 \\ 3/2 & 2 & 5/2 \\ 2 & 5/2 & 2 \end{pmatrix} \]

is symmetric. The matrix

    \[ B = \begin{pmatrix} 4 & 3/2 & 2 \\ 3/2 & 2 & \mathbf{5} \\ 2 & \mathbf{5/2} & 2 \end{pmatrix} \]

is not, since it is not equal to its transpose: the two entries in bold, B_{23}=5 and B_{32}=5/2, differ.
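
Symmetry is easy to test numerically by comparing a matrix with its transpose. The following sketch (a NumPy example assumed for illustration) checks the two matrices of Example 2:

    import numpy as np

    A = np.array([[4.0, 1.5, 2.0],
                  [1.5, 2.0, 2.5],
                  [2.0, 2.5, 2.0]])    # the symmetric matrix of Example 2
    B = np.array([[4.0, 1.5, 2.0],
                  [1.5, 2.0, 5.0],
                  [2.0, 2.5, 2.0]])    # the non-symmetric variant

    print(np.allclose(A, A.T))         # True: A equals its transpose
    print(np.allclose(B, B.T))         # False: B_23 = 5 differs from B_32 = 5/2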

Triangular matrices

A square matrix A is upper triangular if A_{ij} = 0 when i > j. Here are a few examples:

    \[A_{1} = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 2 & 3 \\ 0 & 0 & 4 \end{pmatrix}, \quad A_{2} = \begin{pmatrix} 3 & 8 & 3 \\ 0 & 6 & -1 \\ 0 & 0 & 5 \end{pmatrix}, \quad A_{3} = \begin{pmatrix} 0 & 8 & 3 \\ 0 & 0 & -1 \\ 0 & 0 & 1 \end{pmatrix}.\]

A matrix is lower triangular if its transpose is upper triangular. For example:

    \[A = \begin{pmatrix} 1 & 0 & 0 \\ 8 & -9 & 0 \\ 1 & 2 & 3 \end{pmatrix}.\]
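
These patterns can be checked with a short sketch (NumPy assumed for illustration), using np.triu and np.tril, which keep only the entries on or above, respectively on or below, the diagonal:

    import numpy as np

    A1 = np.array([[1, -1, 2],
                   [0,  2, 3],
                   [0,  0, 4]])               # the upper triangular matrix A_1 above

    print(np.allclose(A1, np.triu(A1)))       # True: nothing below the diagonal
    print(np.allclose(A1.T, np.tril(A1.T)))   # True: the transpose is lower triangular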

Orthogonal matrices

Orthogonal (or unitary) matrices are square matrices whose columns form an orthonormal basis. If U = [u_{1},\dots,u_{n}] is an orthogonal matrix, then

    \[u_{i}^{T}u_{j} = \begin{cases} 1 & \text{if $i=j$,} \\ 0 & \text{otherwise.}\end{cases}\]

Thus, U^{T}U = I_{n}. Similarly, UU^{T} = I_{n}.
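
Both identities can be verified numerically. The sketch below (an illustrative assumption, not part of the text) builds a random orthogonal matrix from the Q factor of a QR factorization, whose columns are orthonormal:

    import numpy as np

    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # U has orthonormal columns

    print(np.allclose(U.T @ U, np.eye(4)))             # True: U^T U = I_n
    print(np.allclose(U @ U.T, np.eye(4)))             # True: U U^T = I_n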

Orthogonal matrices correspond to rotations or reflections across a direction: they preserve length and angles. Indeed, for every vector x,

    \[||Ux||_{2}^{2} = (Ux)^{T}(Ux) = x^{T}U^{T}Ux=x^{T}x=||x||_{2}^{2}.\]

Thus, the underlying linear map x \rightarrow Ux preserves the length (measured in Euclidean norm). This is sometimes referred to as the rotational invariance of the Euclidean norm.

In addition, angles are preserved: if x, y are two vectors with unit norm, then the angle \theta between them satisfies \cos \theta = x^{T}y, while the angle \theta' between the rotated vectors x' = Ux, y' = Uy satisfies \cos \theta'=(x')^{T}y'. Since

    \[(Ux)^{T}(Uy)=x^{T}U^{T}Uy=x^{T}y,\]

we obtain that the angles are the same. (The converse is true: any square matrix that preserves lengths and angles is orthogonal.)
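
Numerically, the preservation of lengths and inner products (hence of angles) can be checked with the same kind of sketch as above (again an illustrative assumption):

    import numpy as np

    rng = np.random.default_rng(1)
    U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # a random orthogonal matrix

    x = rng.standard_normal(3)
    y = rng.standard_normal(3)

    print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # True: lengths preserved
    print(np.isclose((U @ x) @ (U @ y), x @ y))                   # True: inner products preserved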

Geometrically, orthogonal matrices correspond to rotations (around a point) or reflections (around a line passing through the origin).

Example 3: A 2 \times 2 orthogonal matrix
The matrix

    \[U = \dfrac{1}{\sqrt{2}}\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\]

is orthogonal.
The vector x=(2, 1) is transformed by the orthogonal matrix above into

    \[Ux = \dfrac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 3 \end{pmatrix}.\]

Thus, U corresponds to a rotation of angle 45 degrees counter-clockwise.
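
The computation can be reproduced with a short sketch (NumPy assumed for illustration):

    import numpy as np

    U = np.array([[1.0, -1.0],
                  [1.0,  1.0]]) / np.sqrt(2)                    # the rotation above
    x = np.array([2.0, 1.0])

    print(U @ x)                                                 # [0.707..., 2.121...], i.e. (1, 3)/sqrt(2)
    print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # True: the length of x is preserved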

See also: Permutation matrices

11.2 Dyads

Dyads are a special class of matrices, also called rank-one matrices, for reasons seen later.

Definition

A matrix A \in \mathbb{R}^{m \times n} is a dyad if it is of the form A = uv^{T} for some vectors u \in \mathbb{R}^{m}, v \in \mathbb{R}^{n}. The dyad acts on an input vector x \in \mathbb{R}^{n} as follows:

    \[Ax=(uv^{T})x=(v^{T}x)u.\]

In terms of the associated linear map, the output of a dyad always points in the same direction u in the output space (\mathbb{R}^{m}), no matter what the input x is. The output is thus always a scaled version of u; the amount of scaling depends on the input via the linear function x \rightarrow v^{T}x.
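
A short sketch (NumPy and the particular vectors are assumptions for illustration) shows a dyad built as an outer product, its action on an input, and its rank:

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])                 # u in R^3 (assumed values)
    v = np.array([4.0, 5.0])                      # v in R^2
    A = np.outer(u, v)                            # the dyad A = u v^T, a 3 x 2 matrix

    x = np.array([1.0, -1.0])                     # an input in R^2
    print(np.allclose(A @ x, (v @ x) * u))        # True: A x = (v^T x) u
    print(np.linalg.matrix_rank(A))               # 1: a dyad is a rank-one matrix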

See also: Single-factor models of financial data.

Normalized dyads

We can always normalize a dyad by rescaling u and v to have unit (Euclidean) norm and using a separate factor to capture their scale. That is, any dyad can be written in normalized form:

    \[A = uv^{T} = (||u||_{2}\cdot||v||_{2})\cdot \left(\dfrac{u}{||u||_{2}}\right)\left(\dfrac{v}{||v||_{2}}\right)^{T} = \sigma\tilde{u}\tilde{v}^{T},\]

where \sigma = ||u||_{2}\,||v||_{2}>0 (assuming u and v are nonzero), and ||\tilde{u}||_{2}=||\tilde{v}||_{2}=1.
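
Continuing the sketch above (same assumed vectors), the normalized form can be checked directly:

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0])

    sigma = np.linalg.norm(u) * np.linalg.norm(v)   # the positive scale factor
    u_tilde = u / np.linalg.norm(u)                 # unit-norm direction of u
    v_tilde = v / np.linalg.norm(v)                 # unit-norm direction of v

    print(np.allclose(np.outer(u, v), sigma * np.outer(u_tilde, v_tilde)))   # True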
