Singular value decomposition (SVD) theorem
Theorem: Singular Value Decomposition (SVD)
An arbitrary matrix $A \in \mathbb{R}^{m \times n}$ admits a decomposition of the form

$$A = U \tilde{S} V^\top, \quad \tilde{S} := \begin{pmatrix} S & 0 \\ 0 & 0 \end{pmatrix},$$

where $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ are both orthogonal matrices, and the matrix $S$ is diagonal:

$$S = \mathbf{diag}(\sigma_1, \ldots, \sigma_r),$$

where the positive numbers $\sigma_1 \ge \cdots \ge \sigma_r > 0$ are unique, and are called the singular values of $A$. The number $r \le \min(m, n)$ is equal to the rank of $A$, and the triplet $(U, \tilde{S}, V)$ is called a singular value decomposition (SVD) of $A$. The first $r$ columns of $U$: $u_i$, $i = 1, \ldots, r$ (resp. of $V$: $v_i$, $i = 1, \ldots, r$) are called left (resp. right) singular vectors of $A$, and satisfy

$$A v_i = \sigma_i u_i, \quad A^\top u_i = \sigma_i v_i, \quad i = 1, \ldots, r.$$
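As a numerical illustration (not part of the original statement), the following sketch uses NumPy's `np.linalg.svd` to check the factorization and the singular-vector relations on a small example; the matrix `A` and its dimensions are arbitrary choices.

```python
import numpy as np

# Arbitrary example matrix; any real m x n matrix works.
rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, n))

# np.linalg.svd returns U (m x m), the singular values s, and V^T (n x n).
U, s, Vt = np.linalg.svd(A)
V = Vt.T
r = np.linalg.matrix_rank(A)

# Rebuild the m x n matrix S_tilde with the singular values on its diagonal,
# and check the factorization A = U S_tilde V^T.
S_tilde = np.zeros((m, n))
np.fill_diagonal(S_tilde, s)
assert np.allclose(A, U @ S_tilde @ Vt)

# Check the singular-vector relations A v_i = sigma_i u_i, A^T u_i = sigma_i v_i.
for i in range(r):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])
    assert np.allclose(A.T @ U[:, i], s[i] * V[:, i])
```

Note that NumPy returns the singular values already sorted in decreasing order, matching the convention of the theorem.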
Proof: The matrix $A^\top A$ is real and symmetric. According to the spectral theorem, it admits an eigenvalue decomposition in the form $A^\top A = V \Lambda V^\top$, with $V = [v_1, \ldots, v_n]$ an $n \times n$ matrix whose columns form an orthonormal basis (that is, $V^\top V = I_n$), and $\Lambda = \mathbf{diag}(\lambda_1, \ldots, \lambda_n)$, where $\lambda_1 \ge \cdots \ge \lambda_r > \lambda_{r+1} = \cdots = \lambda_n = 0$. Here, $r$ is the rank of $A^\top A$ (if $r = n$ then there are no trailing zeros in $\Lambda$). Since $A^\top A$ is positive semi-definite, the $\lambda_i$'s are non-negative, and we can define the non-zero quantities $\sigma_i := \sqrt{\lambda_i}$, $i = 1, \ldots, r$.
Note that $A v_i = 0$ when $i > r$, since then $\|A v_i\|_2^2 = v_i^\top A^\top A v_i = \lambda_i = 0$.
Let us construct an orthogonal matrix $U$ as follows. We set

$$u_i = \frac{1}{\sigma_i} A v_i, \quad i = 1, \ldots, r.$$

These $m$-vectors are unit-norm, and mutually orthogonal, since the $v_i$'s are eigenvectors of $A^\top A$: for $1 \le i, j \le r$,

$$u_i^\top u_j = \frac{1}{\sigma_i \sigma_j} v_i^\top A^\top A v_j = \frac{\lambda_j}{\sigma_i \sigma_j} v_i^\top v_j = \delta_{ij}.$$

Using (say) the Gram-Schmidt orthogonalization procedure, we can complete (if necessary, that is, in the case $r < m$) this set of vectors by $u_{r+1}, \ldots, u_m$ in order to form an orthogonal matrix $U := [u_1, \ldots, u_m]$.
Let us check that $U, \tilde{S}, V$ satisfy the conditions of the theorem, by showing that

$$U^\top A V = \tilde{S}.$$
We have

$$(U^\top A V)_{ij} = u_i^\top A v_j = \begin{cases} \sigma_j u_i^\top u_j = \sigma_j \delta_{ij} & \text{if } j \le r, \\ 0 & \text{if } j > r, \end{cases}$$

where the second line stems from the fact that $A v_j = 0$ when $j > r$. Thus, $U^\top A V = \tilde{S}$, and hence $A = U \tilde{S} V^\top$, as claimed.
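The proof's construction can also be mirrored numerically. The sketch below is an illustration under an added assumption: the example matrix is taken to have full column rank, so that $r = n \le m$ and only the completion step $u_{r+1}, \ldots, u_m$ is needed. It builds $V$ and $\Lambda$ from the eigendecomposition of $A^\top A$, forms $u_i = A v_i / \sigma_i$, completes the orthonormal set (QR playing the role of Gram-Schmidt), and checks that $U^\top A V = \tilde{S}$.

```python
import numpy as np

# Arbitrary example matrix, assumed to have full column rank (r = n <= m).
rng = np.random.default_rng(1)
m, n = 4, 3
A = rng.standard_normal((m, n))

# Spectral decomposition of A^T A; eigh returns eigenvalues in ascending
# order, so reorder them to be decreasing, as in the proof.
lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
sigma = np.sqrt(lam)                 # sigma_i = sqrt(lambda_i)

# Left singular vectors u_i = (1/sigma_i) A v_i, i = 1, ..., r.
U_r = (A @ V) / sigma                # broadcasting divides column i by sigma[i]

# Complete to an orthogonal m x m matrix: project random vectors onto the
# orthogonal complement of range(U_r), then orthonormalize via QR.
X = rng.standard_normal((m, m - n))
X -= U_r @ (U_r.T @ X)
Q_extra, _ = np.linalg.qr(X)
U = np.hstack([U_r, Q_extra])

# Check U^T A V = S_tilde, the m x n matrix with sigma on its diagonal.
S_tilde = np.zeros((m, n))
np.fill_diagonal(S_tilde, sigma)
assert np.allclose(U.T @ U, np.eye(m))     # U is orthogonal
assert np.allclose(U.T @ A @ V, S_tilde)   # hence A = U S_tilde V^T
```

Handling the rank-deficient case $r < n$ would additionally require discarding the zero eigenvalues before dividing by the $\sigma_i$'s, exactly as the proof restricts the construction to $i = 1, \ldots, r$.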