40 EXERCISES
40.1. SVD of simple matrices
1. Consider the matrix $A$.
a. Find the range, nullspace, and rank of $A$.
b. Find an SVD of $A$.
c. Determine the set of solutions to the linear equation $Ax = y$, with $y$ given.
2. Consider the matrix $A$.
a. What is an SVD of $A$? Express it as $A = U \tilde{\Sigma} V^\top$, with $\tilde{\Sigma}$ the diagonal matrix of singular values ordered in decreasing fashion. Make sure to check all the properties required for $U$, $\tilde{\Sigma}$, and $V$.
b. Find the semi-axis lengths and principal axes (minimum and maximum distance, and associated directions, from a point of the set to its center) of the ellipsoid
$$\mathcal{E} = \{ Ax \;:\; \|x\|_2 \le 1 \}.$$
Hint: use the SVD of $A$ to show that every element of $\mathcal{E}$ is of the form $U\tilde{y}$ for some element $\tilde{y}$ of the set $\tilde{\mathcal{E}} = \{ \tilde{\Sigma} z \;:\; \|z\|_2 \le 1 \}$. That is, $\mathcal{E} = U \tilde{\mathcal{E}}$. (In other words, the matrix $U$ maps $\tilde{\mathcal{E}}$ into the set $\mathcal{E}$.) Then analyze the geometry of the simpler set $\tilde{\mathcal{E}}$. A numerical sketch appears at the end of this section.
c. What is the set $\mathcal{E}$ when we append a zero vector after the last column of $A$, that is, when $A$ is replaced with $\tilde{A} = (A, 0)$?
d. Same question when we append a row of zeros after the last row of $A$, that is, when $A$ is replaced with $\hat{A} = \begin{pmatrix} A \\ 0 \end{pmatrix}$. Interpret your result geometrically.
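The geometry in part 2.b can be checked numerically. Below is a minimal sketch using NumPy; the small matrix is a placeholder standing in for the one given in the exercise. It illustrates that the singular values of $A$ give the semi-axis lengths of $\mathcal{E} = \{Ax : \|x\|_2 \le 1\}$ and the left singular vectors give its principal axes.

```python
import numpy as np

# Placeholder matrix (not the matrix from the exercise; any small A works here).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Full SVD: A = U @ np.diag(s) @ Vt, singular values sorted in decreasing order.
U, s, Vt = np.linalg.svd(A)

# For the ellipsoid E = {A x : ||x||_2 <= 1}:
#  - semi-axis lengths are the singular values s[i]
#  - principal axes are the corresponding left singular vectors U[:, i]
print("semi-axis lengths:", s)
print("principal axes (columns):\n", U)

# Sanity check: points A x with ||x||_2 = 1 never lie farther from the center
# than the largest semi-axis.
x = np.random.randn(2, 1000)
x /= np.linalg.norm(x, axis=0)
assert np.all(np.linalg.norm(A @ x, axis=0) <= s[0] + 1e-12)
```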
40.2. Rank and SVD
The image on the left shows a matrix $A$ of pixel values. The lines indicate entries equal to one; at each intersection of two lines, the corresponding matrix element equals two. All the other elements are zero.
1. Show that, for some permutation matrices $P$ and $Q$, the permuted matrix $B = PAQ$ has the symmetric form $B = pq^\top + qp^\top$, for two vectors $p$, $q$. Determine $p$ and $q$.
2. What is the rank of $A$? Hint: find the nullspace of $A$.
3. Find an SVD of $A$. Hint: find an eigenvalue decomposition of the symmetric matrix $B$, using the results of an earlier exercise on eigenvalue decompositions. (See the sketch below.)
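A numerical sketch of this exercise, under an assumed pattern for the figure (which is not reproduced here): a single horizontal and a single vertical line of ones, with the value two at their crossing. Swapping two rows then makes the matrix symmetric of the form $pq^\top + qp^\top$, and its rank is two.

```python
import numpy as np

n = 8
i0, j0 = 2, 5          # positions of the horizontal and vertical lines (arbitrary choice)

# Assumed pattern: one horizontal and one vertical line of ones, value 2 at the crossing.
A = np.zeros((n, n))
A[i0, :] += 1.0
A[:, j0] += 1.0        # the intersection (i0, j0) now equals 2

# Rank: only two singular values are (numerically) nonzero.
s = np.linalg.svd(A, compute_uv=False)
print("singular values:", np.round(s, 6))
print("numerical rank:", np.linalg.matrix_rank(A))

# Permuting rows so that row i0 moves to row j0 gives a symmetric matrix
# of the form p q^T + q p^T with p = e_{j0} and q = 1 (the all-ones vector).
P = np.eye(n)
P[[i0, j0]] = P[[j0, i0]]
B = P @ A
print("symmetric after permutation:", np.allclose(B, B.T))

# Its eigenvalue decomposition gives an SVD of A up to signs and permutations.
w, V = np.linalg.eigh(B)
print("nonzero eigenvalues:", np.round(w[np.abs(w) > 1e-9], 6))
```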
40.3. Procrustes problem
The Orthogonal Procrustes problem is a problem of the form
$$\min_X \; \|AX - B\|_F \;:\; X^\top X = I,$$
where $\|\cdot\|_F$ denotes the Frobenius norm, and the matrices $A$, $B$ are given. Here, the matrix variable $X$ is constrained to have orthonormal columns. When $X$ is square, the problem can be interpreted geometrically as seeking a transformation of points (contained in $A$) to other points (contained in $B$) that involves only rotation.
1. Show that the solution to the Procrustes problem above can be found via the SVD of the matrix $A^\top B$. (A numerical sketch appears at the end of this section.)
2. Derive a formula for the answer to the constrained least-squares problem
$$\min_x \; \|Ax - b\|_2 \;:\; \|x\|_2 = 1,$$
with $A$, $b$ given.
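The following sketch illustrates part 1 for the square (rotation) case, using randomly generated placeholder data: if $A^\top B = U\Sigma V^\top$ is an SVD, then $X = UV^\top$ is an optimal orthogonal matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: rows of A are points; B holds the same points after a rotation plus noise.
m, n = 50, 3
A = rng.standard_normal((m, n))
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthogonal "ground truth"
B = A @ Q + 0.01 * rng.standard_normal((m, n))

# Procrustes solution for the square case: SVD of A^T B, then X = U V^T.
U, s, Vt = np.linalg.svd(A.T @ B)
X = U @ Vt

print("X orthogonal:", np.allclose(X.T @ X, np.eye(n)))
print("objective ||AX - B||_F =", np.linalg.norm(A @ X - B, "fro"))

# Compare against a few random orthogonal matrices: none should do better.
for _ in range(5):
    R, _ = np.linalg.qr(rng.standard_normal((n, n)))
    assert np.linalg.norm(A @ R - B, "fro") >= np.linalg.norm(A @ X - B, "fro") - 1e-9
```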
40.4. SVD and projections
1. We consider a set of data points $x_i \in \mathbb{R}^n$, $i = 1, \ldots, m$. We seek to find a line in $\mathbb{R}^n$ such that the sum of the squares of the distances from the points to the line is minimized. To simplify, we assume that the line goes through the origin.
a. Consider a line through the origin, $\mathcal{L} = \{ tu \;:\; t \in \mathbb{R} \}$, where $u \in \mathbb{R}^n$ is given. (You can assume without loss of generality that $\|u\|_2 = 1$.) Find an expression for the projection of a given point $x$ on $\mathcal{L}$.
b. Now consider the $m$ points $x_1, \ldots, x_m$ and find an expression for the sum of the squares of the distances from the points to the line $\mathcal{L}$.
c. Explain how you would find the line via the SVD of the matrix of data points $A = [x_1, \ldots, x_m]$. (See the numerical sketch at the end of this section.)
d. How would you address the problem without the restriction that the line has to pass through the origin?
2. Solve the same problems as above, replacing the line with a hyperplane.
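A numerical sketch of exercise 1, with synthetic placeholder points: the best line through the origin is spanned by the left singular vector of the data matrix associated with the largest singular value, and centering the data first handles the case of a general (affine) line.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder data: m points in R^n, stored as the columns of A,
# roughly spread along one direction plus noise.
m, n = 200, 3
t = rng.standard_normal(m)
d = np.array([1.0, 2.0, -1.0])
d /= np.linalg.norm(d)
A = np.outer(d, t) + 0.1 * rng.standard_normal((n, m))

# Best line through the origin: its direction u is the left singular vector
# of A associated with the largest singular value.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
u = U[:, 0]

# Sum of squared distances from the points to the line span(u):
# sum_i ( ||x_i||^2 - (u^T x_i)^2 ).
dist2 = np.linalg.norm(A, "fro") ** 2 - np.linalg.norm(u @ A) ** 2
print("estimated direction:", np.round(u, 3))
print("sum of squared distances:", dist2)

# Without the through-the-origin restriction: center the points first,
# then apply the same recipe to the centered data matrix.
Ac = A - A.mean(axis=1, keepdims=True)
u_affine = np.linalg.svd(Ac, full_matrices=False)[0][:, 0]
print("direction of best affine line:", np.round(u_affine, 3))
```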
40.5. SVD and least-squares
1. Consider a matrix $A$ built from a vector $a$ chosen randomly in a three-dimensional cube (each coordinate ranging over a fixed interval). In addition, we define a vector $y$, with $y$ again chosen randomly in the same cube. We consider the associated least-squares problem
$$\min_x \; \|Ax - y\|_2 .$$
a. What is the rank of $A$?
b. Apply the least-squares formula $x_{\rm LS} = (A^\top A)^{-1} A^\top y$. What is the norm of the residual vector, $Ax_{\rm LS} - y$?
c. Express the least-squares solution in terms of the SVD of $A$. That is, form the pseudo-inverse $A^\dagger$ of $A$ and apply the formula $x_{\rm LS} = A^\dagger y$. What is now the norm of the residual? (See the sketch at the end of this section.)
d. Interpret your results.
2. Consider a least-squares problem
$$\min_x \; \|Ax - y\|_2 ,$$
where the data matrix $A$ has rank one.
a. Is the solution unique?
b. Show how to reduce the problem to one involving one scalar variable.
c. Express the set of solutions in closed form.
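Both parts of this section can be explored numerically. The sketch below uses a placeholder rank-deficient matrix (the exercise's specific random construction is not reproduced here): the normal-equations formula breaks down because $A^\top A$ is singular, while the pseudo-inverse still returns a minimizer. The last lines illustrate how a rank-one data matrix reduces the problem to a single scalar variable.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder rank-deficient matrix (not the exercise's exact construction):
# the third column is the sum of the first two, so rank(A) = 2.
a1, a2 = rng.random(3), rng.random(3)
A = np.column_stack([a1, a2, a1 + a2])
y = rng.random(3)
print("rank of A:", np.linalg.matrix_rank(A))

# Naive formula x = (A^T A)^{-1} A^T y: A^T A is singular here, so the inverse
# does not exist and the formula cannot be applied reliably.
print("condition number of A^T A:", np.linalg.cond(A.T @ A))

# Pseudo-inverse route x = A^+ y: well defined for any A; among all minimizers
# of ||Ax - y||_2 it returns the one of minimum norm.
x_pinv = np.linalg.pinv(A) @ y
print("residual with pseudo-inverse:", np.linalg.norm(A @ x_pinv - y))

# Rank-one data matrix A1 = u v^T: since ||A1 x - y1||_2 = ||u (v^T x) - y1||_2,
# the objective depends on x only through the scalar t = v^T x.
u, v = rng.random(4), rng.random(3)
A1, y1 = np.outer(u, v), rng.random(4)
t_opt = (u @ y1) / (u @ u)          # minimizer of ||u t - y1||_2 over the scalar t
x1 = np.linalg.pinv(A1) @ y1        # one particular (minimum-norm) solution
print("v^T x1 vs optimal t:", v @ x1, t_opt)
```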