
Examples of nice eigenvalue decompositions or eigendecompositions

Example 1

Consider the matrix:

$A = \begin{pmatrix} 1.56 & -0.42 \\ -1.92 & 2.44 \end{pmatrix}$.

The determinant of $\lambda I_2 - A$ is the determinant of

$\begin{pmatrix} \lambda - 1.56 & 0.42 \\ 1.92 & \lambda - 2.44 \end{pmatrix}$

which is $(\lambda - 1.56)(\lambda - 2.44) - 0.42 \cdot 1.92 = \lambda^2 - 4\lambda + 3.8064 - 0.8064 = \lambda^2 - 4\lambda + 3$. This factors as $(\lambda - 1)(\lambda - 3)$, which has roots $1$ and $3$, and thus these are the two eigenvalues.
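If you have NumPy handy, a quick numerical sanity check is easy. The sketch below (NumPy is not part of these notes, just a convenience) confirms both the roots of the characteristic polynomial and the eigenvalues of $A$:

```python
# A sketch of a numerical check for Example 1 (not part of the construction).
import numpy as np

A = np.array([[ 1.56, -0.42],
              [-1.92,  2.44]])

# Roots of the characteristic polynomial lambda^2 - 4*lambda + 3.
print(np.roots([1.0, -4.0, 3.0]))   # expected: 3 and 1

# Eigenvalues computed directly from A (the order may differ).
print(np.linalg.eigvals(A))         # expected: approximately 1 and 3
```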

To find the first eigenvector when $\lambda_1 = 1$, we solve

$\begin{pmatrix} 1 - 1.56 & 0.42 & 0 \\ 1.92 & 1 - 2.44 & 0 \end{pmatrix}$,

or

$\begin{pmatrix} -0.56 & 0.42 & 0 \\ 1.92 & -1.44 & 0 \end{pmatrix}$,

and we add $\frac{24}{7}$ times Row 1 onto Row 2, but this isn't really necessary, as we know that the null space of $I_2 - A$ has dimension 1.

Thus, $\alpha_2$ is a free variable and thus $\alpha_1 = 0.75 \alpha_2$ ($42 = 3 \times 14$ and $56 = 4 \times 14$), so an eigenvector is $\textbf{u}_1 = \begin{pmatrix} 0.75 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_1\|_2 = 1.25$ (note the Pythagorean triple $3, 4, 5$), so a normalized eigenvector is $\hat{\textbf{u}}_1 = \begin{pmatrix} 0.6 \\ 0.8 \end{pmatrix}$.
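If you would rather let the computer produce the eigenvector, the null space of $I_2 - A$ can be read off the singular value decomposition. A NumPy sketch (the computed vector may come back with the opposite sign):

```python
# A sketch: the eigenvector for lambda = 1 spans the null space of (I - A).
import numpy as np

A = np.array([[ 1.56, -0.42],
              [-1.92,  2.44]])
M = np.eye(2) - A                    # this is (1*I_2 - A)

# The right-singular vector for the (near-)zero singular value spans null(M).
_, s, Vt = np.linalg.svd(M)
print(s)                             # the last singular value should be ~0
print(Vt[-1])                        # expected: +/- (0.6, 0.8)
```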

To find the second eigenvector when $\lambda_2 = 3$, we solve

$\begin{pmatrix} 3 - 1.56 & 0.42 & 0 \\ 1.92 & 3 - 2.44 & 0 \end{pmatrix}$,

or

$\begin{pmatrix} 1.44 & 0.42 & 0 \\ 1.92 & 0.56 & 0 \end{pmatrix}$,

and we add $-\frac{4}{3}$ times Row 1 onto Row 2, but this isn't really necessary, as we know that the null space of $3I_2 - A$ also has dimension 1.

Thus, $\alpha_2$ is a free variable and thus $\alpha_1 = -0.2916666 \alpha_2$, so an eigenvector is $\textbf{u}_2 = \begin{pmatrix} -0.2916666 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_2\|_2 = 1.0416666$ (it's a little more difficult to see, but this is the Pythagorean triple $7, 24, 25$), so a normalized eigenvector is $\hat{\textbf{u}}_2 = \begin{pmatrix} -0.28 \\ 0.96 \end{pmatrix}$.

Note that a calculator will keep all digits; we show only the first four repeating digits.

Thus, the matrix $U$ is $\begin{pmatrix} 0.6 & -0.28 \\ 0.8 & 0.96 \end{pmatrix}$.
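A one-line check that the columns of $U$ really are eigenvectors is that $AU = U\,\textrm{diag}(1, 3)$; in NumPy (again, only a sketch):

```python
# A sketch checking A U = U diag(1, 3) for Example 1.
import numpy as np

A = np.array([[ 1.56, -0.42],
              [-1.92,  2.44]])
U = np.array([[0.6, -0.28],
              [0.8,  0.96]])
D = np.diag([1.0, 3.0])

print(np.allclose(A @ U, U @ D))     # expected: True
```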

To find the inverse of $U$, we continue as follows:

$\begin{pmatrix} 0.6 & -0.28 & 1 & 0 \\ 0.8 & 0.96 & 0 & 1 \end{pmatrix}$.

First, add $-\frac{4}{3}$ times Row 1 onto Row 2:

$\sim \begin{pmatrix} 0.6 & -0.28 & 1 & 0 \\ 0 & 1.3333 & -1.3333 & 1 \end{pmatrix}$.

Next, multiply Row 1 by $\frac{5}{3}$ and Row 2 by $\frac{3}{4}$:

$\sim \begin{pmatrix} 1 & -0.46666 & 1.6666 & 0 \\ 0 & 1 & -1 & 0.75 \end{pmatrix}$.

Finally, add $0.46666$ times Row 2 onto Row 1 to get

$\sim \begin{pmatrix} 1 & 0 & 1.2 & 0.35 \\ 0 & 1 & -1 & 0.75 \end{pmatrix}$.

Thus, $U^{-1} = \begin{pmatrix} 1.2 & 0.35 \\ -1 & 0.75 \end{pmatrix}$.
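The Gauss-Jordan result is easy to verify numerically: $UU^{-1}$ should equal $I_2$, and $U\,\textrm{diag}(1, 3)\,U^{-1}$ should reproduce $A$. A NumPy sketch:

```python
# A sketch verifying the inverse and the decomposition for Example 1.
import numpy as np

A = np.array([[ 1.56, -0.42],
              [-1.92,  2.44]])
U = np.array([[0.6, -0.28],
              [0.8,  0.96]])
U_inv = np.array([[ 1.2, 0.35],
                  [-1.0, 0.75]])
D = np.diag([1.0, 3.0])

print(np.allclose(U @ U_inv, np.eye(2)))   # expected: True
print(np.allclose(U @ D @ U_inv, A))       # expected: True
```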

You will note that in the above documentation, the matrix $U$ has its second column multiplied by $-1$ when compared to this matrix. That's fine, as a normalized eigenvector can be multiplied by $-1$ with no impact. Just make sure your teaching assistants are aware of this...

Note: if you want to have them calculate $A^{99}$, this author would suggest using the matrix with eigenvalues $1$ and $-1$, since in that case, not surprisingly, $A^2 = I_2$ and thus $A^{99} = A$.

If you are looking to calculate $A^8$, you could use a matrix with eigenvalues of $1$ and $2$, for example.
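The reason nice eigenvalues make powers painless is that $A^k = U D^k U^{-1}$. The sketch below builds a hypothetical matrix with eigenvalues $1$ and $2$ (it reuses the $U$ from Example 1 purely for illustration; this particular matrix does not appear in the notes) and compares the two ways of computing $A^8$:

```python
# A sketch: powers via the eigendecomposition, A^k = U D^k U^{-1}.
import numpy as np

U = np.array([[0.6, -0.28],
              [0.8,  0.96]])
U_inv = np.linalg.inv(U)
D = np.diag([1.0, 2.0])              # assumed eigenvalues 1 and 2 (illustrative)
A = U @ D @ U_inv                    # a matrix with those eigenvalues

A8_direct = np.linalg.matrix_power(A, 8)
A8_eigen  = U @ np.diag([1.0**8, 2.0**8]) @ U_inv   # D^8 = diag(1, 256)
print(np.allclose(A8_direct, A8_eigen))             # expected: True
```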

Example 2

In this case, I took a matrix $U$ and swapped Columns 2 and 3, and then multiplied this by the diagonal matrix with entries $2$, $-2$ and $-1$.

Consider the matrix:

$A = \begin{pmatrix} 1.16 & 0.63 & -1.8 \\ 2.88 & -0.16 & -2.4 \\ 0 & 0 & -2 \end{pmatrix}$.

The determinant of $\lambda I_3 - A$ is the determinant of

$\begin{pmatrix} \lambda - 1.16 & -0.63 & 1.8 \\ -2.88 & \lambda + 0.16 & 2.4 \\ 0 & 0& \lambda + 2 \end{pmatrix}$

which is $\lambda + 2$ times the determinant of $\begin{pmatrix} \lambda - 1.16 & -0.63 \\ -2.88 & \lambda + 0.16 \end{pmatrix}$, that is, $(\lambda + 2)((\lambda - 1.16)(\lambda + 0.16) - 0.63 \cdot 2.88) = (\lambda + 2)(\lambda^2 - \lambda - 0.1856 - 1.8144) = (\lambda + 2)(\lambda^2 - \lambda - 2)$. This factors as $(\lambda + 2)(\lambda - 2)(\lambda + 1)$, which has roots $-2$, $2$ and $-1$, and thus these are the three eigenvalues. Note the order is different from that of the diagonal matrix we generated this $A$ from.
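Again, a short NumPy sketch confirms the characteristic polynomial (coefficients $1, 1, -4, -4$, i.e. $(\lambda + 2)(\lambda^2 - \lambda - 2)$ expanded) and the eigenvalues:

```python
# A sketch checking the characteristic polynomial and eigenvalues for Example 2.
import numpy as np

A = np.array([[1.16,  0.63, -1.8],
              [2.88, -0.16, -2.4],
              [0.0,   0.0,  -2.0]])

print(np.poly(A))                    # expected: approximately [1, 1, -4, -4]
print(np.linalg.eigvals(A))          # expected: -2, 2 and -1 in some order
```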

To find the first eigenvector when $\lambda_1 = -2$, we solve

$\begin{pmatrix} -2 - 1.16 & -0.63 & 1.8 \\ -2.88 & -2 + 0.16 & 2.4 \\ 0 & 0& -2 + 2 \end{pmatrix}$,

or

$\begin{pmatrix} -3.16 & -0.63 & 1.8 \\ -2.88 & -1.84 & 2.4 \\ 0 & 0 & 0 \end{pmatrix}$

however, to simplify this, we will take an alternate route: let's eliminate the $(2,3)$ entry by adding $-\frac{4}{3}$ times Row 1 onto Row 2, which yields:

$\begin{pmatrix} -3.16 & -0.63 & 1.8 \\ 1.3333 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix}$

You may note that $3 \times 21 = 63$ while $4 \times 21 = 84$.

Thus, $\alpha_2$ is a free variable and thus $\alpha_1 = 0.75 \alpha_2$, and substituting this into the first equation, we get $\alpha_3 = 1.6666\alpha_2$. Thus, the eigenvector is $\textbf{u}_1 = \begin{pmatrix} 0.75 \\ 1 \\ 1.6666 \end{pmatrix}$, and $\|\textbf{u}_1\|_2 = 2.083333$ (not obvious, but $9^2 + 12^2 + 20^2 = 25^2$), so a normalized eigenvector is $\hat{\textbf{u}}_1 = \begin{pmatrix} 0.36 \\ 0.48 \\ 0.8 \end{pmatrix}$.

To find the second eigenvector when $\lambda_2 = 2$, we solve

$\begin{pmatrix} 2 - 1.16 & -0.63 & 1.8 \\ -2.88 & 2 + 0.16 & 2.4 \\ 0 & 0 & 2 + 2 \end{pmatrix}$

or

$\begin{pmatrix} 0.84 & -0.63 & 1.8 \\ -2.88 & 2.16 & 2.4 \\ 0 & 0 & 4 \end{pmatrix}$

We know the matrix is singular, so we can again take a few shortcuts. Rows 1 and 3 are linearly independent, so Row 2 must be a linear combination of Rows 1 and 3, and we can simply eliminate it:

$\begin{pmatrix} 0.84 & -0.63 & 1.8 \\ 0 & 0 & 4 \\ 0 & 0 & 0 \end{pmatrix}$

Thus, $\alpha_3 = 0$, and $\alpha_2$ is a free variable, so we are left with $0.84\alpha_1 - 0.63\alpha_2 = 0$ (remember $3 \times 21 = 63$ and $4 \times 21 = 84$), so $\alpha_1 = 0.75\alpha_2$. Thus, the eigenvector is $\textbf{u}_2 = \begin{pmatrix} 0.75 \\ 1 \\ 0 \end{pmatrix}$, and $\|\textbf{u}_2\|_2 = 1.25$, so a normalized eigenvector is $\hat{\textbf{u}}_2 = \begin{pmatrix} 0.6 \\ 0.8 \\ 0 \end{pmatrix}$.

To find the third eigenvector when $\lambda_3 = -1$, we solve

$\begin{pmatrix} -1 - 1.16 & -0.63 & 1.8 \\ -2.88 & -1 + 0.16 & 2.4 \\ 0 & 0 & -1 + 2 \end{pmatrix}$

or

$\begin{pmatrix} -2.16 & -0.63 & 1.8 \\ -2.88 & -0.84 & 2.4 \\ 0 & 0 & 1 \end{pmatrix}$

Again, we can just eliminate Row 2:

$\begin{pmatrix} -2.16 & -0.63 & 1.8 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}$

Thus, $\alpha_3 = 0$ again and $\alpha_2$ is a free variable, so we are left with $-2.16\alpha_1 - 0.63\alpha_2 = 0$, so $\alpha_1 = -0.2916666\alpha_2$. Thus, the eigenvector is $\textbf{u}_3 = \begin{pmatrix} -0.2916666 \\ 1 \\ 0 \end{pmatrix}$, and $\|\textbf{u}_3\|_2 = 1.0416666$ (again, the Pythagorean triple $7, 24, 25$), so a normalized eigenvector is $\hat{\textbf{u}}_3 = \begin{pmatrix} -0.28 \\ 0.96 \\ 0 \end{pmatrix}$.
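At this point it is worth checking that each normalized vector really is an eigenvector of $A$ with the stated eigenvalue; a NumPy sketch:

```python
# A sketch checking A u = lambda u for the three normalized eigenvectors.
import numpy as np

A = np.array([[1.16,  0.63, -1.8],
              [2.88, -0.16, -2.4],
              [0.0,   0.0,  -2.0]])

pairs = [(-2.0, np.array([ 0.36, 0.48, 0.8])),
         ( 2.0, np.array([ 0.6,  0.8,  0.0])),
         (-1.0, np.array([-0.28, 0.96, 0.0]))]
for lam, u in pairs:
    print(lam, np.allclose(A @ u, lam * u))   # expected: True for all three
```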

Thus, the matrix $U$ is $\begin{pmatrix} 0.36 & 0.6 & -0.28 \\ 0.48 & 0.8 & 0.96 \\ 0.8 & 0 & 0 \end{pmatrix}$.

To find the inverse of $U$, we start with

$\begin{pmatrix} 0.36 & 0.6 & -0.28 & 1 & 0 & 0 \\ 0.48 & 0.8 & -0.96 & 0 & 1 & 0 \\ 0.8 & 0 & 0 & 0 & 0 & 1 \end{pmatrix}$.

and end with

$\begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 1.25 \\ 0 & 1 & 0 & 1.2 & 0.35 & -0.75 \\ 0 & 0 & 1 & -1 & 0.75 & 0 \end{pmatrix}$.

Thus, $U^{-1} = \begin{pmatrix} 0 & 0 & 1.25 \\ 1.2 & 0.35 & -0.75 \\ -1 & 0.75 & 0 \end{pmatrix}$.
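As before, the inverse and the full decomposition are easy to verify numerically (a sketch):

```python
# A sketch verifying U U^{-1} = I_3 and U diag(-2, 2, -1) U^{-1} = A.
import numpy as np

A = np.array([[1.16,  0.63, -1.8],
              [2.88, -0.16, -2.4],
              [0.0,   0.0,  -2.0]])
U = np.array([[0.36, 0.6, -0.28],
              [0.48, 0.8,  0.96],
              [0.8,  0.0,  0.0]])
U_inv = np.array([[ 0.0,  0.0,   1.25],
                  [ 1.2,  0.35, -0.75],
                  [-1.0,  0.75,  0.0]])
D = np.diag([-2.0, 2.0, -1.0])

print(np.allclose(U @ U_inv, np.eye(3)))   # expected: True
print(np.allclose(U @ D @ U_inv, A))       # expected: True
```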

Example 3

In this case, the example simply comes out of the examples given. I have chosen a $4 \times 4$ matrix that has many zeros, which should simplify the process of finding the determinant and solving for the eigenspaces.

Consider the matrix:


$A = \begin{pmatrix} 0 & 2 & 0 & 0 \\ 2 & -1.8 & 0 & 2.4 \\ 0 & 0 & 0 & 2 \\ 0 & 2.4 & 2 & 1.8 \end{pmatrix}$.

The determinant of $\lambda I_4 - A$ is the determinant of

$\begin{pmatrix} \lambda & -2 & 0 & 0 \\ -2 & \lambda + 1.8 & 0 & -2.4 \\ 0 & 0 & \lambda & -2 \\ 0 & -2.4 & -2 & \lambda - 1.8 \end{pmatrix}$.

which can be expanded along the first column (the entries in rows 3 and 4 of that column are zero), so the determinant is $\lambda$ times the determinant of $\begin{pmatrix} \lambda + 1.8 & 0 & -2.4 \\ 0 & \lambda & -2 \\ -2.4 & -2 & \lambda - 1.8 \end{pmatrix}$ minus $-2$ times the determinant of $\begin{pmatrix} -2 & 0 & 0 \\ 0 & \lambda & -2 \\ -2.4 & -2 & \lambda - 1.8 \end{pmatrix}$.

Thus, we have $\lambda\bigl((\lambda + 1.8)\lambda(\lambda - 1.8) - 2.4^2\lambda - (-2)(-2)(\lambda + 1.8)\bigr)$ minus $(-2)^2(\lambda(\lambda - 1.8) - 2^2)$, which when expanded yields $(\lambda^4 - 13\lambda^2 - 7.2\lambda) - (4\lambda^2 - 7.2\lambda - 16)$, which equals $\lambda^4 - 17\lambda^2 + 16$. This factors as $(\lambda^2 - 16)(\lambda^2 - 1) = (\lambda + 4)(\lambda - 4)(\lambda + 1)(\lambda - 1)$, and thus the eigenvalues are $-4$, $4$, $-1$ and $1$.
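Since the cofactor expansion is the error-prone step here, a NumPy sketch confirming the quartic and its roots may be reassuring:

```python
# A sketch checking the characteristic polynomial lambda^4 - 17 lambda^2 + 16.
import numpy as np

A = np.array([[0.0,  2.0, 0.0, 0.0],
              [2.0, -1.8, 0.0, 2.4],
              [0.0,  0.0, 0.0, 2.0],
              [0.0,  2.4, 2.0, 1.8]])

print(np.poly(A))                               # expected: approximately [1, 0, -17, 0, 16]
print(np.roots([1.0, 0.0, -17.0, 0.0, 16.0]))   # expected: 4, -4, 1 and -1
print(np.linalg.eigvalsh(A))                    # symmetric A: eigenvalues in ascending order
```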

To find the first eigenvector when $\lambda_1 = -4$, we solve

$\begin{pmatrix} -4 & -2 & 0 & 0 \\ -2 & -4 + 1.8 & 0 & -2.4 \\ 0 & 0 & -4 & -2 \\ 0 & -2.4 & -2 & -4 - 1.8 \end{pmatrix} = \begin{pmatrix} -4 & -2 & 0 & 0 \\ -2 & -2.2 & 0 & -2.4 \\ 0 & 0 & -4 & -2 \\ 0 & -2.4 & -2 & -5.8 \end{pmatrix}$.

$\sim \begin{pmatrix} -4 & -2 & 0 & 0 \\ 0 & -1.2 & 0 & -2.4 \\ 0 & 0 & -4 & -2 \\ 0 & 0 & 0 & 0 \end{pmatrix}$

$\alpha_4$ is free, so $\alpha_3 = -0.5\alpha_4$, $\alpha_2 = -2 \alpha_4$ and $\alpha_1 = \alpha_4$, so the eigenvector is $\textbf{u}_1 = \begin{pmatrix} 1 \\ -2 \\ -0.5 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_1\|_2 = 2.5$ (the sum of the squares is $6.25$, which is $\frac{625}{100}$ and both numerator and denominator are perfect squares, resulting in $\frac{25}{10}$), so a normalized eigenvector is $\hat{\textbf{u}}_1 = \begin{pmatrix} 0.4 \\ -0.8 \\ -0.2 \\ 0.4 \end{pmatrix}$.

To find the second eigenvector when $\lambda_2 = 4$, we solve

$\begin{pmatrix} 4 & -2 & 0 & 0 \\ -2 & 4 + 1.8 & 0 & -2.4 \\ 0 & 0 & 4 & -2 \\ 0 & -2.4 & -2 & 4 - 1.8 \end{pmatrix} = \begin{pmatrix} 4 & -2 & 0 & 0 \\ -2 & 5.8 & 0 & -2.4 \\ 0 & 0 & 4 & -2 \\ 0 & -2.4 & -2 & 2.2 \end{pmatrix}$.

$\sim \begin{pmatrix} 4 & -2 & 0 & 0 \\ 0 & 4.8 & 0 & -2.4 \\ 0 & 0 & 4 & -2 \\ 0 & 0 & 0 & 0 \end{pmatrix}$.

$\alpha_4$ is free, so $\alpha_3 = 0.5\alpha_4$, $\alpha_2 = 0.5 \alpha_4$ and $\alpha_1 = 0.25\alpha_4$, so the eigenvector is $\textbf{u}_2 = \begin{pmatrix} 0.25 \\ 0.5 \\ 0.5 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_2\|_2 = 1.25$, so a normalized eigenvector is $\hat{\textbf{u}}_2 = \begin{pmatrix} 0.2 \\ 0.4 \\ 0.4 \\ 0.8 \end{pmatrix}$.

To find the third eigenvector when $\lambda_3 = -1$, we solve

$\begin{pmatrix} -1 & -2 & 0 & 0 \\ -2 & -1 + 1.8 & 0 & -2.4 \\ 0 & 0 & -1 & -2 \\ 0 & -2.4 & -2 & -1 - 1.8 \end{pmatrix} = \begin{pmatrix} -1 & -2 & 0 & 0 \\ -2 & 0.8 & 0 & -2.4 \\ 0 & 0 & -1 & -2 \\ 0 & -2.4 & -2 & -2.8 \end{pmatrix}$.

$\sim \begin{pmatrix} -1 & -2 & 0 & 0 \\ 0 & 4.8 & 0 & -2.4 \\ 0 & 0 & -1 & -2 \\ 0 & 0 & 0 & 0 \end{pmatrix}$.

$\alpha_4$ is free, so $\alpha_3 = -2\alpha_4$, $\alpha_2 = 0.5 \alpha_4$ and $\alpha_1 = -\alpha_4$, so the eigenvector is $\textbf{u}_3 = \begin{pmatrix} -1 \\ 0.5 \\ -2 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_3\|_2 = 2.5$, so a normalized eigenvector is $\hat{\textbf{u}}_3 = \begin{pmatrix} -0.4 \\ 0.2 \\ -0.8 \\ 0.4 \end{pmatrix}$.

Finally, to find the fourth eigenvector when $\lambda_4 = 1$, we solve

$\begin{pmatrix} 1 & -2 & 0 & 0 \\ -2 & 1 + 1.8 & 0 & -2.4 \\ 0 & 0 & 1 & -2 \\ 0 & -2.4 & -2 & 1 - 1.8 \end{pmatrix} = \begin{pmatrix} 1 & -2 & 0 & 0 \\ -2 & 2.8 & 0 & -2.4 \\ 0 & 0 & 1 & -2 \\ 0 & -2.4 & -2 & -0.8 \end{pmatrix}$.

$\sim \begin{pmatrix} 1 & -2 & 0 & 0 \\ 0 & -1.2 & 0 & -2.4 \\ 0 & 0 & 1 & -2 \\ 0 & 0 & 0 & 0 \end{pmatrix}$.

$\alpha_4$ is free, so $\alpha_3 = 2\alpha_4$, $\alpha_2 = -2\alpha_4$ and $\alpha_1 = -4\alpha_4$, so the eigenvector is $\textbf{u}_4 = \begin{pmatrix} -4 \\ -2 \\ 2 \\ 1 \end{pmatrix}$, and $\|\textbf{u}_4\|_2 = 5$, so a normalized eigenvector is $\hat{\textbf{u}}_4 = \begin{pmatrix} -0.8 \\ -0.4 \\ 0.4 \\ 0.2 \end{pmatrix}$.

Thus, the matrix of eigenvectors is

$U = \begin{pmatrix} 0.4 & 0.2 & -0.4 & -0.8 \\ -0.8 & 0.4 & 0.2 & -0.4 \\ -0.2 & 0.4 & -0.8 & 0.4 \\ 0.4 & 0.8 & 0.4 & 0.2 \end{pmatrix}$.

and because the matrix $A$ is symmetric and its eigenvalues are distinct, the normalized eigenvectors are orthonormal, so the matrix $U$ is orthogonal (you can check this by calculating $UU^\textrm{T}$, which should equal $I_4$, and it does), so its inverse is its transpose and we can check that

$U \begin{pmatrix} -4 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}U^\textrm{T} = A$,

which it does.
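In NumPy, these two checks are one line each (a sketch):

```python
# A sketch checking that U is orthogonal and that U diag(-4, 4, -1, 1) U^T = A.
import numpy as np

A = np.array([[0.0,  2.0, 0.0, 0.0],
              [2.0, -1.8, 0.0, 2.4],
              [0.0,  0.0, 0.0, 2.0],
              [0.0,  2.4, 2.0, 1.8]])
U = np.array([[ 0.4, 0.2, -0.4, -0.8],
              [-0.8, 0.4,  0.2, -0.4],
              [-0.2, 0.4, -0.8,  0.4],
              [ 0.4, 0.8,  0.4,  0.2]])
D = np.diag([-4.0, 4.0, -1.0, 1.0])

print(np.allclose(U @ U.T, np.eye(4)))   # expected: True
print(np.allclose(U @ D @ U.T, A))       # expected: True
```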

Note that if you look up this example, you will see that the columns of $U$ are swapped; this is because our eigenvalues ended up in the order $-4$, $4$, $-1$ and $1$. A different ordering would produce the $U$ in the example.
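Related to this, a numerical eigen-solver will also pick its own ordering and signs: for a symmetric matrix, numpy.linalg.eigh returns the eigenvalues in ascending order, so its columns are a reordering of the ones above, possibly with some signs flipped (a sketch):

```python
# A sketch: eigh orders the eigenvalues as -4, -1, 1, 4, so its eigenvector
# columns are a permutation (up to sign) of the hand-computed ones.
import numpy as np

A = np.array([[0.0,  2.0, 0.0, 0.0],
              [2.0, -1.8, 0.0, 2.4],
              [0.0,  0.0, 0.0, 2.0],
              [0.0,  2.4, 2.0, 1.8]])

evals, evecs = np.linalg.eigh(A)
print(evals)                 # expected: [-4., -1., 1., 4.]
print(np.round(evecs, 2))    # columns: +/- the normalized eigenvectors above
```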