Eigen Value Decomposition

Eigen Values, Eigen Vectors, & Eigen Value Decomposition

In this section we will understand Eigen Values, Eigen Vectors, & Eigen Value Decomposition.


💡 What is the meaning of the word “Eigen” ?

Eigen is a German word that means “Characteristic” or “Proper”.
It tells us about the characteristic properties of a matrix.

📘 Linear Transformation
A linear transformation defined by a matrix, denoted as \(T(x)=A\mathbf{x}\), is a function that maps a vector \(\mathbf{x}\) to a new vector by multiplying it by a matrix \(A\).
Multiplying a vector by a matrix can change the vector's direction, its magnitude, or both.
For example:
\( \mathbf{A} = \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} \), \(\mathbf{u} = \begin{bmatrix} 0 \\ \\ 1 \\ \end{bmatrix}\), \(\mathbf{v} = \begin{bmatrix} 1 \\ \\ 1 \\ \end{bmatrix}\)

\(\mathbf{Au} = \begin{bmatrix} 1 \\ \\ 2 \\ \end{bmatrix}\) , \(\quad\) \(\mathbf{Av} = \begin{bmatrix} 3 \\ \\ 3 \\ \end{bmatrix}\)
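As a quick check of these products, here is a sketch using NumPy (NumPy is an assumption; it is not part of the original text):

```python
import numpy as np

# Matrix and vectors from the example above.
A = np.array([[2, 1],
              [1, 2]])
u = np.array([0, 1])
v = np.array([1, 1])

Au = A @ u  # [1, 2]: both direction and magnitude change
Av = A @ v  # [3, 3]: only the magnitude changes, since [3, 3] = 3 * v
```

Notice that \(\mathbf{Av}\) is exactly \(3\mathbf{v}\); this observation motivates the definitions below.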


📘 Eigen Vector
An eigen vector of a matrix is a special non-zero vector whose direction remains unchanged when the transformation defined by the matrix is applied.
It might get scaled up or down, but it stays on the same line through the origin.
The result of the linear transformation, i.e., multiplying the vector by the matrix, is just a scalar multiple of the original vector: \(\mathbf{Av} = \lambda \mathbf{v}\).

📘 Eigen Value (\(\lambda\))
It is the scaling factor of the eigen vector, i.e., the scalar \(\lambda\) such that \(\mathbf{Av} = \lambda \mathbf{v}\) when the vector is multiplied by the matrix.
\(|\lambda| > 1 \): Vector stretched
\(0 < |\lambda| < 1 \): Vector shrunk
\(|\lambda| = 1 \): Same size
\(\lambda < 0 \): Vector’s direction is reversed
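These definitions can be checked numerically. The sketch below uses NumPy's `np.linalg.eig` (an assumption, not part of the original) on the matrix \(\mathbf{A}\) from the earlier example:

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])

# np.linalg.eig returns the eigen values and the eigen vectors
# (one eigen vector per column, normalized to unit length).
eigvals, eigvecs = np.linalg.eig(A)

# Each pair satisfies the defining equation A v = lambda v.
for lam, vec in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ vec, lam * vec)
```

For this matrix the eigen values are 3 and 1 (the order in which `eig` returns them is not guaranteed).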

💡 What are the eigen values and vectors of an identity matrix?

For the identity matrix, every vector satisfies:
\(\mathbf{Iv} = \mathbf{v} = 1 \cdot \mathbf{v}\)
Therefore, the identity matrix has only one eigen value, \(\lambda = 1\), and every non-zero vector is an eigen vector.

💡 Are the eigen values of a real matrix always real?

No, the eigen values of a real matrix can be complex; when they are, they always occur in conjugate pairs. For example:
\(\mathbf{A} = \begin{bmatrix} 0 & 1 \\ \\ -1 & 0 \end{bmatrix} \), \( \quad det(\mathbf{A} - \lambda \mathbf{I}) = 0 \quad \) => det \(\begin{bmatrix} 0-\lambda & 1 \\ \\ -1 & 0-\lambda \end{bmatrix} = 0 \)

=> \(\lambda^2 + 1 = 0\) => \(\lambda = \pm i\)
So, eigen values are complex.
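The same computation can be sketched in NumPy (an assumption; `eig` returns a complex array when the eigen values are complex):

```python
import numpy as np

# 90-degree rotation matrix: no real vector keeps its direction,
# so there are no real eigen values.
A = np.array([[0, 1],
              [-1, 0]])

eigvals, _ = np.linalg.eig(A)
# eigvals holds the conjugate pair +i and -i (in some order).
```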

💡 What are the eigen values of a diagonal matrix?

The eigen values of a diagonal matrix are the diagonal elements themselves.
e.g.:
\( \mathbf{D} = \begin{bmatrix} d_{11} & 0 & \cdots & 0 \\ 0 & d_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_{nn} \end{bmatrix} _{\text{n x n}} \), \( \quad det(\mathbf{D} - \lambda \mathbf{I}) = 0 \quad \) => det \( \begin{bmatrix} d_{11}-\lambda & 0 & \cdots & 0 \\ 0 & d_{22}-\lambda & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_{nn}-\lambda \end{bmatrix} _{\text{n x n}} \)

=> \((d_{11}-\lambda)(d_{22}-\lambda) \cdots (d_{nn}-\lambda) = 0\)
=> \(\lambda = d_{11}, d_{22}, \cdots, d_{nn}\)
So, eigen values are the diagonal elements of the matrix.
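A quick numerical confirmation (a NumPy sketch, not part of the original):

```python
import numpy as np

# A diagonal matrix with arbitrary diagonal entries.
D = np.diag([5.0, -2.0, 7.0])

eigvals, _ = np.linalg.eig(D)
# The eigen values are exactly the diagonal entries.
assert np.allclose(np.sort(eigvals), np.sort(np.diag(D)))
```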

💡 How will we calculate the 2nd power of a matrix, i.e., \(\mathbf{A}^2\)?

Let's calculate the 2nd power of a square matrix.

e.g.:
\(\mathbf{A} = \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} \), \(\quad \mathbf{A}^2 = \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} = \begin{bmatrix} 5 & 4 \\ \\ 4 & 5 \end{bmatrix} \)
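The same product in NumPy (a sketch; `np.linalg.matrix_power` generalizes this to any non-negative integer power):

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])

A2 = A @ A  # explicit matrix multiplication
# A2 is [[5, 4], [4, 5]], matching the hand computation above.
assert np.array_equal(A2, np.linalg.matrix_power(A, 2))
```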


💡 Now, how will we calculate higher powers of a matrix, i.e., \(\mathbf{A}^k\)?

If we follow the above method, then we will have to multiply the matrix \(\mathbf{A}\) by itself \(k-1\) times, which is very time consuming and cumbersome.
So, we need to find an easier way to calculate the power of a matrix.

💡 How will we calculate the power of a diagonal matrix?

Let’s calculate the 2nd power of a diagonal matrix.

e.g.:
\(\mathbf{A} = \begin{bmatrix} 3 & 0 \\ \\ 0 & 2 \end{bmatrix} \), \(\quad \mathbf{A}^2 = \begin{bmatrix} 3 & 0 \\ \\ 0 & 2 \end{bmatrix} \begin{bmatrix} 3 & 0 \\ \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 9 & 0 \\ \\ 0 & 4 \end{bmatrix} \)

Note that when we square the diagonal matrix, each diagonal element gets squared.
Similarly, to calculate the kth power of a diagonal matrix, all we need to do is raise each diagonal element to the kth power, instead of performing repeated matrix multiplications.
\(\quad \mathbf{A}^k = \begin{bmatrix} 3^k & 0 \\ \\ 0 & 2^k \end{bmatrix} \)
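This shortcut is easy to verify with NumPy (an assumption; the element-wise power replaces the repeated matrix multiplications):

```python
import numpy as np

A = np.diag([3, 2])
k = 10

# kth power of a diagonal matrix: raise each diagonal entry to the kth power.
Ak = np.diag(np.diag(A) ** k)

# Same answer as repeated matrix multiplication.
assert np.array_equal(Ak, np.linalg.matrix_power(A, k))
```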

Therefore, if we diagonalize a square matrix, then computing its powers becomes very easy.
Next, let's see how to diagonalize a matrix.

📘 Eigen Value Decomposition
A square matrix \(\mathbf{A}\) with \(n\) linearly independent eigen vectors can be factored as \(\mathbf{A} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}^{-1}\), where the columns of \(\mathbf{V}\) are the eigen vectors of \(\mathbf{A}\) and \(\mathbf{\Lambda}\) is the diagonal matrix of the corresponding eigen values.

For example, let's revisit the matrix from above, whose eigen values and eigen vectors are:
\(\mathbf{A} = \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} \), \(\quad \lambda_1 = 3\) and \( \lambda_2 = 1\), \(\mathbf{v_1} = \begin{bmatrix} 1 \\ \\ 1 \\ \end{bmatrix}\), \(\mathbf{v_2} = \begin{bmatrix} 1 \\ \\ -1 \\ \end{bmatrix}\)

=> \( \mathbf{V} = \begin{bmatrix} 1 & 1 \\ \\ 1 & -1 \\ \end{bmatrix} \), \(\quad \mathbf{\Lambda} = \begin{bmatrix} 3 & 0 \\ \\ 0 & 1 \end{bmatrix} \)

\( \because \mathbf{A} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}^{-1}\)
We know \(\mathbf{V}\) and \(\mathbf{\Lambda}\); we only need to calculate \(\mathbf{V}^{-1}\).

\(\mathbf{V}^{-1} = \frac{1}{2}\begin{bmatrix} 1 & 1 \\ \\ 1 & -1 \\ \end{bmatrix} \)

\( \therefore \mathbf{V} \mathbf{\Lambda} \mathbf{V}^{-1} = \begin{bmatrix} 1 & 1 \\ \\ 1 & -1 \\ \end{bmatrix} \begin{bmatrix} 3 & 0 \\ \\ 0 & 1 \end{bmatrix} \frac{1}{2}\begin{bmatrix} 1 & 1 \\ \\ 1 & -1 \\ \end{bmatrix} \)

\( = \frac{1}{2} \begin{bmatrix} 3 & 1 \\ \\ 3 & -1 \\ \end{bmatrix} \begin{bmatrix} 1 & 1 \\ \\ 1 & -1 \\ \end{bmatrix} = \frac{1}{2}\begin{bmatrix} 4 & 2 \\ \\ 2 & 4 \\ \end{bmatrix} \)

\( = \begin{bmatrix} 2 & 1 \\ \\ 1 & 2 \end{bmatrix} = \mathbf{A} \)
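This verifies the decomposition, and it delivers the promised payoff for powers: since the inner \(\mathbf{V}^{-1}\mathbf{V}\) factors cancel, \(\mathbf{A}^k = \mathbf{V} \mathbf{\Lambda}^k \mathbf{V}^{-1}\), where \(\mathbf{\Lambda}^k\) is just the element-wise kth power of the diagonal. A NumPy sketch of both facts (NumPy is assumed):

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])
V = np.array([[1, 1],
              [1, -1]])          # eigen vectors as columns
Lam = np.diag([3, 1])            # eigen values on the diagonal
V_inv = np.linalg.inv(V)

# The decomposition reconstructs A.
assert np.allclose(V @ Lam @ V_inv, A)

# A^k = V Lam^k V^{-1}: only the diagonal entries are raised to the kth power.
k = 5
Ak = V @ np.diag(np.diag(Lam) ** k) @ V_inv
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```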



End of Section