Given a matrix $A \in \mathbb{R}^{n \times n}$, a vector $v \in \mathbb{R}^n$ and a scalar $\lambda \in \mathbb{R}$, $\lambda$ is called an eigenvalue and $v$ an eigenvector of $A$ if the following equation is satisfied:

$$A v = \lambda v$$
That means that applying the linear transformation $A$ to $v$ (multiplying the matrix $A$ with the vector $v$) gives the same result as multiplying that vector by the scalar $\lambda$. In many cases, applying a linear transformation to a vector yields a vector that points in a different direction. Only in some cases does the new vector have the same direction (even if the orientation is inverted) as the original vector. In that case the new vector can be obtained from the old one by multiplying it with a scalar.
The above equation has a trivial solution for $v = 0$. Typically, we want to find the eigenvectors and eigenvalues of a matrix $A$ with $v \neq 0$.
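As a quick sanity check of the defining equation, here is a minimal NumPy sketch; the matrix, vector and scalar are made-up illustrative values, chosen so that the equation holds:

```python
import numpy as np

# Hypothetical example: lam = 3 is an eigenvalue of this matrix
# with eigenvector v = (1, 1); values chosen purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

# The defining equation: A @ v must equal lam * v.
lhs = A @ v
rhs = lam * v
print(lhs, rhs)  # both are [3. 3.]
```

Note that any nonzero scalar multiple of $v$ satisfies the same equation, which is why eigenvectors are only determined up to scale.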
Finding the Eigenvalues of a Matrix
Given a square matrix $A$, we want to find its eigenvalues $\lambda$ for a nonzero vector $v$. We transform the above equation as follows:

$$A v = \lambda v \iff A v - \lambda v = 0 \iff (A - \lambda I) v = 0$$
If the matrix $(A - \lambda I)$ is invertible, then the only solution is $v = (A - \lambda I)^{-1} 0 = 0$. However, we want to find a solution for $v \neq 0$. Hence, we can only find a solution if $(A - \lambda I)$ is not invertible. This is the case when the determinant of this matrix is zero:

$$\det(A - \lambda I) = 0$$
The determinant gives us a polynomial in $\lambda$, the characteristic polynomial. The roots of this polynomial are the eigenvalues of $A$.
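NumPy can carry out exactly this step numerically: `numpy.poly` returns the coefficients of $\det(\lambda I - A)$, and `numpy.roots` finds the roots of that polynomial. A minimal sketch with an illustrative $2 \times 2$ matrix:

```python
import numpy as np

# Illustrative matrix, chosen for this sketch.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest degree first: here lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)
print(coeffs)  # [ 1. -4.  3.]

# The eigenvalues are the roots of this polynomial.
eigenvalues = np.sort(np.roots(coeffs))
print(eigenvalues)  # [1. 3.]
```

In practice this route is numerically fragile for larger matrices; dedicated eigenvalue solvers are preferred, but the polynomial view matches the derivation above.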
Let us find the eigenvalues of the following matrix:

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

We find the polynomial given by the determinant $\det(A - \lambda I)$:

$$\det \begin{pmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{pmatrix} = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3$$

The eigenvalues are the roots of this polynomial and can be found as follows:

$$\lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3) = 0$$

From this we can see that the eigenvalues of $A$ are $\lambda_1 = 1$ and $\lambda_2 = 3$.
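Rather than factoring the characteristic polynomial by hand, `numpy.linalg.eigvals` computes the eigenvalues directly. A sketch with the illustrative matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eigvals computes the eigenvalues directly,
# without forming the characteristic polynomial explicitly.
eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # [1. 3.]
```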
Finding the Eigenvectors of a Matrix
To find the eigenvectors of a matrix $A$, we need to know its eigenvalues $\lambda$; we can then find the eigenvectors $v$ by solving $(A - \lambda I) v = 0$.
We continue with our example from above. Remember our matrix and eigenvalues:

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \quad \lambda_1 = 1, \quad \lambda_2 = 3$$

We first find the eigenvectors for the eigenvalue $\lambda_1 = 1$:

$$(A - 1 \cdot I) v = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$$

This linear system of equations is not independent. It is satisfied for any vector $v$ such that $v_1 = -v_2$. The eigenvectors of $A$ corresponding to the eigenvalue $\lambda_1 = 1$ are: $v = t \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ for any $t \neq 0$.

We repeat the same process for $\lambda_2 = 3$ and find the eigenvectors $v = t \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ for any $t \neq 0$.
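The whole procedure, eigenvalues and eigenvectors together, is what `numpy.linalg.eig` computes; each returned eigenvector is one representative (normalized to unit length) of the scalar family described above. A sketch, again with the illustrative matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and unit-length
# eigenvectors; eigenvector j is the j-th *column* of the
# second return value.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each pair satisfies the defining equation A v = lambda v.
    assert np.allclose(A @ v, lam * v)
    print(lam, v)
```

Note that NumPy may return the eigenvalues in any order, and each eigenvector only up to sign: both are consistent with the fact that eigenvectors are determined only up to a nonzero scalar $t$.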