# Mathematics Prelims

## June 25, 2009

### Eigenvalues and Eigenvectors

Filed under: Algebra,Linear Algebra — cjohnson @ 2:39 pm

Let’s suppose that $A$ is an $n \times n$ matrix which is similar to a diagonal matrix, $\text{diag}(\lambda_1, \lambda_2, ..., \lambda_n)$.  This means there is an invertible (change-of-basis) matrix $P$ such that

$\displaystyle A = P \text{diag}(\lambda_1, ..., \lambda_n) P^{-1}$
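To make this factorization concrete, here is a small numerical sketch in plain Python. The matrix, eigenvalues, and eigenvectors below are invented for illustration (they do not come from the post): we build $P$ with columns $(1,1)$ and $(1,-2)$, set $D = \text{diag}(5, 2)$, and multiply out $P D P^{-1}$.

```python
# Hypothetical example: with eigenvalues 5 and 2 and eigenvectors
# (1, 1) and (1, -2), the product P diag(5, 2) P^{-1} produces a
# matrix A that is similar to diag(5, 2) by construction.
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[F(1), F(1)], [F(1), F(-2)]]   # columns are the basis vectors
D = [[F(5), F(0)], [F(0), F(2)]]    # diag(lambda_1, lambda_2)

# Inverse of a 2x2 matrix via the adjugate formula.
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
P_inv = [[ P[1][1] / det, -P[0][1] / det],
         [-P[1][0] / det,  P[0][0] / det]]

A = matmul(matmul(P, D), P_inv)
print([[int(x) for x in row] for row in A])  # [[4, 1], [2, 3]]
```

Exact rational arithmetic (`fractions`) is used here only so the output is clean integers rather than floats.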

Now since $P$ is a change of basis matrix, each of its columns gives the coordinates to a basis vector of some basis.  Let’s call that basis $\beta$ and let $\beta_1$ through $\beta_n$ be the elements of that basis.  Now, if we take the above equation and multiply by $P$ on the right, notice that

$AP = P \text{diag}(\lambda_1, ..., \lambda_n)$

$\implies (AP)_{*i} = (P \text{diag}(\lambda_1, ..., \lambda_n))_{*i} = \lambda_i P_{*i}$

That is, the $i$-th column of $AP$ is equal to the $i$-th column of $P \text{diag}(\lambda_1, ..., \lambda_n)$, which is just $\lambda_i$ times the $i$-th column of $P$.  But the $i$-th column of $AP$ is simply $A$ applied to the $i$-th column of $P$, so we have

$AP_{*i} = \lambda_i P_{*i}$

This means that when we plug in the $i$-th column of $P$ to the linear transformation represented by $A$, we get back a multiple of that column.  Calling the linear transformation $\tau$, we have that

$\tau(\beta_i) = \lambda_i \beta_i$.

Vectors such as $\beta_i$ whose image under $\tau$ is just a multiple of the vector are called eigenvectors of $\tau$.  That multiple, the $\lambda_i$ above, is called an eigenvalue of $\tau$.  These eigenvectors and eigenvalues are associated with a particular linear transformation, so when we talk about the eigenvectors and eigenvalues of a matrix, we really mean the eigenvectors and eigenvalues of the transformation represented by that matrix.  Notice that this means that eigenvalues are independent of the chosen basis; since similar matrices represent the same transformation just with respect to different bases, similar matrices have the same eigenvalues.
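As a quick sanity check of $\tau(\beta_i) = \lambda_i \beta_i$, here is a hypothetical example (chosen for illustration, not from the post): the matrix $A$ below has eigenvalue $5$ with eigenvector $(1, 1)$ and eigenvalue $2$ with eigenvector $(1, -2)$, and applying $A$ returns a scalar multiple of each eigenvector.

```python
# Hypothetical example: A = [[4, 1], [2, 3]] has eigenpairs
# lambda_1 = 5 with beta_1 = (1, 1) and lambda_2 = 2 with beta_2 = (1, -2).
def apply(A, v):
    """Apply the linear transformation represented by the 2x2 matrix A to v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[4, 1], [2, 3]]
print(apply(A, [1, 1]))   # [5, 5]   = 5 * (1, 1)
print(apply(A, [1, -2]))  # [2, -4]  = 2 * (1, -2)
```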

We assumed that $A$ was similar to a diagonal matrix above, but this isn’t always true.  If $A$ is similar to a diagonal matrix, say $A = PDP^{-1}$, then as we’ve just shown, the columns of $P$ are eigenvectors of $A$.  Since they are the $n$ columns of a non-singular matrix, these eigenvectors are linearly independent and so form a basis for the vector space.  Conversely, if the eigenvectors of $A$ form a basis, let’s take those basis vectors as the columns of $P$:

$\displaystyle P^{-1} A P$

$\displaystyle = \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right]^{-1} A \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right]$

$\displaystyle = \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right]^{-1} \left[ \begin{array}{c|c|c} A \beta_1 & \cdots & A \beta_n \end{array} \right]$

$\displaystyle = \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right]^{-1} \left[ \begin{array}{c|c|c} \lambda_1 \beta_1 & \cdots & \lambda_n \beta_n \end{array} \right]$

$\displaystyle = \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right]^{-1} \left( \left[ \begin{array}{c|c|c} \beta_1 & \cdots & \beta_n \end{array} \right] \left[ \begin{array}{cccc} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & \cdots & 0 & \lambda_n \end{array} \right] \right)$

$\displaystyle = \text{diag}(\lambda_1, ..., \lambda_n)$

So a matrix is diagonalizable (similar to a diagonal matrix) if and only if its eigenvectors form a basis for the vector space.
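A standard example of a matrix that fails this condition is the shear $\left[ \begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array} \right]$: its only eigenvalue is $1$, and its eigenvectors all lie on a single line, so they cannot form a basis of the plane. The sketch below (plain Python, test vectors chosen for illustration) checks this directly.

```python
# The shear S = [[1, 1], [0, 1]] is not diagonalizable: S v = lambda v
# forces v to be a multiple of (1, 0), so the eigenvectors of S span
# only a line and cannot form a basis of the plane.
S = [[1, 1], [0, 1]]

def is_eigenvector(A, v):
    """True if A v is parallel to the nonzero vector v (2x2 case)."""
    w = [A[0][0] * v[0] + A[0][1] * v[1],
         A[1][0] * v[0] + A[1][1] * v[1]]
    return w[0] * v[1] - w[1] * v[0] == 0  # zero cross product <=> parallel

print(is_eigenvector(S, [1, 0]))  # True
print(is_eigenvector(S, [0, 1]))  # False
print(is_eigenvector(S, [1, 1]))  # False
```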

1. So can I say from your post that an $n\times n$ matrix $A$ is diagonalizable if and only if there are $n$ linearly independent eigenvectors $\{\beta_1,\cdots,\beta_n\}$ of $A$?

Comment by watchmath — June 25, 2009 @ 9:38 pm

2. Right. If you have $n$ linearly independent eigenvectors, just take those to be the columns of your $P$ matrix. When you multiply out $P^{-1} A P$, you’ll get a diagonal matrix with the eigenvalues of $A$ on the diagonal.
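A short sketch of this reply, reusing the hypothetical matrix $A = \left[ \begin{array}{cc} 4 & 1 \\ 2 & 3 \end{array} \right]$ from earlier examples (invented for illustration): packing its eigenvectors $(1, 1)$ and $(1, -2)$ into $P$ and conjugating gives a diagonal matrix with the eigenvalues on the diagonal.

```python
# Hypothetical example: conjugating A = [[4, 1], [2, 3]] by the matrix P
# whose columns are its eigenvectors yields diag(5, 2).
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[F(4), F(1)], [F(2), F(3)]]
P = [[F(1), F(1)], [F(1), F(-2)]]   # columns are eigenvectors of A

# Inverse of a 2x2 matrix via the adjugate formula.
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
P_inv = [[ P[1][1] / det, -P[0][1] / det],
         [-P[1][0] / det,  P[0][0] / det]]

D = matmul(matmul(P_inv, A), P)
print([[int(x) for x in row] for row in D])  # [[5, 0], [0, 2]]
```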

Comment by cjohnson — June 25, 2009 @ 10:16 pm

3. […] Filed under: Algebra, Linear Algebra — cjohnson @ 7:18 pm Last time we defined the eigenvalues and eigenvectors of a matrix, but didn’t really discuss how to actually calculate the eigenvalues or […]

Pingback by The Characteristic Polynomial « Mathematics Prelims — June 26, 2009 @ 7:18 pm

4. Now I disagree a little with this formulation:
“So a matrix is diagonalizable (similar to a diagonal matrix) if and only if its eigenvectors form a basis for the vector space”
Since there are infinitely many eigenvectors, you need to say which eigenvectors form a basis, or you can avoid that by saying that there exist eigenvectors that form a basis.

Thanks

Comment by watchmath — June 28, 2009 @ 7:02 am
