next up previous contents
Next: Similar matrices. Up: Eigenvalues and eigenvectors. Previous: An application of Cayley-Hamilton's

Eigenvalues and Eigenvectors.

Let $\phi$ be an endomorphism of the vector space V.

Definition 11.4.1   A scalar $\lambda$ for which there exists a non-zero vector $\overrightarrow{u} $ such that $\phi (\overrightarrow{u} )= \lambda \overrightarrow{u} $ is called an eigenvalue of $\phi$, and the vector $\overrightarrow{u} $ is called an eigenvector of $\phi$.

Example 11.4.2   If $\phi$ is defined by $\forall \overrightarrow{u}\in V, \; \phi (\overrightarrow{u} )=2 \overrightarrow{u} $, then every vector is an eigenvector corresponding to the eigenvalue 2.

Example 11.4.3   $\phi$ is the endomorphism of $V=\mathbb{R} ^2$ whose matrix w.r.t. the standard basis is $A=\begin{pmatrix}3 & 4 \\ 4 & -3 \end{pmatrix}$ .


\begin{align*}\phi ( \overrightarrow{u} ) = \lambda \overrightarrow{u} & \Longleftrightarrow \begin{pmatrix}3 & 4 \\ 4 & -3 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix} \\
& \Longleftrightarrow \begin{cases}(3-\lambda) x+4y= 0 \\ 4x-(3+\lambda )y =0 \end{cases}
\end{align*}
This system of equations has a non-trivial solution if, and only if, the determinant of its coefficient matrix is equal to 0.

$\begin{vmatrix}3-\lambda & 4 \\ 4 & -3-\lambda \end{vmatrix}=0 \Longleftrightarrow (3-\lambda)(-3-\lambda)-16=0 \Longleftrightarrow \lambda^2 -25 =0
\Longleftrightarrow \lambda =5 \quad \text{or} \quad \lambda =-5$.

The endomorphism $\phi$ has two eigenvalues: 5 and -5.

Let's look for eigenvectors. We replace $\lambda$ successively by 5 and by -5 in the last system and solve for $(x,y)$. For $\lambda =5$ both equations reduce to $-2x+4y=0$, i.e. $x=2y$, so the eigenvectors are the non-zero multiples of $(2,1)$. For $\lambda =-5$ both equations reduce to $8x+4y=0$, i.e. $y=-2x$, so the eigenvectors are the non-zero multiples of $(1,-2)$.
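This example is easy to check numerically. The sketch below (using NumPy, which is not part of the original notes) computes the eigenvalues of $A$ and verifies $A\overrightarrow{u} = \lambda \overrightarrow{u}$ for the vectors $(2,1)$ and $(1,-2)$ obtained by solving the system for $\lambda = \pm 5$:

```python
import numpy as np

# Matrix of the endomorphism phi w.r.t. the standard basis of R^2.
A = np.array([[3.0, 4.0],
              [4.0, -3.0]])

# A is symmetric, so eigvalsh returns its real eigenvalues in ascending order.
print(np.linalg.eigvalsh(A))  # [-5.  5.]

# Check A u = lambda u for the candidate eigenvectors.
u1 = np.array([2.0, 1.0])   # expected eigenvalue 5
u2 = np.array([1.0, -2.0])  # expected eigenvalue -5
print(np.allclose(A @ u1, 5 * u1), np.allclose(A @ u2, -5 * u2))  # True True
```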

Example 11.4.4   Let $V= \mathcal{C}^{\infty}(\mathbb{R} ,\mathbb{R} )$ be the space of all infinitely differentiable functions from $\mathbb{R} $ to $\mathbb{R} $. Consider the endomorphism $D=\frac {d^2}{dx^2}$ of the vector space V. We have $\frac {d^2}{dx^2} (\cos x) = - \cos x$ and $\frac {d^2}{dx^2} (\sin x) = - \sin x$, i.e. the sine function and the cosine function are eigenvectors of the endomorphism $\frac {d^2}{dx^2}$ with eigenvalue -1.
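The eigenvalue relation $D f = -f$ can also be tested numerically. The following sketch (not from the notes; the helper `second_derivative` and the step size `h` are illustrative choices) approximates $\frac{d^2}{dx^2}$ by a central difference and checks $f'' = -f$ for sine and cosine at a few sample points:

```python
import math

def second_derivative(f, x, h=1e-4):
    # Central-difference approximation of f''(x).
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

# D(sin) = -sin and D(cos) = -cos, up to discretization error.
for x in [0.0, 0.7, 1.5, 3.0]:
    assert abs(second_derivative(math.sin, x) + math.sin(x)) < 1e-5
    assert abs(second_derivative(math.cos, x) + math.cos(x)) < 1e-5
print("D has eigenvalue -1 on sin and cos (numerically)")
```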

Properties:

1.
Let $\phi$ be an endomorphism of the vector space V. If $\lambda$ is an eigenvalue of $\phi$, then the set $V(\lambda )$ of all the vectors $\overrightarrow{u} $ such that $\phi (\overrightarrow{u} )= \lambda \overrightarrow{u} $ (i.e. the eigenvectors with eigenvalue $\lambda$, together with the zero vector) is a vector subspace of V.
2.
For $\lambda_1 \neq \lambda_2$, $V(\lambda_1 ) \cap V( \lambda_2 ) = \{ \overrightarrow{0} \}$.
3.
If $\overrightarrow{u_1} , \dots , \overrightarrow{u_r} $ are eigenvectors of $\phi$ corresponding to the respective (distinct) eigenvalues $\lambda_1, \dots , \lambda_r$, then the family $\overrightarrow{u_1} , \dots , \overrightarrow{u_r} $ is linearly independent.
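Property 3 can be illustrated numerically: if we stack eigenvectors belonging to distinct eigenvalues as the columns of a matrix, linear independence means that matrix has full rank. A minimal sketch, reusing the matrix of Example 11.4.3 (the check itself is an illustration, not part of the notes):

```python
import numpy as np

# Eigenvectors of A = [[3, 4], [4, -3]] for the distinct eigenvalues 5 and -5.
u1 = np.array([2.0, 1.0])   # eigenvalue 5
u2 = np.array([1.0, -2.0])  # eigenvalue -5

# Eigenvectors for distinct eigenvalues are linearly independent,
# so the matrix with u1, u2 as columns has full rank.
M = np.column_stack([u1, u2])
print(np.linalg.matrix_rank(M))  # 2
```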

The case of a finite dimensional vector space:

V is now an n-dimensional vector space and a basis for V is given. To every endomorphism $\phi$ we associate a square matrix $\Phi$ of order n. The eigenvalues of $\phi$ are the roots of the characteristic polynomial of $\Phi$; they will be called the eigenvalues of the matrix $\Phi$. In the same way, the eigenvectors of $\phi$ will also be called the eigenvectors of $\Phi$.

Note that the previous results are independent of the choice of the basis (see 5.2).
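The finite-dimensional recipe above (eigenvalues = roots of the characteristic polynomial of $\Phi$) can be sketched in NumPy; this is an illustration using the matrix of Example 11.4.3, not part of the original notes:

```python
import numpy as np

# Matrix of an endomorphism of R^2 w.r.t. the chosen basis.
Phi = np.array([[3.0, 4.0],
                [4.0, -3.0]])

# Coefficients of the characteristic polynomial det(lambda*I - Phi),
# highest degree first: here lambda^2 - 25.
coeffs = np.poly(Phi)
print(coeffs)  # approximately [1, 0, -25]

# Its roots are the eigenvalues of Phi.
print(sorted(np.roots(coeffs)))  # approximately [-5, 5]
```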


Noah Dana-Picard
2001-02-26