In linear algebra, the eigenvectors (from the German eigen, meaning "own") of a linear operator are the nonzero vectors which, when the operator is applied to them, yield a scalar multiple of themselves. The scalar is then called the eigenvalue associated with the eigenvector. The set of all the eigenvalues is called the spectrum of the matrix or operator.
In applied mathematics and physics the eigenvectors of a matrix or a differential operator often have important physical significance. In classical mechanics the eigenvectors of the governing equations typically correspond to natural modes of vibration in a body, and the eigenvalues to their frequencies. In quantum mechanics, operators correspond to observable variables; eigenvectors are also called eigenstates, and the eigenvalues of an operator represent those values of the corresponding variable that have nonzero probability of occurring.
Examples

Intuitively, for linear transformations of two-dimensional space R^{2}, eigenvectors are the vectors whose direction the transformation preserves or exactly reverses: such a vector is merely stretched, shrunk or flipped, and the scaling factor is its eigenvalue.
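A standard illustration (one of many possible examples) is the reflection of the plane about the horizontal axis: horizontal vectors are fixed and vertical vectors are reversed, since

<math>\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 1 \cdot \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = -1 \cdot \begin{bmatrix} 0 \\ 1 \end{bmatrix}.</math>

Thus (1, 0) is an eigenvector with eigenvalue 1, and (0, 1) is an eigenvector with eigenvalue −1. A rotation of the plane by an angle other than 0 or 180 degrees, by contrast, has no real eigenvectors at all.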
Definition

Formally, we define eigenvectors and eigenvalues as follows: if A : V → V is a linear operator on some vector space V, v is a nonzero vector in V and c is a scalar (possibly zero) such that

<math>A v = c v,</math>
then we say that v is an eigenvector of the operator A, and its associated eigenvalue is <math>c</math>. Note that if v is an eigenvector with eigenvalue <math>c</math>, then any nonzero multiple of v is also an eigenvector with eigenvalue <math>c</math>. In fact, all the eigenvectors with associated eigenvalue <math>c</math>, together with 0, form a subspace of V, the eigenspace for the eigenvalue <math>c</math>.
For example, consider the matrix

<math>A = \begin{bmatrix} 0 & 1 & -1 \\ 1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix},</math>
which represents a linear operator R^{3} → R^{3}. One can check that

<math>A \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 2 \\ 2 \\ -2 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix},</math>

so (1, 1, −1) is an eigenvector of A with eigenvalue 2.
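This check is easy to reproduce numerically; the following short Python sketch (using NumPy, which the article itself does not assume) verifies it:

 import numpy as np

 # The example matrix from the text.
 A = np.array([[ 0, 1, -1],
               [ 1, 1,  0],
               [-1, 0,  1]])
 v = np.array([1, 1, -1])

 # A v should equal 2 v, confirming v is an eigenvector with eigenvalue 2.
 print(A @ v)                       # [ 2  2 -2]
 print(np.allclose(A @ v, 2 * v))   # True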
Returning to the example above, if we wanted to compute all of A's eigenvalues, we could determine the characteristic polynomial first:
<math>p(x) = \det(A - xI) = \det \begin{bmatrix} -x & 1 & -1 \\ 1 & 1-x & 0 \\ -1 & 0 & 1-x \end{bmatrix} = -x^3 + 2x^2 + x - 2</math>

and because <math>p(x) = -(x - 2)(x - 1)(x + 1)</math> we see that the eigenvalues of A are 2, 1 and −1.
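The same computation can be done numerically (same NumPy setup as the sketch above):

 import numpy as np

 A = np.array([[ 0, 1, -1],
               [ 1, 1,  0],
               [-1, 0,  1]])

 # Coefficients of det(xI - A) = x^3 - 2x^2 - x + 2, highest power first.
 print(np.poly(A))               # [ 1. -2. -1.  2.]

 # Its roots are the eigenvalues 2, 1 and -1.
 print(np.roots(np.poly(A)))     # [ 2. -1.  1.] (in some order)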
(In practice, eigenvalues of large matrices are not computed using the characteristic polynomial. Faster and more numerically stable methods are available, for instance methods based on the QR decomposition.)
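To make the idea concrete, here is a minimal sketch of the unshifted QR iteration, which repeatedly factors the matrix and remultiplies the factors in reverse order; production algorithms add shifts and deflation, so this is an illustration rather than a practical implementation:

 import numpy as np

 def qr_eigenvalues(A, iterations=100):
     """Approximate eigenvalues by the bare QR iteration A_{k+1} = R_k Q_k."""
     Ak = np.array(A, dtype=float)
     for _ in range(iterations):
         Q, R = np.linalg.qr(Ak)
         Ak = R @ Q        # R Q = Q^T A_k Q is similar to A_k: same eigenvalues.
     # For suitable matrices (e.g. symmetric with eigenvalues of distinct
     # absolute value), A_k tends to a diagonal matrix of the eigenvalues.
     return np.diag(Ak)

 print(qr_eigenvalues([[2, 1], [1, 2]]))   # approximately [3. 1.]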
Note that if A is a real matrix, the characteristic polynomial will have real coefficients, but not all of its roots will necessarily be real. The complex eigenvalues will all be associated with complex eigenvectors.
In general, if v_{1}, ..., v_{m} are eigenvectors corresponding to distinct eigenvalues λ_{1}, ..., λ_{m}, then the vectors v_{1}, ..., v_{m} are necessarily linearly independent.
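The case m = 2 already shows the idea behind the proof. Suppose <math>a v_1 + b v_2 = 0</math> with <math>\lambda_1 \neq \lambda_2</math>. Applying A to both sides and then subtracting <math>\lambda_2</math> times the original relation gives

<math>a (\lambda_1 - \lambda_2) v_1 = 0,</math>

so a = 0 (since v_{1} ≠ 0 and the eigenvalues differ), and then b = 0 as well. The general case follows by induction on m.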
The spectral theorem for symmetric matrices states that if A is a real symmetric n-by-n matrix, then all its eigenvalues are real, and there exist n linearly independent eigenvectors of A which all have length 1 and are mutually orthogonal.
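NumPy's routine for symmetric (Hermitian) matrices illustrates this guarantee; continuing with the example matrix, it returns real eigenvalues and an orthonormal set of eigenvectors:

 import numpy as np

 A = np.array([[ 0, 1, -1],
               [ 1, 1,  0],
               [-1, 0,  1]])

 # eigh is specialized to symmetric matrices: real eigenvalues in ascending
 # order, with the corresponding unit eigenvectors as the columns of Q.
 w, Q = np.linalg.eigh(A)
 print(w)                                  # [-1.  1.  2.]
 print(np.allclose(Q.T @ Q, np.eye(3)))    # True: columns are orthonormal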
Our example matrix from above is symmetric, and three mutually orthogonal eigenvectors of A are

<math>v_1 = \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}, \qquad v_3 = \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix},</math>

with eigenvalues 2, 1 and −1 respectively. (Dividing each by its length yields the orthonormal eigenvectors promised by the spectral theorem.)
These three vectors form a basis of R^{3}. With respect to this basis, the linear map represented by A takes a particularly simple form: every vector x in R^{3} can be written uniquely as

<math>x = a v_1 + b v_2 + c v_3,</math>

and then

<math>A x = 2 a v_1 + b v_2 - c v_3.</math>
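A short numerical sketch of this change of basis (same NumPy assumptions as above; eigh returns the normalized version of such an eigenbasis):

 import numpy as np

 A = np.array([[ 0, 1, -1],
               [ 1, 1,  0],
               [-1, 0,  1]])
 w, Q = np.linalg.eigh(A)        # eigenvalues w, orthonormal eigenbasis Q

 x = np.array([1.0, 2.0, 3.0])   # an arbitrary vector
 coeffs = Q.T @ x                # coordinates of x in the eigenbasis

 # In the eigenbasis, applying A just scales each coordinate by its eigenvalue.
 print(np.allclose(A @ x, Q @ (w * coeffs)))   # True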