Eigenvalues and eigenvectors are crucial concepts in linear algebra. They help us understand how matrices transform vectors, revealing special directions where the transformation acts like simple scaling.
These concepts are key to solving many problems in science and engineering. From quantum mechanics to data compression, eigenvalues and eigenvectors provide powerful tools for analyzing complex systems and simplifying calculations.
Eigenvalues and Eigenvectors
Definition and Characteristics
- An eigenvector of a square matrix A is a non-zero vector v such that the matrix A multiplied by v equals the scalar $\lambda$ multiplied by v ($Av = \lambda v$)
- The scalar $\lambda$ is the eigenvalue corresponding to the eigenvector v
- The eigenvalue equation $Av = \lambda v$ can be rewritten as $(A - \lambda I)v = 0$, where I is the identity matrix
- A non-zero solution v to the equation $(A - \lambda I)v = 0$ exists if and only if the determinant of $(A - \lambda I)$ equals zero ($\det(A - \lambda I) = 0$)
- The equation $\det(A - \lambda I) = 0$ is the characteristic equation of the matrix A, and the left-hand side is the characteristic polynomial
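As a quick sanity check of these definitions, the following NumPy sketch verifies both the eigenvalue equation and the determinant criterion for a small diagonal matrix (a minimal illustration; any eigenpair would do):

```python
import numpy as np

# Diagonal matrix: the standard basis vectors are eigenvectors,
# with eigenvalues 2 and 3 read straight off the diagonal.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])   # candidate eigenvector
lam = 2.0                  # candidate eigenvalue

# The defining equation Av = lambda * v holds.
print(np.allclose(A @ v, lam * v))                          # True

# The characteristic-equation criterion det(A - lambda*I) = 0 holds too.
print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
```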
Eigenspaces and Multiplicities
- The set of all eigenvectors corresponding to an eigenvalue $\lambda$, together with the zero vector, forms a subspace called the eigenspace of $\lambda$
- For example, the eigenspace of $\lambda = 2$ for the matrix $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$ is the subspace spanned by the vector $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$
- The dimension of the eigenspace is the geometric multiplicity of the eigenvalue
- The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial
- For instance, if the characteristic polynomial is $(\lambda - 2)^2(\lambda - 3)$, the eigenvalue 2 has an algebraic multiplicity of 2, while the eigenvalue 3 has an algebraic multiplicity of 1
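The two multiplicities can differ. As a sketch, the snippet below builds one hypothetical matrix with the characteristic polynomial above (many matrices share it) and computes the geometric multiplicity of $\lambda = 2$ via the rank–nullity relation:

```python
import numpy as np

# One matrix whose characteristic polynomial is (lambda - 2)^2 (lambda - 3):
# eigenvalue 2 has algebraic multiplicity 2, eigenvalue 3 has multiplicity 1.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

lam = 2.0
n = A.shape[0]
# Geometric multiplicity = dim(eigenspace) = n - rank(A - lambda*I).
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geo_mult)  # 1, strictly less than the algebraic multiplicity of 2
```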
Computing Eigenvalues and Eigenvectors
Finding Eigenvalues
- To find the eigenvalues of a square matrix A, solve the characteristic equation $\det(A - \lambda I) = 0$ for $\lambda$
- For the matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$, the characteristic equation is $\det \begin{pmatrix} 1-\lambda & 2 \\ 2 & 1-\lambda \end{pmatrix} = (1-\lambda)^2 - 4 = \lambda^2 - 2\lambda - 3 = 0$, which gives the eigenvalues $\lambda = 3$ and $\lambda = -1$ (confirmed numerically after this list)
- The roots of the characteristic polynomial are the eigenvalues of the matrix A
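To double-check the hand computation above, a short NumPy sketch (assuming only `numpy` is available) can solve the same characteristic equation numerically:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# eigvals solves the characteristic equation det(A - lambda*I) = 0.
print(np.sort(np.linalg.eigvals(A)))  # [-1.  3.]

# Cross-check: roots of the characteristic polynomial
# lambda^2 - 2*lambda - 3, given by its coefficients.
print(np.roots([1.0, -2.0, -3.0]))    # [ 3. -1.]
```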
Finding Eigenvectors
- For each eigenvalue $\lambda$, find the corresponding eigenvectors by solving the equation $(A - \lambda I)v = 0$ for non-zero vectors v
- For the matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$ and the eigenvalue $\lambda = -1$, solve $\begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix}v = 0$, which gives the eigenvector $v = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ (up to scalar multiplication)
- The solutions to the equation $(A - \lambda I)v = 0$ form the eigenspace corresponding to the eigenvalue $\lambda$
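The same eigenpairs can be read off `numpy.linalg.eig`; this sketch verifies that each returned pair satisfies $(A - \lambda I)v = 0$ up to floating-point rounding:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Columns of `vecs` are normalized eigenvectors, matched by index
# to the eigenvalues in `vals`.
vals, vecs = np.linalg.eig(A)
for lam, v in zip(vals, vecs.T):
    residual = (A - lam * np.eye(2)) @ v
    print(lam, np.allclose(residual, 0.0))  # prints True for each eigenvalue
```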
Geometric Interpretation of Eigenvalues and Eigenvectors
Scaling and Stretching
- Geometrically, an eigenvector v of a matrix A represents a direction in which the linear transformation defined by A acts by scaling
- The corresponding eigenvalue $\lambda$ is the scaling factor in the direction of the eigenvector
- If $\lambda$ is positive, the eigenvector is scaled by a factor of $\lambda$: stretched when $\lambda > 1$ (e.g., if $\lambda = 2$, it is doubled in length) and compressed when $0 < \lambda < 1$
- If $\lambda$ is negative, the eigenvector is flipped and scaled by a factor of $|\lambda|$ (e.g., if $\lambda = -3$, it is flipped and tripled in length); both cases are demonstrated below
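A diagonal matrix makes the scaling picture concrete; in this small sketch the coordinate axes themselves are the eigenvector directions:

```python
import numpy as np

# diag(2, -3): the x-axis is an eigenvector with lambda = 2,
# the y-axis an eigenvector with lambda = -3.
A = np.diag([2.0, -3.0])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

print(A @ e1)  # [2. 0.]   -> doubled in length
print(A @ e2)  # [ 0. -3.] -> flipped and tripled in length
```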
Special Cases
- If $\lambda = 1$, the eigenvector remains unchanged under the linear transformation
- If $\lambda = 0$, the eigenvector is mapped to the zero vector, and the linear transformation is not invertible
- In this case, the matrix A is singular, and the eigenvector lies in the nullspace of A (see the sketch below)
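A rank-deficient matrix illustrates the $\lambda = 0$ case; this sketch uses a hypothetical rank-1 matrix whose nullspace is easy to write down:

```python
import numpy as np

# Rank-1 matrix: the second row is twice the first, so A is singular
# and 0 is an eigenvalue.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.isclose(np.linalg.det(A), 0.0))  # True: A is not invertible

# v = (2, -1) spans the nullspace: Av = 0 = 0 * v.
v = np.array([2.0, -1.0])
print(A @ v)  # [0. 0.]
```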
Special Eigenvalues and Eigenvectors
Dominant Eigenvalues and Eigenvectors
- A dominant eigenvalue is an eigenvalue with the largest absolute value among all the eigenvalues of a matrix
- The eigenvector corresponding to the dominant eigenvalue is the dominant eigenvector
- In applications such as Markov chains and power iteration, the dominant eigenvector plays a crucial role in determining the long-term behavior of a system
- For example, in a Markov chain representing a random walk on a graph, the stationary distribution of the walk is the dominant (left) eigenvector of the transition matrix, with eigenvalue 1; a power-iteration sketch follows this list
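Power iteration finds the dominant eigenpair by repeatedly applying the matrix and renormalizing. Here is a minimal sketch (the function name and iteration count are illustrative choices, not a library API):

```python
import numpy as np

def power_iteration(A, num_iters=100):
    """Estimate the dominant eigenpair of A by repeated
    multiplication and renormalization."""
    v = np.random.default_rng(0).normal(size=A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v /= np.linalg.norm(v)
    # Rayleigh quotient recovers the eigenvalue estimate.
    return v @ A @ v, v

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, v = power_iteration(A)
print(lam)  # ~3.0, the dominant eigenvalue
print(v)    # ~(1, 1)/sqrt(2) up to sign, the dominant eigenvector
```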
Repeated Eigenvalues and Defective Matrices
- Repeated eigenvalues occur when an eigenvalue has algebraic multiplicity greater than one
- If the geometric multiplicity of a repeated eigenvalue is less than its algebraic multiplicity, the matrix is defective and does not have a full set of linearly independent eigenvectors
- For instance, the matrix $A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$ has a repeated eigenvalue $\lambda = 2$ with algebraic multiplicity 2, but only one linearly independent eigenvector $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$
- Matrices with repeated eigenvalues may require generalized eigenvectors to form a complete basis for the vector space
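For the defective matrix above, a short sketch can confirm the deficient eigenspace and produce a generalized eigenvector (the least-squares solve here is one convenient trick, not the only method):

```python
import numpy as np

# Jordan block: eigenvalue 2 repeated, but the eigenspace is only 1-D.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam, n = 2.0, A.shape[0]

# Geometric multiplicity = n - rank(A - lambda*I) = 1 < 2, so A is defective.
print(n - np.linalg.matrix_rank(A - lam * np.eye(n)))  # 1

# A generalized eigenvector w satisfies (A - lambda*I) w = v,
# where v = (1, 0) is the ordinary eigenvector.
v = np.array([1.0, 0.0])
w = np.linalg.lstsq(A - lam * np.eye(n), v, rcond=None)[0]
print(np.allclose((A - lam * np.eye(n)) @ w, v))  # True; w = (0, 1)
```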