7.2 Diagonalization Process and Spectral Decomposition

Written by the Fiveable Content Team • Last updated September 2025
Diagonalization and spectral decomposition are powerful tools for understanding matrix transformations. They break down complex matrices into simpler components, revealing their fundamental structure and behavior.

These techniques are crucial for solving various problems in linear algebra and beyond. By expressing matrices in terms of eigenvalues and eigenvectors, we gain insights into their properties and can simplify calculations in many applications.

Diagonalizing Matrices with Eigenvectors

The Diagonalization Process

  • Diagonalization finds a diagonal matrix $D$ similar to a given square matrix $A$, such that $A = PDP^{-1}$, where $P$ is an invertible matrix
  • A matrix $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors, where $n$ is the size of the matrix
    • Example: A 3x3 matrix is diagonalizable if it has 3 linearly independent eigenvectors
  • To diagonalize a matrix $A$, first find its eigenvalues by solving the characteristic equation $\det(A - \lambda I) = 0$, where $\lambda$ represents the eigenvalues and $I$ is the identity matrix
    • Example: For a 2x2 matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, the characteristic equation is $\det(A - \lambda I) = (1 - \lambda)(4 - \lambda) - 6 = 0$ (a quick numerical check appears below)
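
A minimal NumPy sketch of this step, using the 2x2 example above; for a 2x2 matrix the characteristic polynomial is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$:

```python
import numpy as np

# The 2x2 example matrix from above.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# For a 2x2 matrix, det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))        # roots of the characteristic polynomial

# The same eigenvalues straight from NumPy's general eigensolver.
print(np.linalg.eig(A)[0])
```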

Finding Eigenvectors and Eigenspaces

  • For each distinct eigenvalue $\lambda_i$, find the corresponding eigenvectors by solving the equation $(A - \lambda_i I)v = 0$, where $v$ represents the eigenvectors
    • Example: If $\lambda_1 = 2$ is an eigenvalue of matrix $A$, solve $(A - 2I)v = 0$ to find the eigenvectors associated with $\lambda_1$
  • The eigenvectors corresponding to each distinct eigenvalue form a basis for the eigenspace associated with that eigenvalue
    • The eigenspace is the set of all vectors $v$ that satisfy $(A - \lambda_i I)v = 0$ for a given eigenvalue $\lambda_i$
    • The dimension of the eigenspace is equal to the geometric multiplicity of the corresponding eigenvalue (the sketch below computes such a basis numerically)
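
In practice an eigenspace can be computed as the null space of $A - \lambda I$. A minimal NumPy sketch, using the SVD to extract that null space; the helper eigenspace_basis and the diagonal test matrix are our own illustrations, not from the text:

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Illustrative helper: orthonormal basis for the eigenspace of lam,
    i.e. the null space of (A - lam*I), extracted from the SVD."""
    M = A - lam * np.eye(A.shape[0])
    _, s, vh = np.linalg.svd(M)
    # Right-singular vectors with (numerically) zero singular values
    # span the null space of M.
    return vh[s <= tol].T          # columns form the basis

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
V2 = eigenspace_basis(A, 2.0)
print(V2)                          # a basis for the eigenspace of lambda = 2
print(V2.shape[1])                 # its dimension = geometric multiplicity
```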

Diagonal and Invertible Matrices for Diagonalization

Constructing the Diagonal Matrix

  • The diagonal matrix $D$ is constructed by placing the eigenvalues of $A$ along the main diagonal in any order, with each eigenvalue appearing as many times as its algebraic multiplicity
    • Example: If the eigenvalues of $A$ are $\lambda_1 = 2$ with multiplicity 2 and $\lambda_2 = 3$ with multiplicity 1, then $D = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix}$ (built numerically below)
  • The size of the diagonal matrix $D$ is the same as the size of the original matrix $A$
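
In NumPy this construction is a single call to np.diag; a minimal sketch with the eigenvalues from the example above:

```python
import numpy as np

# Eigenvalues listed with their algebraic multiplicities:
# lambda_1 = 2 (multiplicity 2), lambda_2 = 3 (multiplicity 1).
D = np.diag([2.0, 2.0, 3.0])
print(D)
```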

Constructing the Invertible Matrix

  • The invertible matrix $P$, also known as the modal matrix, is constructed by arranging the linearly independent eigenvectors of $A$ as its columns, in the same order as their corresponding eigenvalues in $D$
    • Example: If the eigenvectors of $A$ are $v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, then $P = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$
  • The columns of $P$ must be linearly independent for $P$ to be invertible. If $A$ has repeated eigenvalues, ensure that the corresponding eigenvectors are linearly independent
  • The matrix $P^{-1}$ is the inverse of $P$, and it can be found using various methods such as Gaussian elimination or the adjugate matrix divided by the determinant of $P$
    • Example: If $P = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, then $P^{-1} = \frac{1}{\det(P)} \begin{pmatrix} 4 & -2 \\ -3 & 1 \end{pmatrix} = \begin{pmatrix} -2 & 1 \\ \frac{3}{2} & -\frac{1}{2} \end{pmatrix}$ (the sketch below assembles $P$ and verifies $A = PDP^{-1}$ numerically)
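
A minimal NumPy sketch that assembles $P$ and $D$ for the 2x2 example $A$ from earlier and checks the factorization; np.linalg.eig already returns the eigenvectors as columns, which is exactly the modal matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding eigenvectors -- that column matrix is the modal matrix P.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
P_inv = np.linalg.inv(P)       # requires linearly independent columns

print(np.allclose(A, P @ D @ P_inv))   # True: A = P D P^{-1}
```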

Spectral Decomposition of Matrices

The Spectral Decomposition Theorem

  • The spectral decomposition theorem states that if $A$ is an $n \times n$ symmetric matrix with eigenvalues $\lambda_1, \lambda_2, ..., \lambda_n$ (listed with multiplicity, not necessarily distinct) and corresponding orthonormal eigenvectors $v_1, v_2, ..., v_n$, then $A$ can be expressed as $A = \lambda_1 v_1 v_1^T + \lambda_2 v_2 v_2^T + ... + \lambda_n v_n v_n^T$
    • Example: If $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$ with eigenvalues $\lambda_1 = 2$ and $\lambda_2 = 3$, and orthonormal eigenvectors $v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, then $A = 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \end{pmatrix} + 3 \begin{pmatrix} 0 \\ 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \end{pmatrix}$ (reconstructed numerically in the sketch after this list)
  • Each term in the spectral decomposition, $ฮป_i v_i v_i^T$, is a rank-one matrix, as it is the outer product of a column vector ($v_i$) with its transpose ($v_i^T$)
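
A minimal NumPy sketch of the rank-one expansion; the symmetric matrix below is an illustrative choice, not one from the text:

```python
import numpy as np

# An illustrative symmetric matrix (not from the text).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is NumPy's symmetric eigensolver: eigenvalues in ascending order,
# eigenvectors orthonormal.
eigvals, V = np.linalg.eigh(A)

# Rebuild A as the sum of the rank-one terms lambda_i * v_i v_i^T.
A_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, V.T))
print(np.allclose(A, A_rebuilt))                           # True
print(np.linalg.matrix_rank(np.outer(V[:, 0], V[:, 0])))   # 1: each term is rank one
```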

Matrix Form of the Spectral Decomposition

  • The spectral decomposition can be written in matrix form as $A = PDP^T$, where $P$ is an orthogonal matrix whose columns are the orthonormal eigenvectors of $A$, and $D$ is a diagonal matrix with the eigenvalues of $A$ on its main diagonal
    • Example: If $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$, $P = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$, and $D = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$, then $A = PDP^T = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$
  • If $A$ is not symmetric, the spectral decomposition theorem does not apply directly. However, a similar decomposition can be obtained using the singular value decomposition (SVD)
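
NumPy's symmetric eigensolver np.linalg.eigh produces exactly this factorization; a minimal sketch (again with an illustrative symmetric matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric, so the theorem applies

eigvals, P = np.linalg.eigh(A)   # columns of P are orthonormal eigenvectors
D = np.diag(eigvals)

print(np.allclose(P.T @ P, np.eye(2)))   # True: P is orthogonal, so P^{-1} = P^T
print(np.allclose(A, P @ D @ P.T))       # True: A = P D P^T
```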

Geometric and Algebraic Interpretation of Spectral Decomposition

Geometric Interpretation

  • Geometrically, the factorization $A = PDP^T$ decomposes the action of $A$ into three steps: a change of basis into eigenvector coordinates ($P^T$), a scaling along each eigenvector axis by the corresponding eigenvalue ($D$), and a change back to the standard basis ($P$); the sketch after this list verifies the scaling behavior numerically
    • Example: If $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$, the standard basis vectors $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ are themselves eigenvectors, so $A$ simply scales them by the eigenvalues 2 and 3, respectively
  • The eigenvectors represent the principal directions or axes of the transformation, while the eigenvalues represent the scaling factors along these axes
    • Example: In a 2D transformation, the eigenvectors may represent the directions of stretching or compression, while the eigenvalues indicate the amount of stretching or compression along those directions
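
A minimal NumPy sketch confirming that $A$ acts as pure scaling along its eigenvector directions; the symmetric matrix is again an illustrative choice:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # illustrative symmetric matrix

eigvals, P = np.linalg.eigh(A)

# Along each eigenvector (principal axis), A acts as pure scaling
# by the corresponding eigenvalue: A v_i = lambda_i v_i.
for lam, v in zip(eigvals, P.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```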

Algebraic Interpretation

  • Algebraically, the spectral decomposition expresses a matrix as a linear combination of rank-one matrices, each of which represents a specific contribution to the overall transformation
    • Example: In the spectral decomposition $A = \lambda_1 v_1 v_1^T + \lambda_2 v_2 v_2^T$, each term $\lambda_i v_i v_i^T$ represents a rank-one matrix contributing to the transformation described by $A$
  • The magnitude of each eigenvalue indicates the significance of its corresponding eigenvector in the transformation. Larger eigenvalues have a more significant impact on the transformation than smaller eigenvalues
    • Example: If $\lambda_1 = 10$ and $\lambda_2 = 0.1$, the transformation described by $A$ is primarily determined by the eigenvector corresponding to $\lambda_1$, as it has a much larger scaling factor
  • The spectral decomposition provides insight into the underlying structure of a matrix and can be used to analyze properties such as matrix powers, exponentials, and functions of matrices
    • Example: The matrix exponential $e^A$ can be computed directly from the spectral decomposition as $e^A = P e^D P^T$, where $e^D$ is a diagonal matrix with the exponentials of the eigenvalues on its main diagonal (see the sketch below)
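
A minimal sketch of this computation, assuming SciPy is available so scipy.linalg.expm can serve as the reference; the symmetric matrix is illustrative:

```python
import numpy as np
from scipy.linalg import expm      # reference implementation to check against

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])         # illustrative symmetric matrix

eigvals, P = np.linalg.eigh(A)

# e^A = P e^D P^T: only the diagonal eigenvalues need exponentiating.
expA = P @ np.diag(np.exp(eigvals)) @ P.T

print(np.allclose(expA, expm(A)))  # True
```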