Fiveable
Abstract Linear Algebra I Unit 10 Review

10.2 Self-Adjoint Operators and Hermitian Matrices

Written by the Fiveable Content Team • Last updated September 2025

Self-adjoint operators are linear operators equal to their adjoint, with real eigenvalues and orthogonal eigenspaces. They're crucial in quantum mechanics and data analysis. Their properties make them ideal for representing physical observables and analyzing complex datasets.

Hermitian matrices are the matrix representation of self-adjoint operators in finite-dimensional spaces. They share similar properties, including real eigenvalues and orthogonal eigenvectors. The spectral theorem allows for diagonalization, enabling efficient computation of matrix functions and applications in various fields.

Self-adjoint operators in inner product spaces

Definition and properties

  • A self-adjoint operator is a linear operator equal to its adjoint operator
    • For a linear operator $T$ on an inner product space $V$, $T$ is self-adjoint if $\langle Tx, y \rangle = \langle x, Ty \rangle$ for all $x, y \in V$
  • Self-adjoint operators are bounded (an everywhere-defined self-adjoint operator on a Hilbert space is automatically bounded, by the Hellinger-Toeplitz theorem), and all of their eigenvalues are real
    • The eigenspaces corresponding to distinct eigenvalues are orthogonal
  • If $T$ is a self-adjoint operator on a finite-dimensional inner product space $V$, there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$
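The defining identity $\langle Tx, y \rangle = \langle x, Ty \rangle$ can be checked numerically once the operator is written as a matrix. A minimal NumPy sketch (the matrix, vectors, and random seed are illustrative; `np.vdot` conjugates its first argument, which matches the inner product $\langle u, v \rangle = \sum_i u_i \overline{v_i}$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian matrix A = (M + M*)/2, the matrix form of a
# self-adjoint operator on C^3 with the standard inner product.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# <Ax, y> and <x, Ay> with <u, v> = sum_i u_i * conj(v_i)
lhs = np.vdot(y, A @ x)   # np.vdot conjugates its first argument
rhs = np.vdot(A @ y, x)
print(np.isclose(lhs, rhs))
```

The two inner products agree to machine precision for any Hermitian `A`, and the check fails as soon as `A` is replaced by a non-Hermitian matrix.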

Algebraic properties

  • The set of self-adjoint operators on an inner product space forms a real vector space under the usual addition and scalar multiplication of operators
  • The composition of two self-adjoint operators is self-adjoint if and only if the operators commute
    • For self-adjoint operators $S$ and $T$, $ST = TS$ is a necessary and sufficient condition for $ST$ to be self-adjoint
  • The sum of two self-adjoint operators is always self-adjoint
    • If $S$ and $T$ are self-adjoint, then $S + T$ is also self-adjoint
  • Scalar multiples of self-adjoint operators are self-adjoint
    • If $T$ is self-adjoint and $c \in \mathbb{R}$, then $cT$ is also self-adjoint

Eigenvalues and eigenvectors of self-adjoint operators

Eigenvalue properties

  • Eigenvalues of a self-adjoint operator are always real
    • If $\lambda$ is an eigenvalue of a self-adjoint operator $T$, then $\lambda \in \mathbb{R}$
  • Eigenvectors corresponding to distinct eigenvalues of a self-adjoint operator are orthogonal
    • If $v_1$ and $v_2$ are eigenvectors of a self-adjoint operator $T$ with distinct eigenvalues $\lambda_1$ and $\lambda_2$, then $\langle v_1, v_2 \rangle = 0$
  • The algebraic and geometric multiplicities of each eigenvalue of a self-adjoint operator are equal
    • For any eigenvalue $\lambda$ of a self-adjoint operator $T$, the dimension of the eigenspace corresponding to $\lambda$ equals the multiplicity of $\lambda$ as a root of the characteristic polynomial of $T$
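Both eigenvalue properties are easy to see numerically. A short sketch using `np.linalg.eigh`, NumPy's eigensolver for Hermitian matrices (the seed and matrix size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2          # Hermitian by construction

# np.linalg.eigh returns real eigenvalues (in ascending order) and an
# orthonormal set of eigenvectors stored as the columns of `vecs`.
vals, vecs = np.linalg.eigh(A)

print(np.all(np.isreal(vals)))                        # real spectrum
print(np.allclose(vecs.conj().T @ vecs, np.eye(4)))   # orthonormal columns
```

The orthonormality check `vecs* vecs = I` covers both claims at once: distinct-eigenvalue eigenvectors are orthogonal, and within each eigenspace the solver returns an orthonormal basis.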

Spectral properties

  • A self-adjoint operator on a finite-dimensional inner product space has a complete set of orthonormal eigenvectors that form a basis for the space
    • This set of eigenvectors is called an orthonormal eigenbasis
  • Any vector in the inner product space can be expressed as a linear combination of the orthonormal eigenvectors
    • For a vector $v$ in an inner product space $V$ with orthonormal eigenbasis $\{u_1, u_2, \ldots, u_n\}$, $v = \sum_{i=1}^n \langle v, u_i \rangle u_i$
  • The eigenvalues of a self-adjoint operator can be used to calculate the operator's trace and determinant
    • For a self-adjoint operator $T$ with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, $\text{tr}(T) = \sum_{i=1}^n \lambda_i$ and $\det(T) = \prod_{i=1}^n \lambda_i$
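All three spectral facts can be verified in a few lines of NumPy (a sketch with an arbitrary seed; the coefficient formula uses the convention $\langle v, u_i \rangle = \sum_j v_j \overline{(u_i)_j}$):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2

vals, U = np.linalg.eigh(A)        # columns of U: orthonormal eigenbasis

# Expand an arbitrary vector v in the eigenbasis: v = sum_i <v, u_i> u_i
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
coeffs = U.conj().T @ v            # coeffs[i] = <v, u_i>
v_rebuilt = U @ coeffs
assert np.allclose(v, v_rebuilt)

# Trace and determinant recovered from the spectrum
assert np.isclose(np.trace(A), vals.sum())
assert np.isclose(np.linalg.det(A), np.prod(vals))
```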

Self-adjoint operators vs Hermitian matrices

Hermitian matrices

  • A matrix $A$ is Hermitian if $A = A^*$, where $A^*$ denotes the conjugate transpose of $A$
    • Hermitian matrices are the matrix representation of self-adjoint operators on finite-dimensional inner product spaces
  • The eigenvalues of a Hermitian matrix are always real, and the eigenvectors corresponding to distinct eigenvalues are orthogonal
  • Every Hermitian matrix is unitarily diagonalizable
    • There exists a unitary matrix $U$ such that $U^*AU$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal

Algebraic properties

  • The set of Hermitian matrices forms a real vector space under the usual matrix addition and scalar multiplication
  • The product of two Hermitian matrices is Hermitian if and only if the matrices commute
    • For Hermitian matrices $A$ and $B$, $AB = BA$ is a necessary and sufficient condition for $AB$ to be Hermitian
  • The sum of two Hermitian matrices is always Hermitian
    • If $A$ and $B$ are Hermitian, then $A + B$ is also Hermitian
  • Scalar multiples of Hermitian matrices are Hermitian
    • If $A$ is Hermitian and $c \in \mathbb{R}$, then $cA$ is also Hermitian
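These algebraic rules can be tested directly. A NumPy sketch (the helper `is_hermitian` and the seeded test matrices are illustrative, not part of any standard API):

```python
import numpy as np

def is_hermitian(X, tol=1e-10):
    """Illustrative helper: check X equals its conjugate transpose."""
    return np.allclose(X, X.conj().T, atol=tol)

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2
N = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = (N + N.conj().T) / 2

print(is_hermitian(A + B))        # sums are always Hermitian
print(is_hermitian(2.5 * A))      # real scalar multiples too
print(is_hermitian(A @ B))        # generally False: random A, B rarely commute
print(is_hermitian(A @ A))        # A commutes with itself, so A^2 is Hermitian
```

The third check is where the "if and only if" bites: `A @ B` is Hermitian exactly when `A @ B == B @ A`, which almost never holds for randomly generated matrices.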

Spectral theorem for self-adjoint operators

Diagonalization of Hermitian matrices

  • The spectral theorem states that if $T$ is a self-adjoint operator on a finite-dimensional inner product space $V$, then there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$, and $T$ can be represented as a diagonal matrix with respect to this basis
  • To diagonalize a Hermitian matrix $A$, find an orthonormal basis of eigenvectors $\{v_1, v_2, \ldots, v_n\}$ and form a unitary matrix $U$ with these eigenvectors as columns
    • Then, $U^*AU = D$, where $D$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal
  • The spectral decomposition of a Hermitian matrix $A$ is given by $A = UDU^*$, where $U$ is a unitary matrix whose columns are eigenvectors of $A$, and $D$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal
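Both identities, the diagonalization $U^*AU = D$ and the decomposition $A = UDU^*$, follow the recipe above verbatim in NumPy (a sketch with an arbitrary seed):

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2

vals, U = np.linalg.eigh(A)        # U is unitary; its columns are eigenvectors
D = np.diag(vals)

# U* A U = D (diagonalization) and A = U D U* (spectral decomposition)
assert np.allclose(U.conj().T @ A @ U, D)
assert np.allclose(U @ D @ U.conj().T, A)
```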

Applications of the spectral theorem

  • The spectral theorem allows for the computation of matrix functions of Hermitian matrices
    • If $f$ is a function defined on the eigenvalues of a Hermitian matrix $A$, then $f(A) = Uf(D)U^*$, where $f(D)$ is the diagonal matrix obtained by applying $f$ to each diagonal entry of $D$
  • The spectral theorem is used in quantum mechanics to represent observables as self-adjoint operators and to calculate their expectation values and probabilities
    • The eigenvalues of the observable correspond to the possible measurement outcomes, and the eigenvectors represent the states in which the system is found after the measurement
  • The spectral theorem is also applied in signal processing and data analysis to perform principal component analysis (PCA) and singular value decomposition (SVD)
    • These techniques help in dimensionality reduction, feature extraction, and noise reduction by identifying the most significant eigenvectors and eigenvalues of the data covariance matrix
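Two of these applications fit in one short NumPy sketch: computing $f(A) = Uf(D)U^*$ for a Hermitian matrix, and a bare-bones PCA built on the eigendecomposition of a covariance matrix (the helper name `matrix_function`, the seed, and the synthetic 2-feature data are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Matrix functions via the spectral theorem -------------------------
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2                     # Hermitian test matrix

vals, U = np.linalg.eigh(A)

def matrix_function(f):
    """f(A) = U f(D) U*, applying f entrywise to the eigenvalues."""
    return U @ np.diag(f(vals)) @ U.conj().T

# Sanity check with f(t) = t^2: must agree with ordinary matrix squaring
assert np.allclose(matrix_function(lambda t: t**2), A @ A)
expA = matrix_function(np.exp)               # matrix exponential, same recipe

# --- PCA as an eigendecomposition of the covariance matrix -------------
# Synthetic 2-feature data, deliberately stretched along the first axis
X = rng.standard_normal((500, 2)) @ np.diag([3.0, 0.5])
X -= X.mean(axis=0)
C = (X.T @ X) / (len(X) - 1)                 # real symmetric covariance

evals, evecs = np.linalg.eigh(C)             # ascending eigenvalues
pc1 = evecs[:, -1]                           # top principal component
X_reduced = X @ pc1                          # 1-D projection of the data
print(evals[-1] / evals.sum())               # fraction of variance captured
```

Because the data were stretched along one axis, the top eigenvalue of the covariance matrix captures most of the variance, which is exactly why projecting onto its eigenvector gives a useful dimensionality reduction.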