Abstract Linear Algebra I Unit 8 Review

8.4 Orthogonal Matrices and Their Properties

Written by the Fiveable Content Team • Last updated September 2025

Orthogonal matrices are special square matrices with columns that form an orthonormal set. They're key players in linear algebra, popping up in rotations, reflections, and other transformations that keep distances and angles intact.

These matrices have cool properties like their transpose being their inverse. They're super useful in simplifying calculations, finding orthonormal bases, and breaking down matrices in important ways like QR decomposition and SVD.

Orthogonal Matrices

Definition and Properties

  • An orthogonal matrix is a square matrix $Q$ such that its transpose is equal to its inverse ($Q^T = Q^{-1}$)
  • The columns and rows of an orthogonal matrix form orthonormal sets
    • They are unit vectors (length 1) that are orthogonal (perpendicular) to each other
  • The determinant of an orthogonal matrix is either 1 or -1
  • The product of two orthogonal matrices is also an orthogonal matrix
  • Orthogonal matrices preserve the length of vectors and the angle between vectors under transformation
    • Rotations and reflections are examples of transformations that can be represented by orthogonal matrices
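The length- and angle-preservation property is easy to check numerically. Here's a small pure-Python sketch (not from the original text; the angle and test vectors are arbitrary choices) that rotates two vectors and confirms their lengths and dot product don't change:

```python
import math

def rot(theta):
    """2x2 rotation matrix as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(M, v):
    """Matrix-vector product for a 2x2 matrix."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

def norm(v):
    return math.hypot(v[0], v[1])

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

Q = rot(0.7)                     # arbitrary rotation angle
x, y = [3.0, 4.0], [-1.0, 2.0]   # arbitrary test vectors
Qx, Qy = apply(Q, x), apply(Q, y)

print(abs(norm(Qx) - norm(x)))       # ~0: lengths preserved
print(abs(dot(Qx, Qy) - dot(x, y)))  # ~0: dot products (hence angles) preserved
```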

Examples

  • The 2x2 rotation matrix $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ is orthogonal for any angle $\theta$
  • The 2x2 reflection matrix $\begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$ is orthogonal for any angle $\theta$ (it reflects across the line through the origin at angle $\theta/2$)
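The determinant distinguishes the two examples: rotations have determinant $+1$ and reflections have determinant $-1$. A quick check (pure Python, arbitrary angle, not part of the original guide):

```python
import math

def det2(M):
    """Determinant of a 2x2 matrix given as nested lists."""
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

theta = 1.1  # arbitrary angle
c, s = math.cos(theta), math.sin(theta)
rotation   = [[c, -s], [s,  c]]
reflection = [[c,  s], [s, -c]]

print(det2(rotation))    # ~ +1: rotations preserve orientation
print(det2(reflection))  # ~ -1: reflections reverse orientation
```

Both values follow from $\cos^2\theta + \sin^2\theta = 1$.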

Identifying Orthogonal Matrices

Checking Orthogonality

  • To check if a matrix $Q$ is orthogonal, multiply it by its transpose ($Q^T$) and see if the result is the identity matrix ($QQ^T = Q^TQ = I$)
    • If $QQ^T = Q^TQ = I$, then $Q$ is orthogonal
  • Verify that the columns and rows of the matrix form orthonormal sets
    • Check if they are unit vectors (length 1) and if they are orthogonal (dot product of distinct vectors is 0)
  • Calculate the determinant of the matrix and confirm that it is either 1 or -1
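The first check above ($Q^TQ = I$) translates directly into code. A minimal sketch using NumPy (the helper name `is_orthogonal` and the tolerance are my own choices):

```python
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Return True if Q is square and Q^T Q = I (within tol)."""
    Q = np.asarray(Q, dtype=float)
    if Q.shape[0] != Q.shape[1]:
        return False
    return np.allclose(Q.T @ Q, np.eye(Q.shape[0]), atol=tol)

Q = np.array([[1/np.sqrt(2), -1/np.sqrt(2)],
              [1/np.sqrt(2),  1/np.sqrt(2)]])
print(is_orthogonal(Q))                 # True
print(is_orthogonal([[1, 1], [0, 1]]))  # False: a shear is not orthogonal
```

Checking $Q^TQ = I$ alone suffices for square matrices; it already implies $QQ^T = I$ and that rows and columns are orthonormal.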

Examples

  • The matrix $\begin{bmatrix} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$ is orthogonal because $QQ^T = Q^TQ = I$, its columns and rows form orthonormal sets, and its determinant is 1
  • The matrix $\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$ is orthogonal because $QQ^T = Q^TQ = I$, its columns and rows form orthonormal sets, and its determinant is -1

Orthonormal Basis of Orthogonal Matrices

Proving Orthonormality

  • Let $Q$ be an orthogonal matrix with columns $q_1, q_2, ..., q_n$
  • Show that the columns are unit vectors by proving that the dot product of each column with itself is equal to 1 ($q_i \cdot q_i = 1$ for all $i$)
  • Prove that the columns are orthogonal to each other by showing that the dot product of any two distinct columns is equal to 0 ($q_i \cdot q_j = 0$ for all $i \neq j$)
  • Conclude that the columns of $Q$ form an orthonormal set, which is a basis for the vector space since $Q$ is a square matrix
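The two proof steps above correspond to checking $q_i \cdot q_i = 1$ and $q_i \cdot q_j = 0$ for $i \neq j$. A NumPy sketch on the example matrix from earlier (the tolerance is an arbitrary choice):

```python
import numpy as np

Q = np.array([[1/np.sqrt(2), -1/np.sqrt(2)],
              [1/np.sqrt(2),  1/np.sqrt(2)]])
n = Q.shape[1]

# Step 1: each column is a unit vector (q_i . q_i = 1).
unit = all(abs(Q[:, i] @ Q[:, i] - 1.0) < 1e-12 for i in range(n))

# Step 2: distinct columns are orthogonal (q_i . q_j = 0 for i != j).
ortho = all(abs(Q[:, i] @ Q[:, j]) < 1e-12
            for i in range(n) for j in range(n) if i != j)

print(unit, ortho)  # True True
```

Equivalently, both steps at once say that $Q^TQ$ (whose $(i,j)$ entry is $q_i \cdot q_j$) equals the identity.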

Examples

  • For the orthogonal matrix $\begin{bmatrix} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$, its columns $q_1 = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}$ and $q_2 = \begin{bmatrix} -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}$ form an orthonormal basis for $\mathbb{R}^2$
  • The standard basis vectors $e_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $e_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$ form an orthonormal basis for $\mathbb{R}^2$ and are the columns of the identity matrix $I_2$, which is itself orthogonal

Applications of Orthogonal Matrices

Transformations and Decompositions

  • Use orthogonal matrices to perform rotations, reflections, and other isometric transformations in Euclidean spaces
    • Isometric transformations preserve distances and angles between vectors
  • Apply orthogonal matrices to solve systems of linear equations efficiently: because $Q^{-1} = Q^T$, an orthogonal transformation is undone by a single matrix-vector product, with no matrix inversion required
  • Utilize orthogonal matrices in the QR decomposition (factorization) of a matrix
    • $A = QR$, where $Q$ is an orthogonal matrix and $R$ is an upper triangular matrix
  • Employ orthogonal matrices in the singular value decomposition (SVD) of a matrix
    • $A = U\Sigma V^T$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix of singular values
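Both decompositions are available in NumPy; a short sketch (the random test matrix and its size are arbitrary choices) verifying the orthogonality of the factors:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # arbitrary test matrix

# QR decomposition: A = QR, Q has orthonormal columns, R is upper triangular.
Q, R = np.linalg.qr(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns of Q are orthonormal
print(np.allclose(A, Q @ R))            # True: the factorization reproduces A

# SVD: A = U Sigma V^T, with U and V orthogonal.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(np.allclose(U @ U.T, np.eye(4)))    # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True: V is orthogonal
```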

Orthonormal Bases for Subspaces

  • Use orthogonal matrices to find orthonormal bases for subspaces, such as the column space and null space of a matrix
    • A matrix with orthonormal columns (such as the $Q$ factor of a QR decomposition) provides an orthonormal basis for the subspace its columns span
  • Orthonormal bases simplify calculations and provide a convenient representation for vectors in a subspace
  • The Gram-Schmidt process can be used to construct an orthonormal basis from a given set of linearly independent vectors
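A minimal sketch of the Gram-Schmidt process in NumPy (classical, unstabilized variant; the function name and input vectors are my own choices for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    via the classical Gram-Schmidt process."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        # Subtract the projection of w onto each earlier basis vector.
        for q in basis:
            w = w - (q @ w) * q
        # Normalize what remains to unit length.
        basis.append(w / np.linalg.norm(w))
    return basis

q1, q2 = gram_schmidt([[1.0, 1.0], [0.0, 1.0]])
print(q1)            # roughly [0.707, 0.707], i.e. (1/sqrt(2), 1/sqrt(2))
print(abs(q1 @ q2))  # ~0: the resulting vectors are orthogonal
```

Stacking the resulting vectors as columns yields a matrix with orthonormal columns, which is exactly how the $Q$ factor of a QR decomposition arises.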