Abstract Linear Algebra II Unit 4 Review

4.5 Orthogonal complements and projections

Written by the Fiveable Content Team • Last updated September 2025

Inner product spaces let us measure angles and distances between vectors. This section dives into orthogonal complements: the subspace of all vectors perpendicular to every vector in a given subspace. We'll see how these complements relate to the original space's structure.

We'll also explore orthogonal projections, which find the closest vector in a subspace to a given vector. This concept is crucial for solving least-squares problems and decomposing vectors into orthogonal components.

Orthogonal Complements and Properties

Definition and Basic Properties

  • Orthogonal complement of subspace W in inner product space V, denoted W⊥, contains all vectors in V orthogonal to every vector in W
  • For finite-dimensional inner product space V and subspace W, (W⊥)⊥ = W
  • Dimension relationship dim(W) + dim(W⊥) = dim(V) holds for finite-dimensional inner product spaces (illustrated in the sketch after this list)
  • Orthogonal complement of a subspace forms a subspace of the inner product space
  • Zero subspace {0} has orthogonal complement V, and V has orthogonal complement {0}
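
To make the dimension count concrete, here is a minimal NumPy sketch (not from this guide; the helper name orthogonal_complement is our own) that computes W⊥ for a subspace W of R^n given as the row space of a matrix B.

    import numpy as np

    def orthogonal_complement(B, tol=1e-10):
        # Rows of B span W inside R^n; the right-singular vectors of B with zero
        # singular value are orthogonal to every row, so they span W-perp.
        B = np.atleast_2d(B)
        _, s, Vt = np.linalg.svd(B)
        rank = int(np.sum(s > tol))
        return Vt[rank:].T                # columns: orthonormal basis of W-perp

    # W = span{(1, 0, 1), (0, 1, 1)} inside R^3
    B = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
    W_perp = orthogonal_complement(B)

    dim_W = np.linalg.matrix_rank(B)
    print(dim_W + W_perp.shape[1])        # 3: dim(W) + dim(W-perp) = dim(V)
    print(np.allclose(B @ W_perp, 0))     # True: each basis vector of W-perp is orthogonal to W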

Properties Involving Multiple Subspaces

  • For subspaces U and W in V, (U + W)⊥ = U⊥ ∩ W⊥
  • Intersection and sum relationship (U ∩ W)⊥ = U⊥ + W⊥ holds for subspaces U and W of finite-dimensional V
  • Linear transformation T : V → W between inner product spaces yields ker(T) = (range(T*))⊥ (checked numerically in the sketch after this list)
  • Adjoint T* of linear transformation T satisfies range(T*) = (ker(T))⊥
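
As a quick sanity check on the kernel identity above, the following sketch (illustrative only, assuming NumPy and a real matrix so that T* is just the transpose) verifies ker(T) = (range(T*))⊥ numerically.

    import numpy as np

    # T(x) = Ax for a real matrix A, so T* is the transpose A.T and
    # range(T*) is the row space of A
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])       # rank 1, so ker(A) is 2-dimensional

    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    kernel = Vt[rank:].T                  # orthonormal basis of ker(A), shape (3, 2)

    # Each kernel vector is orthogonal to every row of A, i.e. to all of range(A.T),
    # and the dimensions add up to dim(V) = 3, as ker(T) = (range(T*))-perp requires
    print(np.allclose(A @ kernel, 0))     # True
    print(rank + kernel.shape[1])         # 3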

Orthogonal Projections onto Subspaces

Computation and Properties

  • Orthogonal projection of vector v onto subspace W represents closest vector in W to v
  • Formula for orthogonal projection onto W with orthonormal basis {u₁, ..., uₖ}: proj_W(v) = \sum_{i=1}^{k} \langle v, u_i \rangle u_i
  • Projection matrix for orthogonal projection onto W: P = U(U^*U)^{-1}U^*, where the columns of U form a basis for W (see the sketch after this list)
  • Orthogonal projection P onto W satisfies P² = P (idempotent) and P = P* (self-adjoint)
  • Orthogonal projection onto W⊥ calculated as proj_{W⊥}(v) = v - proj_W(v)
  • Error vector e = v - proj_W(v) orthogonal to all vectors in W
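
Both projection formulas in the list above can be checked side by side. The sketch below (assumed example data, NumPy, real inner product) builds P = U(U^*U)^{-1}U^*, compares it with the orthonormal-basis sum, and confirms the idempotent, self-adjoint, and orthogonal-error properties.

    import numpy as np

    U = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 2.0]])            # columns form a basis of a 2-dimensional W in R^3
    v = np.array([3.0, -1.0, 4.0])

    # Projection matrix P = U (U*U)^{-1} U*  (for real vectors, * is the transpose)
    P = U @ np.linalg.inv(U.T @ U) @ U.T
    proj_v = P @ v

    # Same projection via an orthonormal basis {u_1, ..., u_k} of W obtained from QR:
    # proj_W(v) = sum_i <v, u_i> u_i
    Q, _ = np.linalg.qr(U)
    proj_v_onb = sum(np.dot(v, Q[:, i]) * Q[:, i] for i in range(Q.shape[1]))

    print(np.allclose(proj_v, proj_v_onb))   # True: the two formulas agree
    print(np.allclose(P @ P, P))             # idempotent: P^2 = P
    print(np.allclose(P, P.T))               # self-adjoint: P = P*
    e = v - proj_v                           # error vector
    print(np.allclose(U.T @ e, 0))           # e is orthogonal to every vector in W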

Orthogonal Decomposition

  • Vector v in V decomposes as v = proj_W(v) + proj_{W⊥}(v)
  • Decomposition represents v as sum of components in W and WโŠฅ
  • Orthogonal decomposition unique for each vector v in V (a basis-independence check appears after this list)
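
One way to see the uniqueness claim in action: projecting with two different bases of the same W must give the same components. A small sketch under the same NumPy assumptions as above:

    import numpy as np

    def proj(U, v):
        # Orthogonal projection of v onto the column space of U (columns form a basis)
        return U @ np.linalg.solve(U.T @ U, U.T @ v)

    v  = np.array([3.0, -1.0, 4.0])
    U1 = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])   # one basis of W
    U2 = U1 @ np.array([[2.0, 1.0], [0.0, 3.0]])          # a different basis of the same W

    w      = proj(U1, v)
    w_perp = v - w
    print(np.allclose(proj(U2, v), w))           # same W-component from either basis
    print(np.allclose(w + w_perp, v))            # v = proj_W(v) + proj_{W-perp}(v)
    print(np.isclose(np.dot(w, w_perp), 0.0))    # the two components are orthogonal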

Orthogonal Decomposition Theorem

Statement and Proof

  • Theorem states every vector v in V uniquely expressed as v = w + w⊥, with w in W and w⊥ in W⊥
  • Existence proof constructs w = proj_W(v) and w⊥ = v - proj_W(v)
  • Uniqueness proof assumes two decompositions v = w₁ + w₁⊥ = w₂ + w₂⊥ and shows w₁ = w₂ and w₁⊥ = w₂⊥ (the key step is written out after this list)
  • Proof utilizes inner product properties and orthogonal complement definition
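
The uniqueness half of the argument comes down to a single observation about W ∩ W⊥; written out (our phrasing, not quoted from the guide):

    \[
      w_1 - w_2 \;=\; w_2^{\perp} - w_1^{\perp} \;\in\; W \cap W^{\perp} = \{0\}
      \quad\Longrightarrow\quad w_1 = w_2 \ \text{and} \ w_1^{\perp} = w_2^{\perp},
    \]

and W ∩ W⊥ = {0} because any u lying in both satisfies ⟨u, u⟩ = 0, which forces u = 0.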

Implications and Applications

  • Theorem implies V = W ⊕ W⊥ (direct sum) for any subspace W of V
  • Unique decomposition of vectors into projections onto W and WโŠฅ
  • Fundamental theorem in understanding inner product space structure
  • Applications in linear algebra, functional analysis, and quantum mechanics (state vector decomposition)

Least-Squares Problems with Projections

Problem Formulation and Solution

  • Least-squares problem minimizes ||Ax - b||² for matrix A and vector b
  • Normal equations A^*A x̂ = A^*b characterize least-squares solution
  • Solution x̂ given by \hat{x} = (A^*A)^{-1}A^*b when A^*A invertible
  • Geometrically, solution finds closest vector Ax̂ in column space of A to b
  • Orthogonal projection of b onto column space of A: proj_{col(A)}(b) = A(A^*A)^{-1}A^*b (worked through in the sketch after this list)
  • Residual vector r = b - Ax̂ orthogonal to column space of A
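
The sketch below (assumed example data, NumPy, real matrices so * is the transpose) solves a small least-squares problem through the normal equations, compares it with np.linalg.lstsq, and checks the projection and residual statements from the list above.

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])            # linearly independent columns
    b = np.array([1.0, 2.0, 2.0, 4.0])

    # Normal equations A*A x_hat = A*b
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)

    # NumPy's built-in least-squares solver gives the same answer
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x_hat, x_ls))          # True

    # A x_hat equals the orthogonal projection of b onto col(A): A (A*A)^{-1} A* b
    proj_b = A @ np.linalg.solve(A.T @ A, A.T @ b)
    print(np.allclose(A @ x_hat, proj_b))    # True

    # Residual r = b - A x_hat is orthogonal to every column of A
    r = b - A @ x_hat
    print(np.allclose(A.T @ r, 0))           # True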

Applications and Significance

  • Data fitting applications (linear regression, polynomial fitting)
  • Model matrix A represents predictor variables, b represents observed data (see the regression sketch after this list)
  • Signal processing uses (noise reduction, signal approximation)
  • Parameter estimation in scientific and engineering fields (system identification)
  • Statistical analysis applications (ANOVA, multiple regression)
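
A hypothetical data-fitting illustration with synthetic data (again assuming NumPy): simple linear regression posed as a least-squares problem, where the model matrix A holds an intercept column and the predictor values.

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # predictor values
    y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])       # noisy observations of roughly y = 1 + x

    A = np.column_stack([np.ones_like(x), x])     # model matrix: intercept column + predictor
    (intercept, slope), *_ = np.linalg.lstsq(A, y, rcond=None)
    print(round(intercept, 2), round(slope, 2))   # roughly 1 and 1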