Inner product spaces let us measure angles and distances between vectors. This section covers orthogonal complements: for a given subspace, the set of all vectors orthogonal to it, which is itself a subspace. We'll see how these complements relate to the structure of the original space.
We'll also explore orthogonal projections, which find the closest vector in a subspace to a given vector. This concept is central to solving least-squares problems and to decomposing vectors into orthogonal components.
Orthogonal Complements and Properties
Definition and Basic Properties
- Orthogonal complement of subspace W in inner product space V, denoted W⊥, contains all vectors in V orthogonal to every vector in W
- For finite-dimensional inner product space V and subspace W, (W⊥)⊥ = W
- Dimension relationship dim(W) + dim(W⊥) = dim(V) holds for finite-dimensional inner product spaces (checked numerically in the sketch after this list)
- Orthogonal complement of a subspace forms a subspace of the inner product space
- Zero subspace {0} has orthogonal complement V, and V has orthogonal complement {0}
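A quick numerical check of these facts, as a minimal numpy sketch: the matrix A and the tolerance are arbitrary choices, W is taken as the column space of A, and W⊥ is computed as the null space of Aᵀ via the SVD.

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Orthonormal basis for the null space of A, via the SVD."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # rows of Vt beyond the rank span ker(A)

# W = column space of A, a subspace of R^5
A = np.array([[1., 0.], [2., 1.], [0., 3.], [1., 1.], [0., 0.]])
W_dim = np.linalg.matrix_rank(A)

# W-perp = all vectors orthogonal to every column of A = ker(A^T)
W_perp = null_space(A.T)

# dim(W) + dim(W-perp) = dim(V)
assert W_dim + W_perp.shape[1] == A.shape[0]

# (W-perp)-perp recovers W: ker(W_perp^T) spans the same subspace as col(A)
W_again = null_space(W_perp.T)
assert np.linalg.matrix_rank(np.hstack([A, W_again])) == W_dim
```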
Properties Involving Multiple Subspaces
- For subspaces U and W in V, (U + W)⊥ = U⊥ ∩ W⊥
- Intersection and sum relationship (U ∩ W)⊥ = U⊥ + W⊥ holds for subspaces U and W of finite-dimensional V
- Linear transformation T : V → W between inner product spaces yields ker(T*) = (range(T))⊥
- Adjoint T* of linear transformation T satisfies range(T*) = (ker(T))⊥ (both identities are verified numerically below)
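Over ℝ with the standard inner product, the adjoint of a matrix map is its transpose, so these last two identities become statements about the fundamental subspaces of a matrix. A minimal numpy sketch (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])  # a 3x4 map T: R^4 -> R^3

# ker(T*) = (range T)-perp: every vector in ker(A^T) is
# orthogonal to every column of A
_, s, Vt = np.linalg.svd(A.T)
ker_At = Vt[int(np.sum(s > 1e-10)):].T
assert np.allclose(A.T @ ker_At, 0)   # columns of A are perp to ker(A^T)

# range(T*) = (ker T)-perp: rows of A are orthogonal to ker(A)
_, s2, Vt2 = np.linalg.svd(A)
ker_A = Vt2[int(np.sum(s2 > 1e-10)):].T
assert np.allclose(A @ ker_A, 0)      # null-space vectors are killed by A
# dimension check: rank(A) + dim ker(A) = dim V
assert np.linalg.matrix_rank(A) + ker_A.shape[1] == A.shape[1]
```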
Orthogonal Projections onto Subspaces
Computation and Properties
- Orthogonal projection of vector v onto subspace W is the closest vector in W to v
- Formula for orthogonal projection onto W with orthonormal basis {u₁, ..., uₙ}: proj_W(v) = ⟨v, u₁⟩u₁ + ... + ⟨v, uₙ⟩uₙ
- Projection matrix P = U(UᵀU)⁻¹Uᵀ for orthogonal projection onto W, where the columns of U form a basis for W (P = UUᵀ when the basis is orthonormal)
- Orthogonal projection P onto W satisfies P² = P (idempotent) and P* = P (self-adjoint)
- Orthogonal projection onto W⊥ calculated as v - proj_W(v)
- Error vector e = v - proj_W(v) is orthogonal to all vectors in W (see the numpy sketch after this list)
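A minimal numpy sketch of these properties, assuming W is the column space of a full-column-rank matrix U (the specific basis and test vector are arbitrary):

```python
import numpy as np

# Basis for W as columns of U (not necessarily orthonormal)
U = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])

# Projection matrix P = U (U^T U)^{-1} U^T
P = U @ np.linalg.solve(U.T @ U, U.T)

assert np.allclose(P @ P, P)      # idempotent: P^2 = P
assert np.allclose(P, P.T)        # self-adjoint: P^T = P in the real case

v = np.array([2., 0., 1.])
proj = P @ v                      # closest vector in W to v
e = v - proj                      # error vector, lies in W-perp
assert np.allclose(U.T @ e, 0)    # e orthogonal to every basis vector of W

# Projection onto W-perp is the complementary projector I - P
P_perp = np.eye(3) - P
assert np.allclose(P_perp @ v, e)
```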
Orthogonal Decomposition
- Vector v in V decomposes as v = proj_W(v) + proj_{W⊥}(v)
- Decomposition represents v as the sum of components in W and W⊥
- Orthogonal decomposition is unique for each vector v in V (worked example after this list)
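A concrete worked example in R² with the standard inner product (values chosen for illustration):

```latex
\begin{align*}
v &= (3,4), \qquad W = \operatorname{span}\{(1,0)\}, \qquad W^{\perp} = \operatorname{span}\{(0,1)\},\\
\operatorname{proj}_W(v) &= \langle v,(1,0)\rangle\,(1,0) = (3,0),\\
\operatorname{proj}_{W^{\perp}}(v) &= v - \operatorname{proj}_W(v) = (0,4),\\
v &= (3,0) + (0,4), \qquad \langle (3,0),(0,4)\rangle = 0.
\end{align*}
```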
Orthogonal Decomposition Theorem
Statement and Proof
- Theorem states every vector v in V is uniquely expressed as v = w + w⊥, with w in W and w⊥ in W⊥
- Existence proof constructs w = proj_W(v) and w⊥ = v - proj_W(v)
- Uniqueness proof assumes two decompositions v = w₁ + w₁⊥ = w₂ + w₂⊥ and shows w₁ = w₂ and w₁⊥ = w₂⊥
- Proof utilizes inner product properties and the orthogonal complement definition; the key fact is W ∩ W⊥ = {0}, spelled out in the derivation below
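The uniqueness step, written out as a short derivation:

```latex
\begin{align*}
&\text{Suppose } v = w_1 + w_1^{\perp} = w_2 + w_2^{\perp}
  \text{ with } w_1, w_2 \in W \text{ and } w_1^{\perp}, w_2^{\perp} \in W^{\perp}.\\
&\text{Rearranging: } w_1 - w_2 = w_2^{\perp} - w_1^{\perp} \in W \cap W^{\perp}.\\
&\text{If } u \in W \cap W^{\perp}, \text{ then } \langle u, u \rangle = 0,
  \text{ so } u = 0.\\
&\text{Hence } w_1 = w_2 \text{ and } w_1^{\perp} = w_2^{\perp}.
\end{align*}
```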
Implications and Applications
- Theorem implies V = W ⊕ W⊥ (direct sum) for any subspace W of V
- Unique decomposition of vectors into projections onto W and Wโฅ
- Fundamental theorem in understanding inner product space structure
- Applications in linear algebra, functional analysis, and quantum mechanics (state vector decomposition)
Least-Squares Problems with Projections
Problem Formulation and Solution
- Least-squares problem minimizes ||Ax - b||² for matrix A and vector b
- Normal equations AᵀAx = Aᵀb characterize the least-squares solution
- Solution x̂ = (AᵀA)⁻¹Aᵀb when AᵀA is invertible (equivalently, when A has full column rank)
- Geometrically, the solution finds the closest vector Ax̂ in the column space of A to b
- Ax̂ is exactly the orthogonal projection of b onto the column space of A
- Residual vector r = b - Ax̂ is orthogonal to the column space of A (see the numpy sketch after this list)
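A minimal numpy sketch with arbitrary synthetic data: it solves the normal equations directly, cross-checks against np.linalg.lstsq, and verifies that the residual is orthogonal to the column space of A.

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.],
              [1., 3.]])          # overdetermined: 4 equations, 2 unknowns
b = np.array([1., 2.2, 2.9, 4.1])

# Normal equations: A^T A x = A^T b (A has full column rank here)
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against the library least-squares solver
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_ref)

# A x_hat is the orthogonal projection of b onto col(A);
# the residual is orthogonal to every column of A
r = b - A @ x_hat
assert np.allclose(A.T @ r, 0)
```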
Applications and Significance
- Data fitting applications (linear regression, polynomial fitting; see the sketch after this list)
- Model matrix A represents predictor variables, b represents observed data
- Signal processing uses (noise reduction, signal approximation)
- Parameter estimation in scientific and engineering fields (system identification)
- Statistical analysis applications (ANOVA, multiple regression)
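As an illustration of the data-fitting use case, a minimal sketch fitting a quadratic to synthetic noisy samples: the model matrix A is a Vandermonde matrix with columns 1, t, t², and all data is generated for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 30)
# Synthetic observations from y = 2 - 1.5 t + 4 t^2 plus small noise
y = 2.0 - 1.5 * t + 4.0 * t**2 + 0.05 * rng.standard_normal(t.size)

# Model matrix: columns 1, t, t^2 (a Vandermonde matrix)
A = np.vander(t, 3, increasing=True)

coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)  # approximately [2.0, -1.5, 4.0]
```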