🧮 Computational Mathematics · Unit 2 Review

2.7 Singular value decomposition

Written by the Fiveable Content Team • Last updated September 2025
Singular value decomposition (SVD) is a powerful matrix factorization technique in linear algebra. It breaks down any matrix into three components, revealing important properties and structures that are useful in various applications.

SVD is crucial for understanding matrix properties, solving linear systems, and analyzing data. It's widely used in dimensionality reduction, image compression, and machine learning, making it a versatile tool in computational mathematics and data science.

Singular Value Decomposition

Matrix Factorization and Properties

  • SVD decomposes a matrix into three matrices U, Σ, and V^T
  • Applies to any m x n matrix, including non-square matrices
  • Generalizes eigendecomposition to non-square matrices
  • Factorization theorem states A = UΣV^T for any matrix A
    • U and V orthogonal matrices
    • Σ diagonal matrix containing singular values
  • Singular values non-negative real numbers in descending order
  • Columns of U called left singular vectors
  • Columns of V called right singular vectors
  • Key properties include rank revelation, matrix approximation, orthogonality of singular vectors

Mathematical Formulation

  • SVD expressed as A = UΣV^T
  • U (m x m) and V (n x n) orthogonal matrices
  • Σ (m x n) diagonal matrix with singular values σ_1 ≥ σ_2 ≥ ... ≥ σ_r > 0
  • r the rank of matrix A
  • Singular values relate to eigenvalues: σ_i = √(λ_i)
    • λ_i eigenvalues of A^TA (equivalently, of AA^T)
  • Right singular vectors (v_i) satisfy Av_i = σ_i u_i
  • Left singular vectors (u_i) satisfy A^T u_i = σ_i v_i
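
These relations are easy to check numerically. Below is a minimal NumPy sketch (the example matrix is arbitrary) verifying A = UΣV^T together with Av_i = σ_i u_i and A^T u_i = σ_i v_i:

```python
import numpy as np

# Arbitrary 3x2 example matrix (any real m x n matrix works).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# full_matrices=True returns U (m x m) and V^T (n x n); s holds the singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n diagonal matrix Sigma from the singular values.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# A = U Sigma V^T up to floating-point round-off.
assert np.allclose(A, U @ Sigma @ Vt)

# Defining relations: A v_i = sigma_i u_i and A^T u_i = sigma_i v_i.
for i, sigma in enumerate(s):
    v_i, u_i = Vt[i], U[:, i]   # rows of V^T are the right singular vectors
    assert np.allclose(A @ v_i, sigma * u_i)
    assert np.allclose(A.T @ u_i, sigma * v_i)
```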

Computing SVD

Eigenvalue-Based Approach

  • Find eigenvalues and eigenvectors of A^TA and AA^T
  • Right singular vectors (V) eigenvectors of A^TA
  • Left singular vectors (U) eigenvectors of AA^T
  • Singular values (σ_i) square roots of non-zero eigenvalues of both A^TA and AA^T
  • Steps for computing SVD:
    1. Compute A^TA and AA^T
    2. Find eigenvalues and eigenvectors of A^TA
    3. Calculate singular values as square roots of eigenvalues
    4. Construct U, Σ, and V matrices
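
As a concrete illustration of the four steps above, here is a minimal NumPy sketch; it is suitable only for small, well-conditioned matrices, and production code should call a library routine such as numpy.linalg.svd instead:

```python
import numpy as np

def svd_via_eig(A, tol=1e-12):
    """Eigenvalue-based SVD sketch: fine for small matrices, not numerically robust."""
    m, n = A.shape
    # Steps 1-2: eigendecomposition of the symmetric matrix A^T A.
    eigvals, V = np.linalg.eigh(A.T @ A)       # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]          # reorder to descending
    eigvals, V = eigvals[order], V[:, order]
    # Step 3: singular values are square roots of the eigenvalues.
    s = np.sqrt(np.clip(eigvals, 0.0, None))   # clip guards round-off negatives
    # Step 4: left singular vectors from A v_i = sigma_i u_i; columns of U
    # beyond the rank are left zero here (completing them to an orthonormal
    # basis would not change the product U Sigma V^T).
    U = np.zeros((m, m))
    for i in range(min(m, n)):
        if s[i] > tol:
            U[:, i] = (A @ V[:, i]) / s[i]
    Sigma = np.zeros((m, n))
    np.fill_diagonal(Sigma, s[:min(m, n)])
    return U, Sigma, V.T

A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
U, Sigma, Vt = svd_via_eig(A)
print(np.allclose(A, U @ Sigma @ Vt))          # True
```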

Numerical Algorithms

  • Golub-Reinsch algorithm commonly used for efficient and stable SVD computation
  • Jacobi SVD algorithm alternative method for smaller matrices
  • Iterative methods for large matrices:
    • Lanczos algorithm
    • Randomized SVD (sketched after this list)
  • Computational complexity O(min(mn^2, m^2n)) for m x n matrix
  • Numerical stability considerations:
    • Use of Householder reflections or Givens rotations
    • Handling of round-off errors in floating-point arithmetic
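
To make the randomized approach concrete, here is a minimal sketch of a randomized SVD in the style of Halko, Martinsson, and Tropp; the parameter names and oversampling default are assumptions for illustration:

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=None):
    """Rank-k randomized SVD sketch: sample the range, then take a small exact SVD."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sample the range of A with a Gaussian test matrix, then orthonormalize.
    Omega = rng.standard_normal((n, k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)             # m x (k+p) orthonormal basis
    # Exact SVD of the small projected matrix B = Q^T A.
    B = Q.T @ A                                # (k+p) x n, cheap to decompose
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :k], s[:k], Vt[:k]

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 60)) @ rng.standard_normal((60, 300))  # rank <= 60
U, s, Vt = randomized_svd(A, k=20, seed=0)
print(U.shape, s.shape, Vt.shape)              # (500, 20) (20,) (20, 300)
```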

Interpreting SVD Components

Geometric Interpretation

  • Left singular vectors (U) represent principal directions in column space of A
  • Right singular vectors (V) represent principal directions in row space of A
  • Singular values (Σ) indicate importance of each singular vector pair
  • Number of non-zero singular values equals rank of matrix A
  • Ratio of singular values provides information about:
    • Condition number of matrix
    • Sensitivity to perturbations
  • Null space of A spanned by right singular vectors with zero singular values
  • Range of A spanned by left singular vectors with non-zero singular values
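
All of the quantities above can be read directly off a computed SVD. A small NumPy sketch (the rank tolerance follows common numerical practice):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],     # second row = 2 x first row, so rank(A) = 2
              [1.0, 0.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]   # typical rank tolerance

rank = int(np.sum(s > tol))        # number of non-zero singular values
cond = s[0] / s[rank - 1]          # ratio of largest to smallest non-zero sigma

range_basis = U[:, :rank]          # left singular vectors span range(A)
null_basis = Vt[rank:].T           # remaining right singular vectors span null(A)

print(rank)                                           # 2
print(np.allclose(A @ null_basis, 0.0, atol=1e-10))   # True
```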

Matrix Analysis

  • SVD reveals the four fundamental subspaces of matrix A:
    • Column space: span of the left singular vectors with non-zero singular values
    • Row space: span of the right singular vectors with non-zero singular values
    • Null space: orthogonal complement of the row space (remaining right singular vectors)
    • Left null space: orthogonal complement of the column space (remaining left singular vectors)
  • Frobenius norm of A equals square root of sum of squared singular values
  • 2-norm (spectral norm) of A equals largest singular value
  • Pseudoinverse A^+ computed using SVD: A^+ = VΣ^+U^T
    • Σ^+ obtained by transposing Σ and replacing each non-zero singular value with its reciprocal (zeros stay zero)
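
For example, the norms and the pseudoinverse can be obtained directly from the factors; a minimal sketch, where the cutoff tol for "non-zero" singular values is an illustrative choice:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)

# Norms read directly off the singular values.
assert np.isclose(np.linalg.norm(A, 2), s[0])                       # spectral norm
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2)))  # Frobenius norm

# Pseudoinverse A^+ = V Sigma^+ U^T: transpose Sigma and invert non-zero sigmas.
tol = 1e-12
s_plus = np.array([1.0 / x if x > tol else 0.0 for x in s])
Sigma_plus = np.zeros((A.shape[1], A.shape[0]))                     # n x m
np.fill_diagonal(Sigma_plus, s_plus)
A_plus = Vt.T @ Sigma_plus @ U.T

assert np.allclose(A_plus, np.linalg.pinv(A))
```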

SVD Applications for Data Analysis

Dimensionality Reduction and Data Compression

  • Low-rank matrix approximation by discarding the smallest singular values (see the sketch after this list)
  • Image compression by storing only the leading SVD components of the pixel matrix
  • Principal Component Analysis (PCA) for dimensionality reduction:
    • Identify dominant patterns in high-dimensional data
    • Project data onto principal components (right singular vectors)
  • Latent Semantic Analysis (LSA) in natural language processing:
    • Reduce dimensionality of term-document matrices
    • Uncover latent semantic relationships
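
As an illustration of truncation, the sketch below builds a rank-k approximation of a synthetic low-rank matrix standing in for an image; by the Eckart-Young theorem, truncating the SVD gives the best rank-k approximation in both the 2-norm and the Frobenius norm:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for an image: nearly rank-10 with a little noise.
X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 300)) \
    + 0.01 * rng.standard_normal((200, 300))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 10                                         # keep the k largest singular values
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]       # best rank-k approximation

# Storage: k*(m + n + 1) numbers instead of m*n.
m, n = X.shape
print(f"storage ratio:  {k * (m + n + 1) / (m * n):.3f}")
print(f"relative error: {np.linalg.norm(X - X_k) / np.linalg.norm(X):.2e}")
```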

Machine Learning and Signal Processing

  • Recommendation systems:
    • Factorize user-item interaction matrices
    • Predict user preferences based on latent factors
  • Noise reduction in signal processing:
    • Filter out components with small singular values
    • Separate signal from noise (Wiener filtering)
  • Feature extraction and data visualization:
    • Use top singular vectors as features
    • Plot data in reduced dimensional space
  • Preprocessing step for various machine learning algorithms:
    • Improve numerical stability
    • Remove multicollinearity in regression problems
  • Solving least squares problems (sketched below):
    • Compute pseudoinverse for overdetermined systems
    • Handle rank-deficient matrices in linear regression
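
A short sketch of the last point, solving an overdetermined least squares problem through the SVD (the data is synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))               # 50 equations, 3 unknowns
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.05 * rng.standard_normal(50)

# Least squares solution x = A^+ b via the reduced SVD; for rank-deficient A,
# singular values below a tolerance should be dropped before inverting.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)

print(x)                                                     # close to x_true
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```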