5.4 Soft-Decision Decoding

Written by the Fiveable Content Team • Last updated September 2025

Soft-decision decoding uses real numbers to represent the reliability of each received symbol, and it generally outperforms hard-decision decoding, which works only with binary values. Because it retains more information about the received signal, it corrects errors more effectively: soft inputs carry reliability data that improves decoder accuracy and yields coding gain.

Log-likelihood ratios and Euclidean distances are key reliability metrics in soft-decision decoding. Belief propagation, a powerful algorithm, uses these metrics to decode complex codes like LDPC by passing messages between graph nodes, iteratively updating beliefs about codeword bits.

Soft vs Hard Decoding

Types of Decoding

  • Soft-decision decoding operates on soft inputs, which are real numbers representing the reliability of the received symbols
  • Hard-decision decoding operates on hard inputs, which are binary values (0 or 1) obtained by quantizing the received symbols
  • Soft-decision decoding generally outperforms hard-decision decoding because it uses more information about the received signal (see the sketch after this list)
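
To make the distinction concrete, here is a minimal Python sketch, assuming BPSK transmission (bit 0 → +1, bit 1 → −1) over an AWGN channel; the variable names and parameter values are illustrative, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK mapping: bit 0 -> +1.0, bit 1 -> -1.0 (assumed convention)
bits = np.array([0, 1, 1, 0, 1])
tx = 1.0 - 2.0 * bits

# AWGN channel with an assumed noise standard deviation
sigma = 0.8
rx = tx + sigma * rng.normal(size=tx.shape)

# Hard inputs: quantize each sample to a single bit, discarding amplitude
hard_inputs = (rx < 0).astype(int)

# Soft inputs: keep the real values; the magnitude conveys reliability
soft_inputs = rx

print("received:", np.round(rx, 2))
print("hard    :", hard_inputs)
print("soft    :", np.round(soft_inputs, 2))
```

A hard-decision decoder sees only hard_inputs, so a barely positive sample and a strongly positive one look identical; a soft-decision decoder keeps the amplitudes and can weight each symbol by how trustworthy it is.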

Soft Inputs and Coding Gain

  • Soft input refers to the real-valued inputs used in soft-decision decoding
  • Soft inputs carry reliability information about the received symbols, which helps the decoder make better decisions
  • Coding gain represents the performance improvement achieved by using error-correcting codes and soft-decision decoding
  • Coding gain is typically measured in decibels (dB) and quantifies the reduction in signal-to-noise ratio (SNR) required to reach a target error rate compared to uncoded transmission (a worked example follows this list)
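
As a worked illustration of the definition above (the SNR figures are assumed textbook-style values, not measurements from the source):

```python
# Hypothetical SNRs (in dB) needed to reach a target bit error rate of 1e-5.
snr_uncoded_db = 9.6  # roughly the classic uncoded-BPSK figure for BER 1e-5
snr_coded_db = 6.6    # assumed value for a coded system with soft decoding

coding_gain_db = snr_uncoded_db - snr_coded_db
print(f"coding gain at BER 1e-5: {coding_gain_db:.1f} dB")  # -> 3.0 dB
```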

Reliability Metrics

Log-Likelihood Ratio (LLR)

  • Log-likelihood ratio (LLR) is a reliability metric used in soft-decision decoding
  • The LLR of a received symbol is the logarithm of the ratio of the probability that the transmitted bit was a 0 to the probability that it was a 1
  • Positive LLR values indicate a higher likelihood of the bit being a 0, while negative values indicate a higher likelihood of it being a 1; the magnitude measures how confident that decision is
  • LLRs give the decoder a quantitative measure of symbol reliability, which it can use to make better decisions (see the sketch after this list)
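
For the common case of BPSK over an AWGN channel with noise standard deviation sigma, the LLR reduces to 2y/sigma². A minimal sketch, assuming the bit 0 → +1 mapping used earlier (the function name is hypothetical):

```python
import numpy as np

def bpsk_llr(rx, sigma):
    """Channel LLRs for BPSK over AWGN, assuming bit 0 -> +1, bit 1 -> -1.

    With that mapping, ln[p(y | bit 0) / p(y | bit 1)] simplifies to
    2*y / sigma**2: the sign picks the bit (positive favors 0) and the
    magnitude is the reliability.
    """
    return 2.0 * np.asarray(rx) / sigma**2

# A strongly positive sample is a reliable 0; a sample near 0 says little.
print(bpsk_llr([1.1, -0.9, 0.05], sigma=0.8))  # -> [ 3.4375 -2.8125  0.15625]
```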

Euclidean Distance

  • Euclidean distance is another reliability metric used in soft-decision decoding
  • Euclidean distance measures how far the received symbol lies from a candidate constellation point in signal space
  • A smaller Euclidean distance to the decided constellation point indicates a more reliable received symbol
  • Decoders can find the most likely transmitted codeword by choosing the codeword with the smallest total Euclidean distance from the received sequence (see the sketch after this list)
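
The sketch below, again assuming BPSK symbols and a toy length-3 repetition code (a setup chosen purely for illustration), decodes by minimum Euclidean distance; note that hard-decision majority voting on the same samples would decode incorrectly:

```python
import numpy as np

# Toy codebook: the two codewords of the length-3 repetition code,
# mapped to BPSK symbols (bit 0 -> +1, bit 1 -> -1).
codebook = {
    (0, 0, 0): np.array([+1.0, +1.0, +1.0]),
    (1, 1, 1): np.array([-1.0, -1.0, -1.0]),
}

def soft_decode(rx):
    """Pick the codeword whose symbols are closest to rx in Euclidean distance."""
    rx = np.asarray(rx)
    return min(codebook, key=lambda cw: np.sum((rx - codebook[cw]) ** 2))

# Hard decisions on [-0.2, -0.3, 1.9] give bits (1, 1, 0), so majority
# voting outputs 1 1 1. Soft decoding trusts the strong +1.9 sample over
# the two weak negative ones and recovers (0, 0, 0) instead.
print(soft_decode([-0.2, -0.3, 1.9]))  # -> (0, 0, 0)
```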

Decoding Algorithms

Belief Propagation

  • Belief propagation (BP) is a decoding algorithm used for soft-decision decoding of codes defined on graphs, such as low-density parity-check (LDPC) codes
  • BP operates by passing messages between the nodes of the graph representing the code
  • Messages represent the nodes' beliefs, or probabilities, about the values of the codeword bits
  • BP iteratively updates these beliefs by exchanging messages between the nodes until it converges or reaches a maximum number of iterations
  • At each iteration, nodes compute their beliefs from the messages received from their neighbors and the channel reliability information (soft inputs)
  • After convergence or the iteration limit, the decoder decides on the transmitted codeword from the final beliefs of the nodes
  • BP can achieve near-optimal performance for codes with sparse graph representations, such as LDPC codes (a minimal sketch follows this list)
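
The following is a minimal sum-product BP sketch in the LLR domain, operating on an assumed toy parity-check matrix rather than a real LDPC code; it is meant to show the message-passing structure, not to serve as an optimized decoder:

```python
import numpy as np

def bp_decode(H, llr_ch, max_iters=20):
    """Minimal sum-product belief propagation in the LLR domain.

    H      : binary parity-check matrix (m x n numpy array)
    llr_ch : channel LLRs, positive favoring bit 0 (length n)
    Returns a hard-decision codeword estimate.
    """
    m, n = H.shape
    # msg_v2c[i, j]: variable-to-check message from bit j to check i,
    # initialized with the channel LLRs on the edges of the graph
    msg_v2c = np.where(H == 1, llr_ch, 0.0)
    hard = (np.asarray(llr_ch) < 0).astype(int)
    for _ in range(max_iters):
        # Check-to-variable update (tanh rule), excluding the recipient's
        # own incoming message to keep the information extrinsic
        msg_c2v = np.zeros((m, n))
        for i in range(m):
            cols = np.flatnonzero(H[i])
            t = np.tanh(np.clip(msg_v2c[i, cols] / 2.0, -15, 15))
            for k, j in enumerate(cols):
                prod = np.prod(np.delete(t, k))
                msg_c2v[i, j] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # Posterior beliefs and variable-to-check update
        total = llr_ch + msg_c2v.sum(axis=0)
        msg_v2c = np.where(H == 1, total - msg_c2v, 0.0)
        hard = (total < 0).astype(int)
        if not np.any((H @ hard) % 2):  # all parity checks satisfied
            break
    return hard

# Toy parity-check matrix (two checks on four bits, assumed for the demo)
H = np.array([[1, 1, 0, 1],
              [0, 1, 1, 1]])
# Channel LLRs: bit 2's sign is wrong but weak; BP should correct it.
llr = np.array([+2.0, +1.5, -0.3, +1.8])
print(bp_decode(H, llr))  # -> [0 0 0 0]
```

In the example, the two parity checks both involve confidently positive bits, so the check-to-variable messages push the weak negative LLR of bit 2 back above zero, illustrating how beliefs are refined iteration by iteration.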