🎲 Intro to Probabilistic Methods Unit 6 Review

6.3 Transformations of random variables

Written by the Fiveable Content Team • Last updated September 2025

Transformations of random variables are crucial in probability theory. They allow us to manipulate and analyze random variables in different ways, opening up new possibilities for modeling and problem-solving.

Linear transformations scale and shift random variables, affecting their mean and variance. Non-linear transformations can change the shape of probability distributions entirely, requiring special techniques to derive new distributions from existing ones.

Linear Transformations of Random Variables

Scaling and Shifting Random Variables

  • Linear transformations involve scaling and shifting the original random variable
  • If $X$ is a random variable and $a$ and $b$ are constants, the linear transformation is $Y = aX + b$
  • Scaling a random variable by a constant factor $a$ changes the spread of the distribution
    • If $|a| > 1$, the distribution is stretched (wider spread)
    • If $0 < |a| < 1$, the distribution is compressed (narrower spread)
    • If $a < 0$, the distribution is also reflected (flipped about zero)
  • Shifting a random variable by a constant value $b$ changes the location of the distribution
    • Positive $b$ shifts the distribution to the right
    • Negative $b$ shifts the distribution to the left

Effects on Mean and Variance

  • The mean of the transformed random variable $Y$ is related to the mean of $X$ by $E[Y] = aE[X] + b$, where $E[X]$ is the mean of $X$
    • The constant $a$ scales the mean of $X$
    • The constant $b$ shifts the mean of $X$
  • The variance of the transformed random variable $Y$ is related to the variance of $X$ by $Var(Y) = a^2 Var(X)$, where $Var(X)$ is the variance of $X$
    • The constant $a$ scales the variance of $X$ by a factor of $a^2$
    • The constant $b$ does not affect the variance
  • Linear transformations preserve the shape of the probability distribution
    • The location (mean) and scale (variance) of the distribution may change
    • Example: If $X$ follows a normal distribution, $Y = aX + b$ will also follow a normal distribution with a different mean and variance
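These two identities are easy to sanity-check by simulation. The sketch below (standard-library Python only; the sample size, seed, and the choice of a normal $X$ with mean 5 and standard deviation 2 are arbitrary assumptions for illustration) draws samples of $X$, applies $Y = aX + b$, and compares the empirical mean and variance of $Y$ to $aE[X] + b$ and $a^2\,Var(X)$:

```python
import random
import statistics

random.seed(0)

# X ~ Normal(mean=5, sd=2), so E[X] = 5 and Var(X) = 4 (illustrative choices).
a, b = 3.0, -1.0
xs = [random.gauss(5.0, 2.0) for _ in range(100_000)]
ys = [a * x + b for x in xs]

# Theory: E[Y] = a*E[X] + b = 3*5 - 1 = 14, Var(Y) = a^2*Var(X) = 9*4 = 36.
print(statistics.fmean(ys))      # close to 14
print(statistics.variance(ys))   # close to 36
```

With 100,000 samples the empirical values land within a small fraction of a unit of the theoretical ones; note that $b$ moves the mean but leaves the variance untouched, exactly as the formulas predict.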

Non-linear Transformations of Random Variables

Applying Non-linear Functions to Random Variables

  • Non-linear transformations involve applying a non-linear function to the original random variable
  • If $X$ is a random variable and $g(x)$ is a non-linear function, the transformed random variable is $Y = g(X)$
  • Examples of non-linear transformations:
    • Exponential function: $Y = e^X$
    • Logarithmic function: $Y = \log(X)$ (requires $X > 0$)
    • Power function: $Y = X^n$, where $n$ is a constant
  • Non-linear transformations can change the shape of the probability distribution
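A quick way to see the shape change is to exponentiate a symmetric variable. In this sketch (standard-library Python; the sample size and seed are arbitrary), $X$ is standard normal, so $Y = e^X$ is lognormal: the symmetric bell becomes a right-skewed distribution, and the mean of $Y$ is pulled above its median.

```python
import math
import random
import statistics

random.seed(1)

# X ~ Normal(0, 1) is symmetric about 0; Y = exp(X) is lognormal (right-skewed).
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
ys = [math.exp(x) for x in xs]

# E[Y] = exp(1/2), not exp(E[X]) = 1: the non-linearity shifts the mean.
print(statistics.fmean(ys))    # near exp(0.5) ~ 1.649
print(statistics.median(ys))   # near exp(0) = 1, so mean > median (skew)
```

The gap between mean and median is a direct symptom of the shape change: a linear transformation of $X$ could never turn a symmetric distribution into a skewed one.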

Deriving the Probability Distribution of Transformed Variables

  • To find the probability distribution of $Y$, determine the cumulative distribution function (CDF) of $Y$, denoted as $F_Y(y)$, using the CDF of $X$, denoted as $F_X(x)$
  • The relationship between the CDFs is $F_Y(y) = P(Y \leq y) = P(g(X) \leq y)$
  • For monotonically increasing functions $g(x)$, the CDF of $Y$ is $F_Y(y) = F_X(g^{-1}(y))$, where $g^{-1}(y)$ is the inverse function of $g(x)$
  • For monotonically decreasing functions $g(x)$ and continuous $X$, the CDF of $Y$ is $F_Y(y) = 1 - F_X(g^{-1}(y))$
  • The probability density function (PDF) of $Y$, denoted as $f_Y(y)$, is obtained by differentiating the CDF of $Y$ with respect to $y$
    • $f_Y(y) = \frac{d}{dy}F_Y(y)$
    • The PDF of $Y$ can also be expressed in terms of the PDF of $X$, denoted as $f_X(x)$, using the change of variables technique: $f_Y(y) = f_X(g^{-1}(y)) \cdot |\frac{d}{dy}g^{-1}(y)|$
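The CDF recipe above can be checked numerically. In this sketch (standard-library Python; the choice $X \sim \text{Uniform}(0,1)$, $g(x) = e^x$, and the seed are illustrative assumptions), $g$ is increasing with $g^{-1}(y) = \ln y$, so $F_Y(y) = F_X(\ln y) = \ln y$ on $(1, e)$ and $f_Y(y) = f_X(\ln y) \cdot |\tfrac{d}{dy}\ln y| = 1/y$:

```python
import math
import random

random.seed(2)

# X ~ Uniform(0,1); g(x) = exp(x) is monotonically increasing.
# Derived CDF: F_Y(y) = F_X(ln y) = ln y for 1 < y < e.
n = 100_000
ys = [math.exp(random.random()) for _ in range(n)]

# Empirical P(Y <= 2) should match the derived CDF value ln(2).
empirical = sum(y <= 2.0 for y in ys) / n
print(empirical)   # close to ln(2) ~ 0.693
```

The empirical proportion agrees with $\ln 2$ to within sampling error, confirming both the CDF relationship and, by differentiation, the change-of-variables PDF $f_Y(y) = 1/y$.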

Jacobian Determinant in Multivariate Transformations

Definition and Role of the Jacobian Determinant

  • The Jacobian matrix is a matrix of partial derivatives used when transforming multivariate random variables; its determinant appears in the multivariate change of variables formula
  • Given a vector of random variables $X = (X_1, X_2, ..., X_n)$ and a vector of transformed variables $Y = (Y_1, Y_2, ..., Y_n)$, where each $Y_i$ is a function of $X_1, X_2, ..., X_n$, the Jacobian matrix $J$ is defined as $J_{ij} = \frac{\partial Y_i}{\partial X_j}$
  • The Jacobian determinant, denoted as $|J|$, is the absolute value of the determinant of the Jacobian matrix $J$
  • The Jacobian determinant represents the volume change factor when transforming from the $X$-space to the $Y$-space
    • If $|J| > 1$, the volume expands during the transformation
    • If $|J| < 1$, the volume contracts during the transformation

Calculating the Jacobian Determinant

  • To calculate the Jacobian determinant, first determine the Jacobian matrix $J$ by finding the partial derivatives of each transformed variable $Y_i$ with respect to each original variable $X_j$
  • Arrange the partial derivatives in a square matrix, with each row corresponding to a transformed variable and each column corresponding to an original variable
  • Calculate the determinant of the Jacobian matrix using standard matrix determinant techniques (e.g., cofactor expansion, Laplace expansion, or Gaussian elimination)
  • Take the absolute value of the determinant to obtain the Jacobian determinant $|J|$
  • Example: For a transformation from polar coordinates $(R, \Theta)$ to Cartesian coordinates $(X, Y)$, where $X = R \cos(\Theta)$ and $Y = R \sin(\Theta)$, the Jacobian matrix is $J = \begin{bmatrix} \cos(\Theta) & -R \sin(\Theta) \\ \sin(\Theta) & R \cos(\Theta) \end{bmatrix}$, and the Jacobian determinant is $|J| = R$
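The hand-computed result $|J| = R$ can be cross-checked by approximating each partial derivative with a central finite difference and taking the determinant of the resulting $2 \times 2$ matrix (a sketch; the test point $(r, \theta) = (2.5, 0.7)$ and step size are arbitrary):

```python
import math

# Transformation (r, theta) -> (x, y) = (r cos(theta), r sin(theta)).
def transform(r, theta):
    return r * math.cos(theta), r * math.sin(theta)

def jacobian_det(r, theta, h=1e-6):
    # Central finite differences for the 2x2 matrix of partials d(x, y)/d(r, theta).
    x_r = (transform(r + h, theta)[0] - transform(r - h, theta)[0]) / (2 * h)
    y_r = (transform(r + h, theta)[1] - transform(r - h, theta)[1]) / (2 * h)
    x_t = (transform(r, theta + h)[0] - transform(r, theta - h)[0]) / (2 * h)
    y_t = (transform(r, theta + h)[1] - transform(r, theta - h)[1]) / (2 * h)
    # det of [[x_r, x_t], [y_r, y_t]]
    return x_r * y_t - x_t * y_r

# Analytically: det J = r cos^2(theta) + r sin^2(theta) = r.
print(jacobian_det(2.5, 0.7))   # close to 2.5
```

This numerical check is a useful habit for more complicated transformations, where an algebra slip in the cofactor expansion is easy to make.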

Joint Distribution of Transformed Variables

Multivariate Change of Variables Technique

  • The multivariate change of variables technique is used to find the joint probability density function (PDF) of transformed random variables
  • Given a vector of random variables $X = (X_1, X_2, ..., X_n)$ with joint PDF $f_X(x_1, x_2, ..., x_n)$ and a vector of transformed variables $Y = (Y_1, Y_2, ..., Y_n)$, the joint PDF of $Y$, denoted as $f_Y(y_1, y_2, ..., y_n)$, is given by $f_Y(y_1, y_2, ..., y_n) = f_X(x_1, x_2, ..., x_n) \cdot |J|$, where each $x_i$ is written in terms of $(y_1, y_2, ..., y_n)$ via the inverse transformation and $|J|$ is the absolute value of the determinant of the Jacobian of that inverse transformation, $J_{ij} = \frac{\partial x_i}{\partial y_j}$
  • The multivariate change of variables technique requires the transformation to be one-to-one and the inverse transformation to be differentiable

Steps to Apply the Multivariate Change of Variables Technique

  1. Express the original variables $(X_1, X_2, ..., X_n)$ in terms of the transformed variables $(Y_1, Y_2, ..., Y_n)$
  2. Calculate the Jacobian matrix $J$ by finding the partial derivatives of each original variable with respect to each transformed variable
  3. Calculate the Jacobian determinant $|J|$ by taking the absolute value of the determinant of the Jacobian matrix
  4. Substitute the expressions for the original variables and the Jacobian determinant into the joint PDF of $X$
  5. Simplify the resulting expression to obtain the joint PDF of $Y$
  • Example: For a transformation from Cartesian coordinates $(X, Y)$ to polar coordinates $(R, \Theta)$, where $R = \sqrt{X^2 + Y^2}$ and $\Theta = \arctan(Y/X)$ (with the angle taken in the quadrant of $(X, Y)$), the joint PDF of $(R, \Theta)$ is $f_{R,\Theta}(r, \theta) = f_{X,Y}(r \cos(\theta), r \sin(\theta)) \cdot r$, where $f_{X,Y}(x, y)$ is the joint PDF of $(X, Y)$
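A classic instance of this example: if $X$ and $Y$ are independent standard normals, then $f_{X,Y}(r\cos\theta, r\sin\theta) \cdot r = \frac{r}{2\pi} e^{-r^2/2}$, so $R$ has the Rayleigh PDF $f_R(r) = r\,e^{-r^2/2}$ and hence $P(R \leq 1) = 1 - e^{-1/2}$. The sketch below (standard-library Python; sample size and seed are arbitrary) checks this by simulation:

```python
import math
import random

random.seed(3)

# (X, Y) independent standard normals: f_{X,Y}(x, y) = exp(-(x^2+y^2)/2) / (2*pi).
# In polar coordinates: f_{R,Theta}(r, th) = r * exp(-r^2/2) / (2*pi), so R is
# Rayleigh-distributed with P(R <= 1) = 1 - exp(-1/2), and Theta is uniform.
n = 100_000
rs = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

empirical = sum(r <= 1.0 for r in rs) / n
print(empirical)   # close to 1 - exp(-0.5) ~ 0.3935
```

The agreement illustrates why the Jacobian factor $r$ is essential: dropping it would predict $P(R \leq 1)$ incorrectly, since area elements near the origin are smaller in polar coordinates.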