3.3 Expectation, variance, and moments of discrete random variables

Written by the Fiveable Content Team • Last updated September 2025

Discrete random variables are the building blocks of probability theory. They help us understand and predict outcomes in situations with a finite or countably infinite number of possibilities, like coin flips or dice rolls.

Expectation, variance, and moments are key tools for analyzing discrete random variables. They give us insights into the average outcome, spread of values, and overall shape of the probability distribution, helping us make informed decisions in uncertain situations.

Expected Value of Discrete Variables

Calculating the Expected Value (Mean)

  • The expected value (mean) of a discrete random variable $X$ is denoted by $E(X)$ and is calculated as the sum of the product of each possible value of $X$ and its corresponding probability
  • The formula for the expected value of a discrete random variable $X$ is $E(X) = \sum_{x} x \cdot P(X = x)$, where $x$ ranges over each possible value of $X$, and $P(X = x)$ is the probability of $X$ taking the value $x$
    • For example, if $X$ represents the number of heads in two coin flips, with possible values of 0, 1, and 2, and corresponding probabilities of 0.25, 0.5, and 0.25, then $E(X) = 0 \cdot 0.25 + 1 \cdot 0.5 + 2 \cdot 0.25 = 1$ (this calculation is checked in the sketch after this list)
  • The expected value represents the average value of the random variable over a large number of trials or observations
    • In the coin flip example, if we repeat the two coin flips many times, the average number of heads observed will approach 1
  • The expected value is a weighted average, where each value is weighted by its probability of occurrence
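
A minimal Python sketch of the coin-flip calculation above, using the pmf values given in the example (variable names are just for illustration):

```python
# pmf of X = number of heads in two fair coin flips
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

# E(X) = sum over x of x * P(X = x), a probability-weighted average
expected_value = sum(x * p for x, p in pmf.items())

print(expected_value)  # 1.0
```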

Properties of Expected Value

  • The expected value is a linear operator, meaning that for any constants $a$ and $b$, and random variables $X$ and $Y$, $E(aX + bY) = aE(X) + bE(Y)$
    • For instance, if $X$ and $Y$ are independent random variables representing the numbers rolled on two fair dice, then $E(2X + 3Y) = 2E(X) + 3E(Y) = 2 \cdot 3.5 + 3 \cdot 3.5 = 17.5$ (verified by enumeration in the sketch after this list)
  • The linearity of expectation holds regardless of whether the random variables are independent or not
  • The expected value of a constant $c$ is the constant itself, i.e., $E(c) = c$
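
A quick numerical check of linearity, assuming two fair six-sided dice: the sketch enumerates all 36 equally likely joint outcomes and compares $E(2X + 3Y)$ with $2E(X) + 3E(Y)$.

```python
from itertools import product

faces = range(1, 7)

# E(2X + 3Y) computed directly from the 36 equally likely joint outcomes
lhs = sum(2 * x + 3 * y for x, y in product(faces, faces)) / 36

# 2E(X) + 3E(Y) computed from the marginal expectations alone
e_die = sum(faces) / 6           # E(X) = E(Y) = 3.5
rhs = 2 * e_die + 3 * e_die      # 17.5

print(lhs, rhs)  # 17.5 17.5
```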

Variance and Standard Deviation of Discrete Variables

Calculating Variance and Standard Deviation

  • The variance of a discrete random variable $X$, denoted by $Var(X)$ or $\sigma^2$, measures the average squared deviation of the random variable from its expected value
  • The formula for the variance of a discrete random variable $X$ is $Var(X) = E[(X - E(X))^2] = \sum_{x} (x - E(X))^2 \cdot P(X = x)$, where $x$ ranges over each possible value of $X$, and $P(X = x)$ is the probability of $X$ taking the value $x$
    • For example, if $X$ represents the number of heads in two coin flips, with $E(X) = 1$, then $Var(X) = (0 - 1)^2 \cdot 0.25 + (1 - 1)^2 \cdot 0.5 + (2 - 1)^2 \cdot 0.25 = 0.5$
  • The variance can also be calculated using the alternative formula $Var(X) = E(X^2) - [E(X)]^2$, where $E(X^2)$ is the expected value of the squared random variable $X$
    • In the coin flip example, $E(X^2) = 0^2 \cdot 0.25 + 1^2 \cdot 0.5 + 2^2 \cdot 0.25 = 1.5$, so $Var(X) = 1.5 - 1^2 = 0.5$ (both variance formulas are checked in the sketch after this list)
  • The standard deviation of a discrete random variable $X$, denoted by $\sigma = \sqrt{Var(X)}$, is the square root of the variance and measures the typical deviation of the random variable from its expected value
    • In the coin flip example, $\sigma = \sqrt{0.5} \approx 0.707$
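
Both variance formulas and the standard deviation for the coin-flip example can be checked with a short sketch (a minimal illustration, using the pmf from the example above):

```python
import math

# pmf of X = number of heads in two fair coin flips
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

mean = sum(x * p for x, p in pmf.items())                    # E(X) = 1.0

# Definition: Var(X) = E[(X - E(X))^2]
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())   # 0.5

# Shortcut: Var(X) = E(X^2) - [E(X)]^2
e_x2 = sum(x ** 2 * p for x, p in pmf.items())               # 1.5
var_alt = e_x2 - mean ** 2                                    # 0.5

std_dev = math.sqrt(var_def)                                  # ~0.707

print(var_def, var_alt, std_dev)
```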

Interpreting Variance and Standard Deviation

  • The variance and standard deviation provide a measure of the dispersion or spread of the random variable around its expected value
  • A higher variance or standard deviation indicates that the values of the random variable are more spread out from the mean, while a lower variance or standard deviation suggests that the values are more concentrated around the mean
  • The standard deviation has the same units as the random variable, making it easier to interpret than the variance, which has squared units

Moments of Distributions

Defining Moments

  • Moments are mathematical quantities that describe the shape and characteristics of a probability distribution
  • The $n$-th moment of a discrete random variable $X$ is defined as $E(X^n)$, where $n$ is a non-negative integer
    • For example, the first moment (n = 1) of a fair six-sided die is $E(X) = 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + \ldots + 6 \cdot \frac{1}{6} = 3.5$
  • The first moment ($n = 1$) is the expected value or mean of the random variable, $E(X)$
  • The second moment ($n = 2$) is related to the variance of the random variable, as $Var(X) = E(X^2) - [E(X)]^2$ (the sketch after this list computes both moments for the fair die)
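
A small sketch computing the first and second moments of a fair six-sided die and recovering the variance from them (values match the die example above):

```python
faces = range(1, 7)
pmf = {x: 1 / 6 for x in faces}   # fair six-sided die

m1 = sum(x * p for x, p in pmf.items())        # first moment E(X) = 3.5
m2 = sum(x ** 2 * p for x, p in pmf.items())   # second moment E(X^2) = 91/6 ≈ 15.17

variance = m2 - m1 ** 2                        # Var(X) = E(X^2) - [E(X)]^2 ≈ 2.92

print(m1, m2, variance)
```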

Higher-Order Moments and Their Significance

  • Higher-order moments ($n > 2$) provide additional information about the shape of the distribution
  • The third moment is related to the skewness of the distribution, which measures the asymmetry of the distribution around its mean
    • A positive skewness indicates that the tail on the right side of the distribution is longer or fatter than the left side, while a negative skewness indicates the opposite
  • The fourth moment is related to the kurtosis of the distribution, which measures the heaviness of the tails and the peakedness of the distribution relative to a normal distribution
    • A higher kurtosis indicates heavier tails and a more peaked distribution, while a lower kurtosis indicates lighter tails and a flatter distribution
  • Moments can be used to compare and contrast different probability distributions and to estimate the parameters of a distribution from sample data (the sketch after this list computes skewness and kurtosis for a small example distribution)
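
A common way to make these ideas concrete is through standardized central moments: skewness as $E[(X - \mu)^3]/\sigma^3$ and excess kurtosis as $E[(X - \mu)^4]/\sigma^4 - 3$. The sketch below applies these definitions to a small right-skewed pmf chosen purely for illustration (it does not come from the examples above):

```python
# Illustrative right-skewed pmf (chosen for this example only)
pmf = {0: 0.6, 1: 0.3, 2: 0.05, 3: 0.05}

mu = sum(x * p for x, p in pmf.items())

def central_moment(n):
    """n-th central moment E[(X - mu)^n] for the pmf above."""
    return sum((x - mu) ** n * p for x, p in pmf.items())

sigma = central_moment(2) ** 0.5

skewness = central_moment(3) / sigma ** 3            # > 0: longer right tail
excess_kurtosis = central_moment(4) / sigma ** 4 - 3

print(skewness, excess_kurtosis)
```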

Linearity of Expectation

The Linearity Property

  • The linearity of expectation states that the expected value of the sum of random variables is equal to the sum of their individual expected values, regardless of whether the random variables are independent or not
  • Mathematically, for any random variables $X$ and $Y$, $E(X + Y) = E(X) + E(Y)$, and more generally, for any constants $a$ and $b$, $E(aX + bY) = aE(X) + bE(Y)$
    • For example, if $X$ and $Y$ are the numbers rolled on two fair dice, then $E(X + Y) = E(X) + E(Y) = 3.5 + 3.5 = 7$
  • The linearity of expectation allows for the calculation of the expected value of a sum of random variables without the need to determine their joint distribution or dependence structure (the sketch after this list includes a fully dependent case)
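
Linearity needs no independence assumption. A minimal sketch comparing $E(X + Y)$ for two fair dice in two cases, one where $Y$ is an independent second die and one where $Y$ is the very same roll as $X$ (fully dependent), gives 7 either way:

```python
from itertools import product

faces = range(1, 7)
e_die = sum(faces) / 6   # E(X) = E(Y) = 3.5

# Independent case: Y is a separate second die
e_sum_indep = sum(x + y for x, y in product(faces, faces)) / 36

# Dependent case: Y is the same roll as X, so X + Y = 2X
e_sum_dep = sum(x + x for x in faces) / 6

print(e_sum_indep, e_sum_dep, e_die + e_die)  # 7.0 7.0 7.0
```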

Applying Linearity of Expectation

  • The linearity of expectation is particularly useful when solving problems involving multiple random variables, as it simplifies the calculation of the expected value of their sum
    • For instance, in a game where a player rolls two fair dice and receives a payoff equal to the sum of the numbers rolled, the expected payoff can be easily calculated as $E(X + Y) = E(X) + E(Y) = 3.5 + 3.5 = 7$
  • The linearity of expectation can be extended to any finite number of random variables, i.e., for random variables $X_1, X_2, \ldots, X_n$, $E(X_1 + X_2 + \ldots + X_n) = E(X_1) + E(X_2) + \ldots + E(X_n)$
    • This property is useful in various applications, such as calculating the expected total waiting time in a queue with multiple independent service times or the expected total score in a game with multiple independent rounds (see the sketch after this list)
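
A minimal sketch of the multi-round idea, assuming a game of $n$ independent fair-die rolls: linearity gives the expected total as $n \cdot 3.5$ with no joint distribution required, and an optional Monte Carlo run agrees closely.

```python
import random

n_rounds = 10
e_die = sum(range(1, 7)) / 6          # E(X_i) = 3.5 for each round

# Linearity: E(X_1 + ... + X_n) = n * E(X_i)
expected_total = n_rounds * e_die     # 35.0

# Optional Monte Carlo sanity check (assumes independent fair-die rounds)
trials = 100_000
simulated = sum(sum(random.randint(1, 6) for _ in range(n_rounds))
                for _ in range(trials)) / trials

print(expected_total, round(simulated, 2))
```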