Random Variables

Many experiments produce numerical outcomes. A random variable (RV) assigns a real number to each outcome of a random experiment. Random variables allow us to:

  • Model uncertainty numerically
  • Apply algebra and calculus to probability problems
  • Analyze signals and noise mathematically

A random variable is a function that maps outcomes in the sample space Ω to real numbers.

X: \Omega \rightarrow \mathbb{R}

Examples
  • Coin toss: X = 1 for Heads, X = 0 for Tails
  • Die roll: X \in \{1,2,3,4,5,6\}
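As a quick sketch (assuming Python and arbitrary outcome labels), the coin-toss mapping can be written as an ordinary function from outcomes to numbers:

```python
import random

# A random variable is just a function from outcomes to real numbers.
# Coin-toss example: X(Head) = 1, X(Tail) = 0.
def X(outcome):
    return 1 if outcome == "Head" else 0

# Simulate the underlying experiment, then apply the mapping.
outcome = random.choice(["Head", "Tail"])
print(outcome, "->", X(outcome))
```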
Types of Random Variables
(a) Discrete Random Variables
  • Take countable values
  • Examples: number of heads, number of bit errors
(b) Continuous Random Variables
  • Take values over a continuous range
  • Examples: temperature, noise amplitude, signal voltage
Probability Mass Function (PMF)

Used for discrete random variables.

p_X(x) = P(X = x)

Properties
  1. p_X(x) \ge 0
  2. \sum_x p_X(x) = 1
Example

Fair die: p_X(x) = \frac{1}{6}, \quad x=1,2,3,4,5,6
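A minimal Python sketch of this PMF, using a plain dictionary and exact fractions to check both properties:

```python
from fractions import Fraction

# PMF of a fair die: p_X(x) = 1/6 for x = 1, ..., 6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Property 1: non-negativity.  Property 2: total probability equals 1.
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1

print(pmf[3])             # P(X = 3) = 1/6
print(sum(pmf.values()))  # 1
```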

Cumulative Distribution Function (CDF)

Applies to both discrete and continuous RVs.

F_X(x) = P(X \le x)

Properties
  1. 0 \le F_X(x) \le 1
  2. Non-decreasing function of x
  3. Limits: \lim_{x\to -\infty} F_X(x) = 0, \quad \lim_{x\to \infty} F_X(x) = 1

Interpretation
  • Gives the probability that the RV is less than or equal to a value
  • Fundamental description of a random variable
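For the fair-die example, the CDF follows directly from the PMF by accumulating probability mass; a small Python sketch:

```python
from fractions import Fraction

# Fair-die PMF from the previous example.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# F_X(x) = P(X <= x): sum the PMF over all values at or below x.
def cdf(x):
    return sum(p for value, p in pmf.items() if value <= x)

print(cdf(0))    # 0    (no mass below the support)
print(cdf(3))    # 1/2
print(cdf(6.5))  # 1    (all mass once x exceeds the largest value)
```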
Probability Density Function (PDF)

Used for continuous random variables.

f_X(x) = \frac{d}{dx}F_X(x)

Probability Calculation

P(a < X < b) = \int_a^b f_X(x)\,dx

Note that P(X = x) = 0 for a continuous RV: probability is assigned to intervals, not to individual points.

Relationship Between PDF and CDF

F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt

f_X(x) = \frac{d}{dx}F_X(x)
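Both relationships can be checked numerically. The sketch below uses SciPy's standard normal distribution as the concrete continuous RV (the particular distribution is only an illustrative assumption):

```python
from scipy.stats import norm
from scipy.integrate import quad

a, b = -1.0, 2.0

# Integrating the PDF over (a, b) ...
p_integral, _ = quad(norm.pdf, a, b)
# ... agrees with the CDF difference F_X(b) - F_X(a).
p_cdf = norm.cdf(b) - norm.cdf(a)
print(p_integral, p_cdf)

# Numerically differentiating the CDF recovers the PDF at a point.
x, h = 0.5, 1e-5
print((norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h), norm.pdf(x))
```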

Expectation (Mean Value)
  • Discrete RV:
    E[X] = \sum_x x p_X(x)
  • Continuous RV:
    E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx
Interpretation
  • Long-run average value
  • Center of mass of the distribution
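For the fair die, a short Python sketch shows both the formula and the long-run-average reading:

```python
import random

# Discrete mean: E[X] = sum over x of x * p_X(x) for a fair die.
mean = sum(x * (1 / 6) for x in range(1, 7))
print(mean)  # 3.5 (up to floating-point rounding)

# Long-run average: the sample mean of many simulated rolls settles near E[X].
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(sum(rolls) / len(rolls))  # close to 3.5
```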
Expected Value of a Function

For a function g(X):

  • Discrete:
    E[g(X)] = \sum_x g(x)p_X(x)
  • Continuous:
    E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx

This avoids having to find the PMF or PDF of g(X) itself, as the sketch below illustrates.
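A sketch for the fair die with g(x) = x^2 (an illustrative choice), computing E[g(X)] directly from the PMF of X and cross-checking by simulation:

```python
import random

pmf = {x: 1 / 6 for x in range(1, 7)}

def g(x):
    return x ** 2  # any function of X; x^2 chosen for illustration

# E[g(X)] straight from the PMF of X -- no distribution of g(X) is needed.
expected = sum(g(x) * p for x, p in pmf.items())
print(expected)  # 91/6 ≈ 15.17

# Monte Carlo cross-check: average g over simulated values of X.
samples = [g(random.randint(1, 6)) for _ in range(100_000)]
print(sum(samples) / len(samples))
```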

Variance and Standard Deviation
Variance

\text{Var}(X) = E[(X - E[X])^2]

Equivalent form:
\text{Var}(X) = E[X^2] - (E[X])^2

Standard Deviation

\sigma_X = \sqrt{\text{Var}(X)}

Interpretation
  • Measures spread or uncertainty
  • Larger variance → values spread farther from the mean (more uncertainty)
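For the fair die, both forms give the same result, as this short sketch shows:

```python
# Fair-die variance via the definition and via the shortcut formula.
pmf = {x: 1 / 6 for x in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())

var_definition = sum((x - mean) ** 2 * p for x, p in pmf.items())   # E[(X - E[X])^2]
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2  # E[X^2] - (E[X])^2

print(var_definition, var_shortcut)  # both ≈ 2.9167
print(var_definition ** 0.5)         # standard deviation ≈ 1.708
```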
Important Properties of Expectation
  1. Linearity
    E[aX + b] = aE[X] + b
  2. Constants
    E[c] = c
  3. Additivity without independence
    E[X + Y] = E[X] + E[Y] \quad \text{even if } X \text{ and } Y \text{ are dependent}
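A quick Monte Carlo check of properties 1 and 3 (here Y is a deterministic function of X, the strongest possible dependence, and additivity still holds):

```python
import random

N = 200_000
xs = [random.randint(1, 6) for _ in range(N)]
ys = [7 - x for x in xs]  # Y depends completely on X

def mean(values):
    return sum(values) / len(values)

a, b = 2.0, 3.0
print(mean([a * x + b for x in xs]), a * mean(xs) + b)             # E[aX + b] vs aE[X] + b
print(mean([x + y for x, y in zip(xs, ys)]), mean(xs) + mean(ys))  # E[X + Y] vs E[X] + E[Y]
```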
Common Discrete Random Variables
Bernoulli Random Variable
  • Values: 0 or 1
  • Parameter: p = P(X=1)

Mean: E[X] = p

Variance: \text{Var}(X) = p(1-p)
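A simulation sketch (with an arbitrary choice of p) confirming the mean and variance formulas:

```python
import random

p = 0.3  # illustrative success probability
N = 200_000
samples = [1 if random.random() < p else 0 for _ in range(N)]

sample_mean = sum(samples) / N
sample_var = sum((s - sample_mean) ** 2 for s in samples) / N

print(sample_mean, p)           # sample mean ≈ p
print(sample_var, p * (1 - p))  # sample variance ≈ p(1 - p) = 0.21
```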

Engineering Intuition
  • Random variables model uncertain signals
  • Expectation → average signal level
  • Variance → noise power
  • PDFs describe how likely signal values are
Key Takeaways
  • Random variables convert randomness into numbers
  • CDF is the most fundamental description
  • PDF and PMF describe probability distribution
  • Expectation and variance summarize behavior
  • These concepts are essential for:
    • Random processes
    • Noise analysis
    • Detection and estimation