Keywords: cdf | pmf | pdf | uniform distribution | bernoulli distribution | conditional | marginal

Summary

Remember that a Random Variable is a mapping $X: \Omega \rightarrow \mathbb{R}$ that assigns a real number $X(\omega)$ to each outcome $\omega$ in a sample space $\Omega$. The definitions below are taken from Larry Wasserman’s All of Statistics.

Cumulative Distribution Function

The cumulative distribution function, or the CDF, is a function

$$F_X : \mathbb{R} \rightarrow [0, 1],$$

defined by

$$F_X(x) = P(X \le x).$$

A note on notation: $X$ is a random variable while $x$ is a particular value of the random variable.

Let $X$ be the random variable representing the number of heads in two coin tosses. Then $x$ can take on the values 0, 1, and 2. The CDF for this random variable (drawn as a figure in All of Stats) is

$$F_X(x) = \begin{cases} 0 & x < 0 \\ 1/4 & 0 \le x < 1 \\ 3/4 & 1 \le x < 2 \\ 1 & x \ge 2. \end{cases}$$

Notice that this function is right-continuous and defined for all real $x$, even though $X$ takes no values in between the integers.
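To make the step structure concrete, here is a minimal sketch using scipy (my tooling choice; the number of heads in two fair tosses is Binomial(2, 0.5)):

```python
from scipy.stats import binom

# Number of heads in two fair coin tosses is Binomial(n=2, p=0.5).
X = binom(n=2, p=0.5)

# The CDF is a right-continuous step function: it jumps at 0, 1, and 2
# and is flat in between.
for x in [-0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5]:
    print(f"F_X({x:+.1f}) = {X.cdf(x):.2f}")
# F_X(-0.5) = 0.00, F_X(+0.0) = 0.25, F_X(+0.5) = 0.25,
# F_X(+1.0) = 0.75, F_X(+1.5) = 0.75, F_X(+2.0) = 1.00, F_X(+2.5) = 1.00
```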

Probability Mass and Density Functions

$X$ is called a discrete random variable if it takes countably many values $\{x_1, x_2, \ldots\}$. We define the probability function or the probability mass function (pmf) for $X$ by:

$$f_X(x) = P(X = x).$$

For a discrete random variable, $f_X(x)$ is an actual probability.

The pmf for the number of heads in two coin tosses (drawn as a figure in All of Stats) is $f_X(0) = 1/4$, $f_X(1) = 1/2$, $f_X(2) = 1/4$, and $f_X(x) = 0$ otherwise.
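The same pmf can be built directly from the sample space, in the spirit of the mapping $X: \Omega \rightarrow \mathbb{R}$ above (a sketch; the enumeration is mine, not the book's):

```python
from itertools import product
from collections import Counter

# Sample space of two fair coin tosses.
omega = list(product("HT", repeat=2))

# X maps each outcome to its number of heads; each outcome has probability 1/4.
counts = Counter(w.count("H") for w in omega)
f_X = {x: n / len(omega) for x, n in counts.items()}
print(f_X)  # {2: 0.25, 1: 0.5, 0: 0.25}
```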

On the other hand, a random variable is called a continuous random variable if there exists a function $f_X$ such that $f_X(x) \ge 0$ for all $x$, $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$, and for every $a \le b$,

$$P(a < X < b) = \int_a^b f_X(x)\, dx.$$

The function $f_X$ is called the probability density function (pdf). We have the CDF:

$$F_X(x) = \int_{-\infty}^{x} f_X(t)\, dt,$$

and $f_X(x) = \frac{d F_X(x)}{dx}$ at all points $x$ at which $F_X$ is differentiable.
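As a numerical sanity check of $f_X = F_X'$, here is a sketch using the standard normal (my choice of distribution; any smooth CDF would do):

```python
import numpy as np
from scipy.stats import norm

x = np.linspace(-3, 3, 7)
h = 1e-6

# A centered finite difference of the CDF should recover the pdf.
dF = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)
print(np.max(np.abs(dF - norm.pdf(x))))  # tiny (~1e-9 or smaller)
```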

Continuous variables are confusing. Note:

  1. $P(X=x) = 0$ for every $x$. You can't think of $f_X(x)$ as $P(X=x)$; that holds only for discrete random variables. You can only get probabilities from a pdf by integrating, even if only over a very small part of the space.
  2. A pdf can be bigger than 1, unlike a probability mass function, since pmf values are actual probabilities while a pdf only has to integrate to 1. (Both points are illustrated in the sketch below.)
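As an illustration, here is a minimal sketch using a Uniform(0, 1/2) density (my choice, picked so the density exceeds 1):

```python
from scipy.stats import uniform
from scipy.integrate import quad

# Uniform on (0, 0.5): the pdf is 2 on that interval -- bigger than 1.
U = uniform(loc=0, scale=0.5)
print(U.pdf(0.25))  # 2.0, a density, not a probability

# Probabilities come from integrating the pdf over a set.
p, _ = quad(U.pdf, 0.2, 0.3)
print(p)            # 0.2 = P(0.2 < X < 0.3)

# P(X = 0.25) itself is 0: the integral over a single point vanishes.
print(quad(U.pdf, 0.25, 0.25)[0])  # 0.0
```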

A continuous example: the Uniform Distribution

Suppose that $X$ has pdf

$$f_X(x) = \begin{cases} 1 & 0 \le x \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

A random variable with this density is said to have a Uniform(0,1) distribution. This is meant to capture the idea of choosing a point at random between 0 and 1. The CDF is given by

$$F_X(x) = \begin{cases} 0 & x < 0 \\ x & 0 \le x \le 1 \\ 1 & x > 1, \end{cases}$$

and is visualized as a figure in All of Stats.
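A minimal simulation check of this CDF (numpy, my tooling choice):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.uniform(0, 1, size=100_000)

# For Uniform(0,1), F_X(x) = x on [0, 1]; the empirical CDF should agree.
for x in [0.1, 0.5, 0.9]:
    print(x, np.mean(samples <= x))  # close to 0.1, 0.5, 0.9
```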

A discrete example: the Bernoulli Distribution

The Bernoulli Distribution represents the distribution of a coin flip. Let the random variable $X$ represent such a coin flip, where $X=1$ is heads and $X=0$ is tails. Let us further say that the probability of heads is $p$ ($p=0.5$ is a fair coin).

We then say:

$$X \sim \text{Bernoulli}(p),$$

which is to be read as $X$ has distribution $\text{Bernoulli}(p)$. The pmf or probability function associated with the Bernoulli distribution is

$$P(X = 1) = p \quad \text{and} \quad P(X = 0) = 1 - p$$

for $p \in [0, 1]$. This pmf may be written as

$$f_X(x) = p^x (1-p)^{1-x}$$

for $x \in \{0, 1\}$.

$p$ is called a parameter of the Bernoulli distribution.
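A quick sketch checking the closed form against scipy's bernoulli (the library call is my addition):

```python
from scipy.stats import bernoulli

p = 0.3
X = bernoulli(p)

# p^x (1-p)^(1-x) reproduces the two pmf values f_X(0) and f_X(1).
for x in [0, 1]:
    closed_form = p**x * (1 - p) ** (1 - x)
    print(x, closed_form, X.pmf(x))  # 0 0.7 0.7 then 1 0.3 0.3
```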

Conditional and Marginal Distributions

Marginal mass functions are defined in analogy to marginal probabilities. Thus:

$$f_X(x) = P(X = x) = \sum_y P(X = x, Y = y) = \sum_y f_{X,Y}(x, y).$$

Similarly, marginal densities are defined using integrals:

$$f_X(x) = \int f_{X,Y}(x, y)\, dy.$$

Notice that in the continuous case there is no interpretation of the marginal densities as probabilities. An example: if $f(x,y) = e^{-(x+y)}$ is defined on the positive quadrant, then the marginal $f_X(x) = \int_0^{\infty} e^{-(x+y)}\, dy = e^{-x}$ is an exponential density defined on the positive part of the line.
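A numerical check of that marginal (scipy's quad; the code is a sketch of mine):

```python
import numpy as np
from scipy.integrate import quad

# Joint density f(x, y) = e^{-(x+y)} on the positive quadrant.
def joint(x, y):
    return np.exp(-(x + y))

# Marginalizing out y: f_X(x) = ∫_0^∞ f(x, y) dy should equal e^{-x}.
for x in [0.5, 1.0, 2.0]:
    f_x, _ = quad(lambda y: joint(x, y), 0, np.inf)
    print(x, f_x, np.exp(-x))  # the last two columns agree
```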

The conditional mass function is, similarly, just a conditional probability. So:

$$f_{X|Y}(x \mid y) = P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{f_{X,Y}(x, y)}{f_Y(y)}.$$

The analogous formula for continuous densities might be suspected to be a bit more complex, because we are conditioning on the event $Y = y$, which strictly speaking has probability 0. But it can be proved that the same formula holds for densities, with some additional requirements and interpretation:

$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},$$

where we must assume that $f_Y(y) > 0$. Then we have the interpretation that for some event $A$:

$$P(X \in A \mid Y = y) = \int_A f_{X|Y}(x \mid y)\, dx.$$

An example of this is the uniform distribution on the unit square. Suppose that $y = 0.3$. Then the conditional density is $f_{X|Y}(x \mid 0.3) = f_{X,Y}(x, 0.3)/f_Y(0.3) = 1$: a uniform density on the segment between 0 and 1 at $y = 0.3$.
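To round out the unit-square example, a sketch computing $f_{X|Y} = f_{X,Y}/f_Y$ numerically (my construction):

```python
from scipy.integrate import quad

# Uniform joint density on the unit square.
def joint(x, y):
    return 1.0 if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

y = 0.3
# Marginal at y = 0.3: f_Y(0.3) = ∫ f(x, 0.3) dx = 1.
f_y, _ = quad(lambda x: joint(x, y), 0, 1)

# The conditional f_{X|Y}(x | 0.3) = f(x, 0.3) / f_Y(0.3) is 1 on [0, 1]:
# a uniform density on the horizontal segment at y = 0.3.
for x in [0.1, 0.5, 0.9]:
    print(x, joint(x, y) / f_y)  # 1.0 everywhere
```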