Story
The Exponential distribution is the continuous counterpart to the Geometric distribution. The story of the Exponential distribution is analogous, but we are now waiting for a success in continuous time, where successes arrive at a rate of $\lambda$ successes per unit of time. The average number of successes in a time interval of length $t$ is $\lambda t$, though the actual number of successes varies randomly. An Exponential random variable represents the waiting time until the first arrival of a success.
——adapted from Book BH
Basic
Definition: A continuous r.v. $X$ is said to have the Exponential distribution with parameter $\lambda > 0$, written $X \sim Exp(\lambda)$, if its PDF is
$$f(x) = \lambda e^{-\lambda x}, \quad x > 0.$$
The corresponding CDF is
$$F(x) = 1 - e^{-\lambda x}, \quad x > 0.$$
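As a concrete check of these two formulas, here is a minimal Python sketch (assuming NumPy and SciPy are available; the value of $\lambda$ and the evaluation point are arbitrary). Note that `scipy.stats.expon` is parameterized by the scale $1/\lambda$, not by the rate $\lambda$:

```python
import numpy as np
from scipy.stats import expon

lam = 2.0                    # rate parameter lambda (arbitrary illustrative value)
dist = expon(scale=1/lam)    # SciPy uses scale = 1/lambda

x = 0.5
print(dist.pdf(x), lam * np.exp(-lam * x))   # both ~ 0.7358
print(dist.cdf(x), 1 - np.exp(-lam * x))     # both ~ 0.6321
```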
To calculate the expectation and variance, we first consider $X \sim Exp(1)$ with PDF $f(x) = e^{-x}$. Then, integrating by parts,
$$E(X) = \int_0^\infty x e^{-x}\,dx = 1, \qquad E(X^2) = \int_0^\infty x^2 e^{-x}\,dx = 2, \qquad Var(X) = E(X^2) - (E(X))^2 = 1.$$
Now let $Y = \frac{X}{\lambda}$; then $Y \sim Exp(\lambda)$, since
$$P(Y \le y) = P(X \le \lambda y) = 1 - e^{-\lambda y}, \quad y > 0,$$
or, in terms of the PDF,
$$f_Y(y) = \lambda f_X(\lambda y) = \lambda e^{-\lambda y}, \quad y > 0.$$
Hence, we obtain the following (checked numerically in the sketch after this list):
- $E(Y) = E(X/\lambda) = 1/\lambda$
- $Var(Y) = Var(X/\lambda) = 1/\lambda^2$
- MGF (moment generating function): $M_Y(t) = E(e^{tY}) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$
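A quick numerical sanity check of the mean, variance, and MGF (a sketch only; the rate and the value of $t$ below are arbitrary choices; NumPy, like SciPy, parameterizes by scale $= 1/\lambda$):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 3.0                                           # arbitrary rate
y = rng.exponential(scale=1/lam, size=1_000_000)    # samples from Exp(lam)

print(y.mean(), 1/lam)         # sample mean vs E(Y) = 1/lambda
print(y.var(), 1/lam**2)       # sample variance vs Var(Y) = 1/lambda^2

t = 1.0                        # any t < lambda
print(np.exp(t*y).mean(), lam/(lam - t))   # empirical MGF vs lambda/(lambda - t)
```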
Memoryless Property
The memoryless property says that $P(X \ge s+t ~|~ X \ge s) = P(X \ge t)$ for all $s, t \ge 0$. Let $X \sim Exp(\lambda)$; then
$$P(X \ge s+t ~|~ X \ge s) = \frac{P(X \ge s+t)}{P(X \ge s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P(X \ge t),$$
so the Exponential distribution is memoryless.
Theorem: If $X$ is a positive continuous r.v. with the memoryless property, then $X$ has an Exponential distribution. Similarly, if $X$ is a positive integer-valued discrete r.v. with the memoryless property, then it has a Geometric distribution.
Proof idea: work with the survival function $G(t) = P(X > t)$. Memorylessness gives $G(s+t) = G(s)G(t)$, and solving this functional equation (e.g., via a differential equation) forces $G(t) = e^{-\lambda t}$ for some $\lambda > 0$.
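The memoryless property is also easy to see empirically. A simulation sketch (the values of $\lambda$, $s$, $t$ are arbitrary): among the samples with $X \ge s$, the fraction exceeding $s+t$ should match $P(X \ge t) = e^{-\lambda t}$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t = 2.0, 0.7, 0.5          # arbitrary rate and thresholds

x = rng.exponential(scale=1/lam, size=2_000_000)

lhs = np.mean(x[x >= s] >= s + t)  # estimate of P(X >= s+t | X >= s)
rhs = np.mean(x >= t)              # estimate of P(X >= t)
print(lhs, rhs, np.exp(-lam*t))    # all three should be close (~0.368)
```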
Examples
eg.1 $X_1 \sim Exp(\lambda_1), ~X_2 \sim Exp(\lambda_2)$, and $X_1 \perp X_2$. Then $P(X_1 < X_2) = \frac{\lambda_1}{\lambda_1 + \lambda_2}$.
Proof: By LOTP (law of total probability), conditioning on $X_2$ and using independence,
$$P(X_1 < X_2) = \int_0^\infty P(X_1 < x) \lambda_2 e^{-\lambda_2 x}\,dx = \int_0^\infty (1 - e^{-\lambda_1 x}) \lambda_2 e^{-\lambda_2 x}\,dx = 1 - \frac{\lambda_2}{\lambda_1 + \lambda_2} = \frac{\lambda_1}{\lambda_1 + \lambda_2}.$$
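A quick simulation sketch of eg.1 (the two rates are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2 = 1.0, 3.0              # arbitrary rates

x1 = rng.exponential(scale=1/lam1, size=1_000_000)
x2 = rng.exponential(scale=1/lam2, size=1_000_000)

print(np.mean(x1 < x2), lam1/(lam1 + lam2))   # both ~ 0.25
```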
eg.2 $\{X_i\}_{i=1}^n$ are independent with $X_j \sim Exp(\lambda_j)$. Let $L = \min(X_1, \cdots, X_n)$; then $L \sim Exp(\lambda_1 + \cdots + \lambda_n)$.
Proof: For $t > 0$, by independence,
$$P(L > t) = P(X_1 > t, \cdots, X_n > t) = \prod_{j=1}^n P(X_j > t) = \prod_{j=1}^n e^{-\lambda_j t} = e^{-(\lambda_1 + \cdots + \lambda_n)t},$$
which is exactly the survival function of $Exp(\lambda_1 + \cdots + \lambda_n)$.
The intuition for this result: consider $n$ independent Poisson processes, where the $j$-th has rate $\lambda_j$, and interpret
- $X_1$ as the waiting time for a green car
- $X_2$ as the waiting time for a red car
- …
Then $L$ is the waiting time for a car of any color (i.e., any car at all), so it makes sense that the combined rate is $\lambda_1 + \cdots + \lambda_n$.
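Eg.2 can also be checked by simulation; the sketch below (with three arbitrary rates) compares the sample mean and survival function of $L$ with those of $Exp(\lambda_1+\lambda_2+\lambda_3)$:

```python
import numpy as np

rng = np.random.default_rng(3)
lams = np.array([1.0, 2.0, 3.0])   # arbitrary rates lambda_1..lambda_3

# one column of samples per X_j, then the row-wise minimum
xs = rng.exponential(scale=1/lams, size=(1_000_000, 3))
L = xs.min(axis=1)

print(L.mean(), 1/lams.sum())      # both ~ 1/6

# survival function of L should match exp(-(sum of rates) * t)
t = 0.2
print(np.mean(L > t), np.exp(-lams.sum()*t))   # both ~ 0.301
```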
eg.3 (Difference of two Exponentials) Let $X \sim Exp(\lambda)$ and $Y \sim Exp(\mu)$ with $X \perp Y$. What is the PDF of $Z = X - Y$?
Solution:
Recall the story of the Exponential distribution: one can think of $X$ and $Y$ as waiting times for two independent things. For example,
- $X$ as the waiting time for a red car passing by
- $Y$ as the waiting time for a blue car
If we see a blue car pass by first, then by the memoryless property the additional waiting time for a red car still has the same distribution as $X$. Likewise, if we see a red car pass by first, the additional waiting time for a blue car is distributed the same as $Y$. This additional waiting time (with a sign attached) is essentially what we are interested in, namely $Z$.
The above intuition says that the conditional distribution of $X-Y$ given $X > Y$ is the distribution of $X$, and the conditional distribution of $X-Y$ given $X \le Y$ is the distribution of $-Y$ (in other words, the conditional distribution of $Y-X$ given $Y \ge X$ is the same as the distribution of $Y$).
To make full use of this intuition, note the following:
- If $X > Y$, which means $Z > 0$, then $(Z ~|~ X>Y)$ has the same distribution as $X$ (an equality in distribution, not almost surely), that is,
$$f_{Z ~|~ X>Y}(z) = \lambda e^{-\lambda z}, \quad z > 0.$$
- If $X < Y$, which means $Z < 0$, then $(Z ~|~ X<Y)$ has the same distribution as $-Y$, that is,
$$f_{Z ~|~ X<Y}(z) = \mu e^{\mu z}, \quad z < 0.$$
However, this is just a sketch; we now derive the form mathematically.
From the above point of view, it is natural to find the PDF of $Z$ by cases according to the sign of $z$.
- If $z > 0$, then only the event $X > Y$ contributes, since $Z \le 0$ on $\{X \le Y\}$. By LOTP and eg.1 (which gives $P(X > Y) = \frac{\mu}{\lambda + \mu}$),
$$f_Z(z) = P(X > Y)\, f_{Z ~|~ X>Y}(z) = \frac{\mu}{\lambda + \mu} \cdot \lambda e^{-\lambda z} = \frac{\lambda \mu}{\lambda + \mu} e^{-\lambda z}.$$
- If $z \le 0$, then only the event $X \le Y$ contributes, and $P(X \le Y) = \frac{\lambda}{\lambda + \mu}$, so
$$f_Z(z) = P(X \le Y)\, f_{Z ~|~ X \le Y}(z) = \frac{\lambda}{\lambda + \mu} \cdot \mu e^{\mu z} = \frac{\lambda \mu}{\lambda + \mu} e^{\mu z}.$$
Therefore, the PDF of $Z$ has the form
$$f_Z(z) = \frac{\lambda \mu}{\lambda + \mu} \begin{cases} e^{-\lambda z}, & z > 0, \\ e^{\mu z}, & z \le 0. \end{cases}$$
Note: $P(X=Y)=0$, since the corresponding region of integration is the line $y=x$, which has measure zero in the plane; hence $P(Z=0) = 0$. This is why we need not treat the case $X=Y$ separately.
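Finally, a simulation sketch (rates arbitrary) that checks both the conditional-distribution claims from the intuition and the closed-form PDF of $Z$:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, mu = 2.0, 5.0                  # arbitrary rates for X and Y
n = 1_000_000

x = rng.exponential(scale=1/lam, size=n)
y = rng.exponential(scale=1/mu, size=n)
z = x - y

# conditional claims: (Z | X>Y) ~ Exp(lam), (-Z | X<=Y) ~ Exp(mu)
print(z[z > 0].mean(), 1/lam)       # both ~ 0.5
print((-z[z <= 0]).mean(), 1/mu)    # both ~ 0.2

def pdf_z(t):
    """Closed-form PDF of Z = X - Y derived above."""
    c = lam*mu/(lam + mu)
    return c*np.exp(-lam*t) if t > 0 else c*np.exp(mu*t)

# compare a crude histogram density estimate with the formula
for t in [-0.5, 0.0, 0.5, 1.0]:
    h = 0.02
    emp = np.mean(np.abs(z - t) < h/2) / h
    print(f"t={t}: empirical {emp:.3f}, formula {pdf_z(t):.3f}")
```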