Poisson

Poisson Random Variables

Definition[Poisson Random Variable]
A discrete random variable X, with possible values 0, 1, 2, ..., is called Poisson with parameter \lambda > 0 if

    \begin{align*} P(X = i) &= \frac{e^{-\lambda} \lambda^{i}}{i!}, & i = 0,1,2,... \end{align*}

The expectation, variance, and standard deviation of a Poisson random variable are

    \begin{align*} E[X] &= \lambda \\ V[X] &= \lambda \\ \sigma_X &= \sqrt{\lambda} \end{align*}
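As a quick numerical check of these formulas (a minimal sketch, assuming NumPy and SciPy are available; the value \lambda = 3 is an arbitrary choice), we can compare the moments reported by scipy.stats.poisson against \lambda and \sqrt{\lambda}:

    import numpy as np
    from scipy.stats import poisson

    lam = 3.0  # arbitrary choice of the parameter lambda

    # pmf values P(X = i) = e^{-lambda} lambda^i / i! for i = 0, ..., 9
    i = np.arange(10)
    print(poisson.pmf(i, mu=lam))

    # mean, variance, and standard deviation: lambda, lambda, sqrt(lambda)
    print(poisson.mean(mu=lam), poisson.var(mu=lam), poisson.std(mu=lam))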


Poisson Approximation to Binomial

One use of the Poisson distribution is as an approximation to the binomial distribution, as described by the following proposition.
Proposition:
In the binomial pmf b(x;n,p), let n \rightarrow \infty and p \rightarrow 0 in such a way that np \rightarrow \lambda > 0. Then b(x;n,p) \rightarrow p(x;\lambda), the Poisson pmf with parameter \lambda.
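To see why this holds (a short sketch, taking p = \lambda/n exactly for simplicity), write out the binomial pmf and pass to the limit:

    \begin{align*} b(x;n,\lambda/n) &= \binom{n}{x}\left(\frac{\lambda}{n}\right)^{x}\left(1 - \frac{\lambda}{n}\right)^{n-x} \\ &= \frac{\lambda^{x}}{x!} \cdot \frac{n(n-1)\cdots(n-x+1)}{n^{x}} \left(1 - \frac{\lambda}{n}\right)^{n}\left(1 - \frac{\lambda}{n}\right)^{-x} \\ &\rightarrow \frac{\lambda^{x}}{x!} \cdot 1 \cdot e^{-\lambda} \cdot 1 = \frac{e^{-\lambda} \lambda^{x}}{x!} = p(x;\lambda), \end{align*}

since the middle factor tends to 1 and \left(1 - \lambda/n\right)^{n} \rightarrow e^{-\lambda} as n \rightarrow \infty.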


Remark:
If p < 0.1 and np \leq 10, then such an approximation is generally good; if np > 10, it is better to use a normal approximation, discussed in a later chapter.

Remark:
What this means is that for experiments with a large number of trials n and a small probability of success p, we can approximate a binomial random variable by a Poisson random variable with \lambda = np. Note also that if p is small, then (1 - p) is close to one, so the binomial variance np(1 - p) \approx np = \lambda, which matches the variance of the approximating Poisson distribution.
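A minimal numerical sketch of this approximation (assuming NumPy and SciPy; the values n = 500 and p = 0.01 are arbitrary choices satisfying p < 0.1 and np \leq 10):

    import numpy as np
    from scipy.stats import binom, poisson

    n, p = 500, 0.01           # large n, small p (arbitrary illustrative values)
    lam = n * p                # lambda = np = 5

    x = np.arange(16)
    b = binom.pmf(x, n, p)     # exact binomial probabilities b(x; n, p)
    q = poisson.pmf(x, lam)    # Poisson approximation p(x; lambda)

    # largest absolute difference between the two pmfs over x = 0, ..., 15
    print(np.max(np.abs(b - q)))

For these values the printed difference should be roughly on the order of 10^{-3}, which gives a sense of how close the two pmfs are in this regime.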


Poisson Processes

Definition[Poisson Process]
Suppose that at time t = 0 we begin counting the number of events that occur. Then for each value of t we obtain a number, denoted by N(t), which is the number of events that occur during [0,t]. For each value of t, N(t) is a discrete random variable with the set of possible values \{0,1,2,...\}. To study the process given by N(t), we make the following simple assumptions about the way the events occur:

  • Stationarity: For all n \geq 0 and for any two time intervals \Delta_1, \Delta_2 of equal length, the probability of n events in \Delta_1 is equal to the probability of n events in \Delta_2.
  • Independent Increments: For all n \geq 0 and for any time interval (t, t+s), the probability of n events in (t, t+s) is independent of how many events have occurred earlier. In particular, suppose that the times 0 \leq t_1 < t_2 < ... < t_k are given. For 1 \leq i \leq k - 1, let A_i be the event that n_i events of the process occur in [t_i, t_{i+1}). The independent-increments property means that \{A_1, A_2, ..., A_{k-1}\} is an independent set of events.
  • Orderliness: The occurrence of two or more events in a very small time interval is practically impossible. This condition is expressed by

        \begin{align*} \lim_{h \rightarrow 0} \frac{P(N(h) > 1)}{h} = 0. \end{align*}

     This implies that as h \rightarrow 0, the probability of two or more events, P(N(h) > 1), approaches 0 faster than h does. That is, if h is negligible, then P(N(h) > 1) is even more negligible.
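     As a concrete illustration (anticipating the theorem below, where N(h) is Poisson with parameter \lambda h), for small h we have

        \begin{align*} P(N(h) > 1) &= 1 - P(N(h) = 0) - P(N(h) = 1) = 1 - e^{-\lambda h} - \lambda h e^{-\lambda h} = \frac{(\lambda h)^{2}}{2} + O(h^{3}), \end{align*}

     so P(N(h) > 1)/h \rightarrow 0 as h \rightarrow 0, as required.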

 

We will prove in a later chapter that the simultaneous occurrence of two or more events is impossible. Therefore, under the aforementioned properties, events occur one at a time, not in pairs or groups.


Theorem:
If random events occur in time in such a way that the above conditions are always satisfied, N(0) = 0, and, for all t > 0, 0 < P(N(t) = 0) < 1, then there exists a positive number \lambda such that

    \begin{align*} P(N(t) = n) &= \frac{(\lambda t)^n e^{-\lambda t}}{n!} \end{align*}

That is, for all t > 0, N(t) is a Poisson random variable with parameter \lambda t. Hence E\left[ N(t) \right] = \lambda t and therefore \lambda = E\left[ N(1) \right].
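
A minimal simulation sketch of this theorem (assuming NumPy; the values \lambda = 2 and t = 3 are arbitrary). Following the binomial approximation above, the interval [0,t] is split into n tiny subintervals, each containing an event independently with probability \lambda t/n, and the resulting counts N(t) are compared with the Poisson distribution with parameter \lambda t:

    import numpy as np

    rng = np.random.default_rng(0)
    lam, t = 2.0, 3.0           # arbitrary rate and time horizon
    n = 100_000                 # number of tiny subintervals of [0, t]
    trials = 10_000             # number of simulated runs of the process

    # Each run counts events over [0, t]: every subinterval holds an event with
    # probability lam*t/n, independently (stationarity + independent increments).
    counts = rng.binomial(n, lam * t / n, size=trials)

    # Sample mean and variance of N(t) should both be close to lam * t = 6.
    print(counts.mean(), counts.var())

    # Empirical P(N(t) = 0) versus the theorem's value e^{-lambda t}.
    print((counts == 0).mean(), np.exp(-lam * t))

Both printed comparisons should agree to within sampling error, consistent with N(t) being Poisson with parameter \lambda t.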