Probability and Stochastic Processes
An Introduction
Definition:
If $(X, \mathfrak{M}, \mu)$ is a measure space, and $s\colon X \to [0,\infty)$ is a measurable simple function of the form
\[ s = \sum_{i=1}^{n} \alpha_i \chi_{A_i}, \]
where $\alpha_1, \dots, \alpha_n$ are the distinct values of $s$, and if $E \in \mathfrak{M}$, we define
\[ \int_E s \, d\mu = \sum_{i=1}^{n} \alpha_i \, \mu(A_i \cap E). \]
If $f\colon X \to [0,\infty]$ is measurable, and $E \in \mathfrak{M}$, we define
\[ \int_E f \, d\mu = \sup \int_E s \, d\mu, \]
the supremum taken over all simple measurable functions $s$ such that $0 \le s \le f$.
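For example, take $X = \mathbb{R}$ with Lebesgue measure $m$ and the simple function $s = 2\chi_{[0,1]} + 5\chi_{(1,3]}$ (an illustrative choice). Then
\[ \int_{\mathbb{R}} s \, dm = 2\, m([0,1]) + 5\, m((1,3]) = 2 \cdot 1 + 5 \cdot 2 = 12, \qquad \int_{[0,2]} s \, dm = 2 \cdot 1 + 5 \cdot 1 = 7, \]
so the set $E$ over which we integrate genuinely matters.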
Theorem:[Lebesgue's Monotone Convergence Theorem]
Let $\{f_n\}$ be a sequence of measurable functions on $X$, and suppose that
(a) $0 \le f_1(x) \le f_2(x) \le \dots \le \infty$ for every $x \in X$, and
(b) $f_n(x) \to f(x)$ as $n \to \infty$, for every $x \in X$.
Then $f$ is measurable, and
\[ \int_X f_n \, d\mu \to \int_X f \, d\mu \quad \text{as } n \to \infty. \]
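As a quick illustration, on $X = [0,\infty)$ with Lebesgue measure $m$, let $f_n = \chi_{[0,n]}$. Then $0 \le f_1 \le f_2 \le \dots$ and $f_n(x) \to 1$ for every $x$, and indeed
\[ \int_X f_n \, dm = n \longrightarrow \infty = \int_X 1 \, dm, \]
so the conclusion holds even when both sides are infinite.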
Theorem:
If $f_n\colon X \to [0,\infty]$ is measurable, for $n = 1, 2, 3, \dots$, and
\[ f(x) = \sum_{n=1}^{\infty} f_n(x) \qquad (x \in X), \]
then
\[ \int_X f \, d\mu = \sum_{n=1}^{\infty} \int_X f_n \, d\mu. \]
Theorem:[Fatou's Lemma]
If $f_n\colon X \to [0,\infty]$ is measurable, for each positive integer $n$, then
\[ \int_X \Big( \liminf_{n \to \infty} f_n \Big) \, d\mu \le \liminf_{n \to \infty} \int_X f_n \, d\mu. \]
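The inequality can be strict: on $\mathbb{R}$ with Lebesgue measure $m$, take $f_n = \chi_{[n, n+1]}$. Then $\liminf_{n \to \infty} f_n(x) = 0$ for every $x$, while $\int_{\mathbb{R}} f_n \, dm = 1$ for every $n$, so
\[ \int_{\mathbb{R}} \Big( \liminf_{n \to \infty} f_n \Big) \, dm = 0 \le 1 = \liminf_{n \to \infty} \int_{\mathbb{R}} f_n \, dm. \]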
Definition:
We define $L^1(\mu)$ to be the collection of all complex measurable functions $f$ on $X$ for which
\[ \int_X |f| \, d\mu < \infty. \]
Theorem:[Lebesgue's Dominated Convergence Theorem] Suppose $\{f_n\}$ is a sequence of complex measurable functions on $X$ such that
\[ f(x) = \lim_{n \to \infty} f_n(x) \]
exists for every $x \in X$. If there is a function $g \in L^1(\mu)$ such that
\[ |f_n(x)| \le g(x) \qquad (n = 1, 2, 3, \dots;\ x \in X), \]
then $f \in L^1(\mu)$,
\[ \lim_{n \to \infty} \int_X |f_n - f| \, d\mu = 0, \]
and
\[ \lim_{n \to \infty} \int_X f_n \, d\mu = \int_X f \, d\mu. \]
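The dominating function $g$ cannot be dropped: on $[0,1]$ with Lebesgue measure $m$, let $f_n = n\chi_{(0, 1/n)}$. Then $f_n(x) \to 0$ for every $x$, yet $\int_{[0,1]} f_n \, dm = 1$ for all $n$. No $g \in L^1(m)$ dominates the sequence, since any such $g$ would have to satisfy $g(x) \ge \sup_n f_n(x) \ge \tfrac{1}{x} - 1$ on $(0,1)$, which is not integrable.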
Discrete and Continuous Random Variables

Definition: A random variable $X$ is a measurable function $X\colon \Omega \to S$, where $(\Omega, \mathcal{F}, P)$ is a probability space, and $(S, \mathcal{S})$ is a measurable space. A random variable does not return a probability; it can be thought of as returning the result of an experiment whose outcome we do not know ahead of time. Example: How many outcomes are there if we throw five dice?\\
Solution: Let $\Omega_i$ be the set of all possible outcomes for the $i$th die, so $|\Omega_i| = 6$. The number of outcomes of throwing five dice equals the number of ways we can choose an element from each of the $\Omega_i$. So the number of outcomes is
\[ |\Omega_1| \cdot |\Omega_2| \cdots |\Omega_5| = 6^5 = 7776. \]
Definition: If $X$ is a random variable from $(\Omega, \mathcal{F}, P)$ to $(\mathbb{R}, \mathcal{B})$, then the function $F_X$ defined on $\mathbb{R}$ by
\[ F_X(x) = P(X \le x) \]
is called the \textbf{distribution function} of $X$. In the vein of measure theory we define the function as follows:
\[ F_X(x) = P(\{\omega \in \Omega : X(\omega) \le x\}). \]
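For example, let $X$ be the number of heads in a single toss of a fair coin, so $X$ takes the values $0$ and $1$, each with probability $\tfrac{1}{2}$. Then
\[ F_X(x) = \begin{cases} 0, & x < 0, \\ \tfrac{1}{2}, & 0 \le x < 1, \\ 1, & x \ge 1, \end{cases} \]
a nondecreasing, right-continuous step function.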
If $F_X$ is constant almost everywhere except at the points $x_1, x_2, \dots$, where it jumps, and we denote $p_i = P(X = x_i)$, then the collection $\{p_i\}$ is the traditional probability mass function for a discrete random variable. We have the following properties of $F_X$, which hold regardless of whether $X$ is a continuous, discrete, or merely measurable random variable.
Useful identities involving Distribution Functions:
For $a < b$, the event $\{X \le b\}$ is the disjoint union of $\{X \le a\}$ and $\{a < X \le b\}$. So then
\[ P(a < X \le b) = F_X(b) - F_X(a), \]
and since the probability measure $P$ is continuous along increasing sequences of events,
\[ P(X < b) = \lim_{a \to b^-} F_X(a) = F_X(b^-), \]
and so
\[ P(X = b) = F_X(b) - F_X(b^-), \]
where we use $F_X(b^-)$ to denote the left-hand limit of $F_X$ at $b$.
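Applying these identities to the coin-toss distribution above:
\[ P(X = 1) = F_X(1) - F_X(1^-) = 1 - \tfrac{1}{2} = \tfrac{1}{2}, \qquad P(0 < X \le 1) = F_X(1) - F_X(0) = 1 - \tfrac{1}{2} = \tfrac{1}{2}. \]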
Definition: The expected value of a random variable $X$ defined on $(\Omega, \mathcal{F}, P)$, a probability space, is
\[ E[X] = \int_\Omega X \, dP. \]
We also call the expected value the \textbf{mean}.
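For instance, if $X$ is the outcome of one roll of a fair six-sided die, then $X$ is a simple function taking the values $1, \dots, 6$ on sets of probability $\tfrac{1}{6}$, so the abstract integral reduces to a finite sum:
\[ E[X] = \int_\Omega X \, dP = \sum_{i=1}^{6} i \cdot \tfrac{1}{6} = \tfrac{21}{6} = 3.5. \]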
Theorem: Let $X$ be a random variable on $(\Omega, \mathcal{F}, P)$ and let $g\colon \mathbb{R} \to \mathbb{R}$ be a Borel measurable real-valued function; then $g(X)$ is a random variable, with
\[ E[g(X)] = \int_\Omega g(X) \, dP. \]
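Continuing the die example with $g(x) = x^2$: the random variable $g(X) = X^2$ takes the values $1, 4, 9, 16, 25, 36$, each with probability $\tfrac{1}{6}$, so
\[ E[X^2] = \sum_{i=1}^{6} i^2 \cdot \tfrac{1}{6} = \tfrac{91}{6}. \]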
Definition: Let $X$ be a random variable defined on $(\Omega, \mathcal{F}, P)$, a probability space; the variance ($\mathrm{Var}(X)$, also written $\sigma^2$) and standard deviation ($\sigma$) are defined by
\[ \mathrm{Var}(X) = E\big[(X - E[X])^2\big] \qquad \text{and} \qquad \sigma = \sqrt{\mathrm{Var}(X)}, \]
respectively.
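Expanding the square and using linearity of the integral gives the convenient identity $\mathrm{Var}(X) = E[X^2] - (E[X])^2$. For the fair die above,
\[ \mathrm{Var}(X) = \tfrac{91}{6} - \left(\tfrac{7}{2}\right)^2 = \tfrac{35}{12}, \qquad \sigma = \sqrt{\tfrac{35}{12}} \approx 1.71. \]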
Theorem: Let $X$ be a random variable defined on $(\Omega, \mathcal{F}, P)$, a probability space, with mean $\mu$. Then $\mathrm{Var}(X) = 0$ if and only if $X$ is constant with probability 1.
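One way to see the forward direction: if $\mathrm{Var}(X) = \int_\Omega (X - \mu)^2 \, dP = 0$, then since the integrand is nonnegative, $(X - \mu)^2 = 0$ almost everywhere with respect to $P$, i.e.\ $P(X = \mu) = 1$. The converse is immediate from the definition.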
Theorem: Let $X$ be a random variable defined on $(\Omega, \mathcal{F}, P)$, a probability space; then for constants $a, b \in \mathbb{R}$ we have
\[ \mathrm{Var}(aX + b) = a^2\, \mathrm{Var}(X). \]
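A short derivation of this identity, using linearity of expectation: $E[aX + b] = aE[X] + b$, so
\[ \mathrm{Var}(aX + b) = E\big[(aX + b - aE[X] - b)^2\big] = a^2\, E\big[(X - E[X])^2\big] = a^2\, \mathrm{Var}(X). \]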
Definition: Let $X$ and $Y$ be two random variables defined on $(\Omega, \mathcal{F}, P)$, a probability space, and let $c$ be a given point. If for all $\varepsilon > 0$,
\[ P(|X - c| \le \varepsilon) \ge P(|Y - c| \le \varepsilon), \]
then we say that $X$ is more concentrated about $c$ than $Y$.
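For example, if $X = c$ with probability 1, then $P(|X - c| \le \varepsilon) = 1$ for every $\varepsilon > 0$, so $X$ is more concentrated about $c$ than any other random variable $Y$.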
If we let $c = \mu$, the mean, these constructions lead to some important information about the random variables we are working with.