Discrete Probability
An Introduction to Discrete Probability
Finite Probability
An experiment is a procedure that yields one of a given set of possible outcomes. The sample space of the experiment is the set of possible outcomes. An event is a subset of the sample space. Laplace's definition of the probability of an event with finitely many possible outcomes will now be stated.
- Definition :
If S is a finite nonempty sample space of equally likely outcomes, and E is an event, that is, a subset of S, then the probability of E is p(E) = |E| / |S|.
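Laplace's definition can be illustrated with a short sketch (Python is used here purely for illustration; the helper name is my own):

```python
from fractions import Fraction

def laplace_probability(event, sample_space):
    """p(E) = |E| / |S| for a finite sample space of equally likely outcomes."""
    event = set(event) & set(sample_space)   # an event is a subset of S
    return Fraction(len(event), len(sample_space))

# One roll of a fair six-sided die: probability of an even outcome.
S = {1, 2, 3, 4, 5, 6}
E = {2, 4, 6}
print(laplace_probability(E, S))  # → 1/2
```

Using `Fraction` keeps the ratio |E|/|S| exact rather than approximating it with floating point.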
Probabilities of Complements and Unions of Events
We can use counting techniques to find the probability of events derived from other events.
- Theorem :
Let E be an event in a sample space S. The probability of the event Ē = S − E, the complementary event of E, is given by p(Ē) = 1 − p(E).
- Theorem :
Let E and F be events in the sample space S. Then p(E ∪ F) = p(E) + p(F) − p(E ∩ F).
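Both rules can be checked directly on a small sample space; this sketch uses one roll of a fair die (names chosen for illustration):

```python
from fractions import Fraction

def p(event, space):
    """Laplace probability |E ∩ S| / |S| for equally likely outcomes."""
    return Fraction(len(set(event) & set(space)), len(space))

S = set(range(1, 7))   # one roll of a fair die
E = {1, 2, 3}          # roll at most 3
F = {2, 4, 6}          # roll an even number

# Complement rule: p(Ē) = 1 − p(E)
assert p(S - E, S) == 1 - p(E, S)

# Inclusion–exclusion: p(E ∪ F) = p(E) + p(F) − p(E ∩ F)
assert p(E | F, S) == p(E, S) + p(F, S) - p(E & F, S)
print(p(E | F, S))  # → 5/6
```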
Probability Theory
Assigning Probabilities
Let S be the sample space of an experiment with a finite or countable number of outcomes. We assign a probability p(s) to each outcome s, requiring that
- (I)
0 ≤ p(s) ≤ 1 for each s ∈ S, and
- (II)
Σ_{s∈S} p(s) = 1.
(Note that when the sample space is infinite, Σ_{s∈S} p(s) is a convergent infinite series.)
The function p from the set of all outcomes of the sample space S is called a probability distribution.
- Definition :
Suppose that S is a set with n elements. The uniform distribution assigns the probability 1/n to each element of S.
- Definition :
The probability of the event E is the sum of the probabilities of the outcomes in E. That is, p(E) = Σ_{s∈E} p(s). (Note that when E is an infinite set, Σ_{s∈E} p(s) is a convergent infinite series.)
Note that the uniform distribution assigns the same probability to an event that Laplace's original definition of probability assigns to this event.
The experiment of selecting an element from a sample space with a uniform distribution is called selecting an element of S at random.
- Theorem :
If E1, E2, … is a sequence of pairwise disjoint events in a sample space S, then p(⋃_i Ei) = Σ_i p(Ei). (Note that this theorem applies when the sequence consists of a finite number or a countably infinite number of pairwise disjoint events.)
Conditional Probability
In general, to find the conditional probability of E given F, we use F as the new sample space; for an outcome from E to occur, the outcome must also belong to E ∩ F.
- Definition :
Let E and F be events with p(F) > 0. The conditional probability of E given F, denoted by p(E | F), is defined as p(E | F) = p(E ∩ F) / p(F).
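For equally likely outcomes, p(E ∩ F)/p(F) reduces to |E ∩ F| / |F|, which this minimal sketch exploits (the function name is my own):

```python
from fractions import Fraction

def conditional(E, F):
    """p(E | F) = p(E ∩ F) / p(F); for equally likely outcomes this is |E ∩ F| / |F|."""
    E, F = set(E), set(F)
    assert F, "p(F) must be positive"
    return Fraction(len(E & F), len(F))

# Two fair coin flips: F = first flip is heads, E = both flips are heads.
S = {("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")}
F = {s for s in S if s[0] == "H"}
E = {("H", "H")}
print(conditional(E, F))  # → 1/2
```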
Independence
- Definition :
The events E and F are independent if and only if p(E ∩ F) = p(E)p(F).
Pairwise and Mutual Independence
- Definition :
The events E1, E2, …, En are pairwise independent if and only if p(Ei ∩ Ej) = p(Ei)p(Ej) for all pairs of integers i and j with 1 ≤ i < j ≤ n. These events are mutually independent if p(Ei1 ∩ Ei2 ∩ ⋯ ∩ Eim) = p(Ei1)p(Ei2)⋯p(Eim) whenever ij, j = 1, 2, …, m, are integers with 1 ≤ i1 < i2 < ⋯ < im ≤ n and m ≥ 2.
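Pairwise independence does not imply mutual independence. A classic example uses two fair coin flips with a third event "the flips agree"; this sketch checks both conditions:

```python
from fractions import Fraction
from itertools import combinations

S = {"HH", "HT", "TH", "TT"}            # two fair coin flips
E1 = {s for s in S if s[0] == "H"}       # first flip is heads
E2 = {s for s in S if s[1] == "H"}       # second flip is heads
E3 = {s for s in S if s[0] == s[1]}      # the two flips agree

def p(E):
    return Fraction(len(E), len(S))

events = [E1, E2, E3]
pairwise = all(p(A & B) == p(A) * p(B) for A, B in combinations(events, 2))
mutual = p(E1 & E2 & E3) == p(E1) * p(E2) * p(E3)
print(pairwise, mutual)  # → True False
```

Each pairwise intersection is {HH} with probability 1/4 = (1/2)(1/2), but the triple intersection is also {HH}, and 1/4 ≠ 1/8.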
Bernoulli Trials and the Binomial Distribution
Suppose that an experiment can have only two possible outcomes. For instance, when a bit is generated at random, the possible outcomes are 0 and 1. Each performance of such an experiment is called a Bernoulli trial; one of the outcomes is called a success and the other a failure.
- Theorem :
The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is C(n, k) p^k q^(n−k).
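The binomial probability C(n, k) p^k q^(n−k) translates directly into code; this sketch evaluates it for a fair coin and confirms the probabilities sum to 1:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n independent Bernoulli trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 4 heads in 10 flips of a fair coin: C(10,4)/2^10 ≈ 0.2051.
print(binomial_pmf(4, 10, 0.5))

# Over k = 0, 1, …, n the probabilities must sum to 1.
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
```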
Random Variables
A random variable is a function from the sample space of an experiment to the set of real numbers. That is, a random variable assigns a real number to each possible outcome.
- Definition :
The distribution of a random variable X on a sample space S is the set of pairs (r, p(X = r)) for all r ∈ X(S), where p(X = r) is the probability that X takes the value r. The set of pairs in this distribution is determined by the probabilities p(X = r) for r ∈ X(S).
The Probabilistic Method
- Theorem : The Probabilistic Method
If the probability that an element chosen at random from a set S does not have a particular property is less than 1, there exists an element in S with this property.
- Theorem :
If k is an integer with k ≥ 2, then the Ramsey number R(k, k) satisfies R(k, k) ≥ 2^(k/2).
Bayes' Theorem
- Theorem : Bayes' Theorem
Suppose that E and F are events from a sample space S such that p(E) ≠ 0 and p(F) ≠ 0. Then p(F | E) = p(E | F)p(F) / (p(E | F)p(F) + p(E | F̄)p(F̄)).
Generalizing Bayes' Theorem
Note that in the statement of Bayes' theorem, the events F and F̄ are mutually exclusive and together cover the entire sample space S. The theorem can be generalized to any finite collection of mutually exclusive events whose union is S.
- Theorem : Generalized Bayes' Theorem
Suppose that E is an event from a sample space S and that F1, F2, …, Fn are mutually exclusive events such that ⋃_{i=1}^{n} Fi = S. Assume that p(E) ≠ 0 and p(Fi) ≠ 0 for i = 1, 2, …, n. Then p(Fj | E) = p(E | Fj)p(Fj) / Σ_{i=1}^{n} p(E | Fi)p(Fi).
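The generalized formula is a one-liner over the priors p(Fi) and likelihoods p(E | Fi). The sketch below applies it to a hypothetical diagnostic test (all numbers are illustrative assumptions, not data from the text):

```python
def bayes(j, priors, likelihoods):
    """p(F_j | E) from priors p(F_i) and likelihoods p(E | F_i)."""
    total = sum(pf * le for pf, le in zip(priors, likelihoods))
    return priors[j] * likelihoods[j] / total

# Hypothetical test: prevalence 1%, sensitivity p(E|F) = 0.99,
# false-positive rate p(E|F̄) = 0.05.
priors = [0.01, 0.99]        # F = has condition, F̄ = does not
likelihoods = [0.99, 0.05]   # p(positive | F), p(positive | F̄)
posterior = bayes(0, priors, likelihoods)
print(round(posterior, 4))   # ≈ 0.1667
```

Even with a 99%-sensitive test, a positive result here implies only about a 1-in-6 chance of having the condition, because the prior is so small.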
Expected Value and Variance
The expected value of a random variable is the sum over all elements in a sample space of the product of the probability of the element and the value of the random variable at this element. Consequently, the expected value is a weighted average of the values of a random variable. The expected value of a random variable provides a central point for the distribution of values of this random variable.
-
Definition : The expected value, also called the expectation or mean, of the random variable X on the sample space S is equal to E(X) = Σ_{s∈S} p(s)X(s). The deviation of X at s ∈ S is X(s) − E(X), the difference between the value of X and the mean of X.
-
Theorem : If X is a random variable and p(X = r) is the probability that X = r, so that p(X = r) = Σ_{s∈S, X(s)=r} p(s), then E(X) = Σ_{r∈X(S)} p(X = r) r.
-
Theorem : The expected number of successes when n mutually independent Bernoulli trials are performed, where p is the probability of success on each trial, is np.
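The last two results can be cross-checked: summing r · p(X = r) over the binomial distribution should give exactly np. A minimal sketch, with n and p chosen arbitrarily:

```python
from math import comb

n, prob_success = 8, 0.25
pmf = {k: comb(n, k) * prob_success**k * (1 - prob_success)**(n - k)
       for k in range(n + 1)}

# E(X) = Σ_r r · p(X = r)
expected = sum(r * pr for r, pr in pmf.items())
print(expected)  # ≈ 2.0, which equals n · p
assert abs(expected - n * prob_success) < 1e-9
```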
Linearity of Expectations
- Theorem :
If Xi, i = 1, 2, …, n, with n a positive integer, are random variables on S, and if a and b are real numbers, then
- (I)
E(X1 + X2 + ⋯ + Xn) = E(X1) + E(X2) + ⋯ + E(Xn), and
- (II)
E(aX + b) = aE(X) + b.
Average-Case Computational Complexity
Computing the average-case computational complexity of an algorithm can be interpreted as computing the expected value of a random variable.
Let the sample space of an experiment be the set of possible inputs aj, j = 1, 2, …, n, and let the random variable X assign to aj the number of operations the algorithm uses on input aj. Then the average-case complexity of the algorithm is E(X) = Σ_{j=1}^{n} p(aj)X(aj).
The Geometric Distribution
We now turn our attention to a random variable with infinitely many possible outcomes.
- Definition :
A random variable X has a geometric distribution with parameter p if p(X = k) = (1 − p)^(k−1) p for k = 1, 2, 3, …, where p is a real number with 0 ≤ p ≤ 1.
- Theorem :
If the random variable X has the geometric distribution with parameter p, then E(X) = 1/p.
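The theorem says the infinite series Σ k(1 − p)^(k−1) p converges to 1/p. A quick numerical sketch with p = 0.2 (so E(X) should be 5):

```python
p = 0.2
# Partial sum of Σ k (1 − p)^(k−1) p; the tail beyond k = 500 is negligible.
approx = sum(k * (1 - p)**(k - 1) * p for k in range(1, 500))
print(round(approx, 6))  # → 5.0
```

Intuitively, X counts the trial on which the first success occurs, so with success probability 1/5 per trial we wait 5 trials on average.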
Independent Random Variables
-
Definition : The random variables X and Y on a sample space S are independent if p(X = r1 and Y = r2) = p(X = r1) · p(Y = r2). In words, the probability that X = r1 and Y = r2 equals the product of the probabilities that X = r1 and Y = r2, for all real numbers r1 and r2.
-
Theorem : If X and Y are independent random variables on a sample space S, then E(XY) = E(X)E(Y).
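With X and Y the values of two independent rolls of a fair die, E(XY) = E(X)E(Y) can be verified exactly:

```python
from fractions import Fraction

die = range(1, 7)
joint = Fraction(1, 36)                         # each (x, y) pair equally likely
EXY = sum(x * y * joint for x in die for y in die)
EX  = sum(x * Fraction(1, 6) for x in die)      # 7/2

print(EXY == EX * EX)  # → True (both sides equal 49/4)
```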
Variance
The expected value of a random variable tells us its average value, but nothing about how widely its values are distributed. The variance of a random variable provides such a measure of spread.
-
Definition : Let X be a random variable on a sample space S. The variance of X, denoted by V(X), is V(X) = Σ_{s∈S} (X(s) − E(X))² p(s). That is, V(X) is the weighted average of the square of the deviation of X. The standard deviation of X, denoted σ(X), is defined to be √V(X).
-
Theorem : If X is a random variable on a sample space S, then V(X) = E(X²) − E(X)².
-
Corollary : If X is a random variable on a sample space S and E(X) = μ, then V(X) = E((X − μ)²).
-
Theorem : Bienaymé's Formula If X and Y are two independent random variables on a sample space S, then V(X + Y) = V(X) + V(Y). Furthermore, if Xi, i = 1, 2, …, n, with n a positive integer, are pairwise independent random variables on S, then V(X1 + X2 + ⋯ + Xn) = V(X1) + V(X2) + ⋯ + V(Xn).
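The shortcut V(X) = E(X²) − E(X)² is easy to evaluate exactly for one roll of a fair die, where E(X) = 7/2:

```python
from fractions import Fraction

S = range(1, 7)                        # one roll of a fair die
prob = Fraction(1, 6)
EX  = sum(s * prob for s in S)         # 7/2
EX2 = sum(s * s * prob for s in S)     # 91/6

V = EX2 - EX**2
print(V)  # → 35/12
```

The same value results from the definition, Σ (s − 7/2)² · 1/6, confirming the theorem for this distribution.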
Chebyshev's Inequality
- Theorem : Chebyshev's Inequality
Let X be a random variable on a sample space S with probability function p. If r is a positive real number, then p(|X(s) − E(X)| ≥ r) ≤ V(X)/r².
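Chebyshev's bound can be checked exactly on a fair die with r = 2: the left side counts outcomes at least 2 away from the mean 7/2 (namely 1 and 6), while the right side is V(X)/4.

```python
from fractions import Fraction

S = range(1, 7)                              # one roll of a fair die
prob = Fraction(1, 6)
EX = sum(s * prob for s in S)                # 7/2
V  = sum((s - EX)**2 * prob for s in S)      # 35/12

r = 2
lhs = sum(prob for s in S if abs(s - EX) >= r)   # p(|X − E(X)| ≥ r)
print(lhs, "<=", V / r**2)                       # 1/3 <= 35/48
assert lhs <= V / r**2
```

Here the bound is loose (1/3 versus 35/48), which is typical: Chebyshev's inequality holds for every distribution, so it cannot be tight for all of them.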