1 Discrete Probability Distributions Flashcards
(31 cards)
In this chapter, we shall first consider chance experiments with a finite number of
possible outcomes ω1, ω2, . . . , ωn. For example, we roll a die and the possible
outcomes are 1, 2, 3, 4, 5, 6 corresponding to the side that turns up. We toss a coin
with possible outcomes H (heads) and T (tails).
It is frequently useful to be able to refer to an outcome of an experiment. For
example, we might want to write the mathematical expression which gives the sum
of four rolls of a die. How could we do this?
To do this, we could let Xi, i = 1, 2, 3, 4, represent the values
of the outcomes of the four rolls, and then we could write the expression
X1 + X2 + X3 + X4
for the sum of the four rolls.
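As a minimal sketch (not part of the text), this expression can be simulated in Python using the standard random module:

import random

# Each Xi is the outcome of one roll of a fair die: an integer from 1 to 6.
rolls = [random.randint(1, 6) for _ in range(4)]

# The random variable of interest is the sum X1 + X2 + X3 + X4.
total = sum(rolls)
print(rolls, "sum =", total)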
In the expression X1 + X2 + X3 + X4 for the sum of four rolls of a die,
what are the Xi’s called?
Random Variables
What is a random variable?
A random variable
is simply an expression whose value is the outcome of a particular experiment.
Just as in the case of other types of variables in mathematics, random variables can
take on different values.
Let X be the random variable which represents the roll of one die. How shall we
assign probabilities to the possible outcomes of this experiment?
We do this by
assigning to each outcome ωj a nonnegative number m(ωj ) in such a way that
m(ω1) + m(ω2) + · · · + m(ω6) = 1
What do we call the function m(ωj ) for which
m(ω1) + m(ω2) + · · · + m(ω6) = 1?
The function m(ωj ) is called the distribution function of the random variable X.
How should we assign probabilities for our dice example?
We would assign equal probabilities, namely probability 1/6, to each of the outcomes. With this
assignment of probabilities, one could write
P(X ≤ 4) = 2/3
to mean that the probability is 2/3 that a roll of a die will have a value which does
not exceed 4.
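As a quick check (a sketch, not from the text), P(X ≤ 4) can be computed directly from the assignment of 1/6 to each outcome:

from fractions import Fraction

# Distribution function of X: each face 1, ..., 6 gets probability 1/6.
m = {face: Fraction(1, 6) for face in range(1, 7)}

# P(X <= 4) is the sum of m(face) over the faces not exceeding 4.
prob = sum(p for face, p in m.items() if face <= 4)
print(prob)  # prints 2/3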
In both experiments (dice and coin), each outcome is assigned an equal probability.
This would certainly not be the case in general. For example, if a drug is found to
be effective 30 percent of the time it is used, we might assign a probability .3 that
the drug is effective the next time it is used and .7 that it is not effective. What does this last
example illustrate?
The intuitive frequency concept of probability. That is, if we have
a probability p that an experiment will result in outcome A, then if we repeat this
experiment a large number of times we should expect that the fraction of times that
A will occur is about p.
We want to be able to perform an experiment that corresponds to a given set of
probabilities; for example, m(ω1) = 1/2, m(ω2) = 1/3, and m(ω3) = 1/6. How could we imagine an experiment that does this?
In this
case, one could mark three faces of a six-sided die with an ω1, two faces with an ω2,
and one face with an ω3.
How could we visualize probabilities in a general case?
In the general case we assume that m(ω1), m(ω2), . . . , m(ωn) are all rational
numbers, with least common denominator d. If d > 2, we can imagine a long
cylindrical die with a cross-section that is a regular d-gon. If m(ωj ) = nj/d, then
we can label nj of the long faces of the cylinder with an ωj , and if one of the end
faces comes up, we can just roll the die again. If d = 2, a coin could be used to
perform the experiment.
We will be particularly interested in repeating a chance experiment a large number
of times. Although the cylindrical die would be a convenient way to carry out
a few repetitions, why wouldn’t it be ideal?
It would be difficult to carry out a large number of experiments.
Since the modern computer can do a large number of operations in a very short
time, it is natural to turn to the computer for this task.
What is a computer analog of rolling a die?
This is done on the computer
by means of a random number generator.
We must first find a computer analog of rolling a die. This is done on the computer
by means of a random number generator. Depending upon the particular software
package, the computer can be asked for a real number between 0 and 1, or an integer
in a given set of consecutive integers. How is this done in the first case?
In the first case, the real numbers are chosen
in such a way that the probability that the number lies in any particular subinterval
of this unit interval is equal to the length of the subinterval.
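A minimal sketch of this property, assuming Python's random.random() as the generator (the subinterval (0.25, 0.65) is an arbitrary choice): the fraction of generated numbers that land in the subinterval should be close to its length.

import random

a, b = 0.25, 0.65    # an arbitrary subinterval of (0, 1); its length is 0.40
n = 100_000          # how many random real numbers to draw

hits = sum(1 for _ in range(n) if a < random.random() < b)
print("observed fraction:", hits / n, "subinterval length:", b - a)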
We must first find a computer analog of rolling a die. This is done on the computer
by means of a random number generator. Depending upon the particular software
package, the computer can be asked for a real number between 0 and 1, or an integer
in a given set of consecutive integers. How is this done in the second case?
In the second case,
each integer has the same probability of being chosen.
Let X be a random variable with distribution function m(ω), where ω is in the
set {ω1, ω2, ω3}, and m(ω1) = 1/2, m(ω2) = 1/3, and m(ω3) = 1/6. If our computer
package can return a random integer in the set {1, 2, …, 6}, what should we do?
If our computer
package can return a random integer in the set {1, 2, …, 6}, then we simply ask it
to do so, and make 1, 2, and 3 correspond to ω1, 4 and 5 correspond to ω2, and 6
correspond to ω3.
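A sketch of this correspondence in Python (the labels omega1, omega2, omega3 are just illustrative names):

import random

# The six equally likely integers are grouped so that omega1 receives
# probability 3/6 = 1/2, omega2 receives 2/6 = 1/3, and omega3 receives 1/6.
outcome_of = {1: "omega1", 2: "omega1", 3: "omega1",
              4: "omega2", 5: "omega2",
              6: "omega3"}

roll = random.randint(1, 6)   # the random integer from {1, 2, ..., 6}
print(outcome_of[roll])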
Let X be a random variable with distribution function m(ω), where ω is in the
set {ω1, ω2, ω3}, and m(ω1) = 1/2, m(ω2) = 1/3, and m(ω3) = 1/6. If our computer
package can return a random real number r in the
interval (0, 1), what should we do?
If our computer package returns a random real number r in the
interval (0, 1), then the expression
⌊6r⌋ + 1
will be a random integer between 1 and 6. (The notation ⌊x⌋ means the greatest
integer not exceeding x, and is read “floor of x.”)
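A sketch of the same recipe in Python, assuming random.random() for the random real number (it returns a value in [0, 1)):

import math
import random

r = random.random()           # random real number in [0, 1)
die = math.floor(6 * r) + 1   # floor(6r) + 1 is a random integer from 1 to 6
print(die)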
How does the program RandomNumbers work?
(Random Number Generation) The program RandomNumbers
generates n random real numbers in the interval [0, 1], where n is chosen by the
user.
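The book's RandomNumbers program itself is not reproduced here; a rough Python stand-in might look like this:

import random

def random_numbers(n):
    """Return n random real numbers in the interval [0, 1)."""
    return [random.random() for _ in range(n)]

print(random_numbers(5))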
(Coin Tossing) As we have noted, our intuition suggests that the
probability of obtaining a head on a single toss of a coin is 1/2. What’s the first way we can have the
computer toss a coin?
We can ask it to pick a random real number in the interval
[0, 1] and test to see if this number is less than 1/2. If so, we shall call the outcome
heads; if not, we shall call it tails.
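A sketch of this first method in Python:

import random

# Heads if the random real number is less than 1/2, tails otherwise.
toss = "H" if random.random() < 0.5 else "T"
print(toss)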
(Coin Tossing) As we have noted, our intuition suggests that the
probability of obtaining a head on a single toss of a coin is 1/2. What’s the second way we can have the
computer toss a coin?
We can ask the computer
to pick a random integer from the set {0, 1}. The program CoinTosses carries
out the experiment of tossing a coin n times.
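The CoinTosses program itself is not reproduced here; a rough Python stand-in for the second method, tossing the coin n times and reporting the proportion of heads, might be:

import random

def coin_tosses(n):
    """Toss a fair coin n times by picking random integers from {0, 1}."""
    tosses = [random.randint(0, 1) for _ in range(n)]   # 1 = heads, 0 = tails
    heads = sum(tosses)
    return heads / n                                    # proportion of heads

print(coin_tosses(10_000))   # should come out close to .5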
We notice that when we tossed the coin 10,000 times, the proportion of heads
was close to the “true value” .5 for obtaining a head when a coin is tossed. What is a mathematical
model for this experiment called?
Bernoulli Trials (see Chapter 3).
What will the
Law of Large Numbers, which we shall study later (see Chapter 8), show about Bernoulli trials?
The Law of Large Numbers will show that
in the Bernoulli Trials model, the proportion of heads should be near .5, consistent
with our intuitive idea of the frequency interpretation of probability.
How could our coin-tossing program be easily modified?
Our program could be easily modified to simulate coins for which the
probability of a head is p, where p is a real number between 0 and 1.
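A sketch of that modification in Python (the name biased_toss is just illustrative):

import random

def biased_toss(p):
    """Return 'H' with probability p and 'T' with probability 1 - p."""
    return "H" if random.random() < p else "T"

print(biased_toss(0.3))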
In the case of coin tossing, we already knew the probability of the event occurring
on each experiment. Where does the real power of simulation come from?
The ability to estimate
probabilities when they are not known ahead of time.
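For instance (a hypothetical illustration, not an example from the text), one could estimate the probability that two dice sum to 7 by simulation and compare it with the exact value 1/6:

import random

n = 100_000
hits = sum(1 for _ in range(n)
           if random.randint(1, 6) + random.randint(1, 6) == 7)
print("estimated:", hits / n, "exact:", 1 / 6)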
What is true about simulated results?
Accurate
results by simulation require a large number of experiments.
The previous simulation shows that it is
important to know how many trials we should simulate in order to expect a certain
degree of accuracy in our approximation. What shall we later see?
That in these types of
experiments, a rough rule of thumb is that, at least 95% of the time, the error does
not exceed the reciprocal of the square root of the number of trials.
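As a worked illustration of the rule of thumb: with 10,000 trials the reciprocal of the square root is 1/100 = .01, so at least 95% of the time the estimated proportion should be within about .01 of the true probability. A short sketch:

import math

for n in (100, 10_000, 1_000_000):
    # Rule of thumb: at least 95% of the time, the error does not exceed 1/sqrt(n).
    print(n, "trials -> error bound about", 1 / math.sqrt(n))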