Lec 2 | Uncertainty Flashcards
(50 cards)
This can be represented as a number of events and the likelihood, or probability, of each of them happening.
Uncertainty
Every possible situation can be thought of as a world, represented by which lowercase Greek letter?
omega ω
How do we represent the probability of a certain world?
we write P(ω)
Axioms in Probability
every value representing probability must range between 0 and 1
0 ≤ P(ω) ≤ 1
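For example, with a fair six-sided die, each of the six possible worlds has P(ω) = 1/6, which lies between 0 (impossible) and 1 (certain).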
Axioms in Probability
________ is an impossible event, like rolling a standard die and getting a 7.
Zero or 0
Axioms in Probability
________ is an event that is certain to happen, like rolling a standard die and getting a value less than 10.
One or 1
Axioms in Probability
The probabilities of every possible event, when summed together, are equal to ?
1
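As a quick check with a fair die, where each of the six outcomes has probability 1/6:
P(1) + P(2) + P(3) + P(4) + P(5) + P(6) = 6 × (1/6) = 1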
the degree of belief in a proposition in the absence of any other evidence
Unconditional Probability
The degree of belief in a proposition given some evidence that has already been revealed.
Conditional Probability
AI can use partial information to make educated guesses about the future. To use this information, which affects the probability that an event occurs, what do we rely on?
Conditional Probability
How do we express conditional probability?
P(a | b)
What does P(a | b) mean?
“the probability of event a occurring given that we know event b to have occurred” or “the probability of a given b.”
What formula do we use to compute the conditional probability of a given b?
P(a | b) = P(a ∧ b) / P(b)
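For example, with two fair dice, the probability that the dice sum to 12 given that the first die shows a 6:
P(sum 12 | first is 6) = P(sum 12 ∧ first is 6) / P(first is 6) = (1/36) / (1/6) = 1/6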
It is a variable in probability theory with a domain of possible values that it can take on
Random Variable
It is the knowledge that the occurrence of one event does not affect the probability of the other event.
Independence
How do we define independence?
Independence can be defined mathematically: events a and b are independent if and only if the probability of a and b is equal to the probability of a times the probability of b: P(a ∧ b) = P(a)P(b)
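For example, two fair dice rolled separately are independent:
P(first is 6 ∧ second is 6) = P(first is 6)P(second is 6) = 1/6 × 1/6 = 1/36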
It is commonly used in probability theory to compute conditional probability
Bayes’ Rule
Bayes’ rule says that the probability of b given a is equal to the probability of a given b, times the probability of b, divided by the probability of a.
P(b | a) = P(a | b) P(b) / P(a)
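As an illustration with made-up numbers: suppose 40% of mornings are cloudy, 10% of afternoons are rainy, and 80% of rainy afternoons follow cloudy mornings. Then
P(rain | clouds) = P(clouds | rain) P(rain) / P(clouds) = (0.8 × 0.1) / 0.4 = 0.2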
It is the likelihood of multiple events all occurring
Joint Probability
Probability Rules
This stems from the fact that the sum of the probabilities of all the possible worlds is 1, and the complementary literals a and ¬a include all the possible worlds.
Negation: P(¬a) = 1 - P(a).
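For example, P(not rolling a 6) = 1 - P(rolling a 6) = 1 - 1/6 = 5/6.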
Probability Rules
This can be interpreted in the following way: the worlds in which a or b is true are all the worlds where a is true, plus all the worlds where b is true. However, some worlds are counted twice (the worlds where both a and b are true). To remove this overlap, we subtract the worlds where both a and b are true once (since they were counted twice).
Inclusion-Exclusion: P(a ∨ b) = P(a) + P(b) - P(a ∧ b).
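For example, with one fair die:
P(even ∨ greater than 4) = P(even) + P(greater than 4) - P(even ∧ greater than 4) = 3/6 + 2/6 - 1/6 = 4/6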
Probability Rules
The idea here is that b and ¬b are disjoint events: the probability of b and ¬b occurring at the same time is 0, and P(b) and P(¬b) sum to 1. Thus, when a happens, b can either happen or not. So when we add the probability of a and b happening to the probability of a and ¬b happening, we end up with simply the probability of a.
Marginalization: P(a) = P(a, b) + P(a, ¬b).
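As a small illustration with made-up numbers: if P(a, b) = 0.3 and P(a, ¬b) = 0.2, then P(a) = 0.3 + 0.2 = 0.5.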
Probability Rules
How is Marginalization expressed for random variables?
P(X = xᵢ) = ∑ⱼ P(X = xᵢ, Y = yⱼ)
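Illustrative made-up numbers: if P(Cloudy = cloudy, Rain = rain) = 0.08 and P(Cloudy = cloudy, Rain = ¬rain) = 0.32, then P(Cloudy = cloudy) = 0.08 + 0.32 = 0.40.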
Probability Rules
This is a similar idea to marginalization. The probability of event a occurring is equal to the probability of a given b times the probability of b, plus the probability of a given ¬b times the probability of ¬b.
Conditioning: P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b).
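Continuing the made-up numbers from the marginalization card: if P(a | b) = 0.6, P(b) = 0.5, P(a | ¬b) = 0.4, and P(¬b) = 0.5, then P(a) = 0.6 × 0.5 + 0.4 × 0.5 = 0.5, matching P(a, b) + P(a, ¬b).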