Week 8 (Chapter 13) - Uncertainty Flashcards

1
Q

What are some methods for handling uncertainty?

A
  • Nonmonotonic logic
  • Rules with confidence factors
  • Probability

Ultimately, probability is the method of choice, because it can summarize the effects of laziness and ignorance.

  • laziness: failure to enumerate exceptions, qualifications, etc.
  • ignorance: lack of relevant facts, initial conditions, etc.
2
Q

What does Fuzzy logic handle?

A

Degree of truth

e.g. WetGrass is true to degree 0.2

3
Q

What are the issues with Rules with confidence factors?

A

Problems with combination

e.g.
Sprinkler -> 0.99 WetGrass
WetGrass -> 0.7 Rain

Chaining these rules would suggest that running the sprinkler makes rain likely, which is absurd.

4
Q

What are the issues with Nonmonotonic logic?

A

Hard to determine which assumptions are reasonable

5
Q

What is “Utility Theory”? [Follow-up]

A

Used to represent and infer preferences

6
Q

What is “Decision Theory”? [Follow-up]

A

Decision theory = Utility theory + Probability theory

7
Q

WTF is a “random variable”?

A

A “random variable” is a function from sample points to some range

Consider a fair six-sided die and the random variable Odd(ω) = true iff the roll ω is odd:
P(Odd = true) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 1/2
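A minimal sketch of this card in Python (the fair die and the Odd variable come from the card; the variable names are mine):

```python
from fractions import Fraction

# Sample space of a fair six-sided die: each sample point has probability 1/6.
omega = {w: Fraction(1, 6) for w in range(1, 7)}

# A random variable is a function from sample points to some range;
# here Odd maps each roll to a Boolean.
def odd(w):
    return w % 2 == 1

# P(Odd = true): sum P(w) over the sample points where Odd(w) holds.
p_odd = sum(p for w, p in omega.items() if odd(w))
print(p_odd)  # 1/2
```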

8
Q

What is an “event” in probability?

A

A set of sample points
e.g. P(A) = Σ_{ω ∈ A} P(ω)

e.g. P(die roll < 4) = 1/6 + 1/6 + 1/6 = 1/2

9
Q

WTF is a “proposition”? [Follow-up]

A

Page 9 Lecture 8

Think of a proposition as the event where the proposition is true

10
Q

P(A ∪ B) = ?

A

P(A) + P(B) - P(A ∩ B)
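A quick numeric check of inclusion-exclusion, reusing the die from the earlier cards (the specific events A and B are made up for illustration):

```python
from fractions import Fraction

# Fair six-sided die; events are sets of sample points.
omega = {w: Fraction(1, 6) for w in range(1, 7)}

def p(event):
    return sum(omega[w] for w in event)

a = {w for w in omega if w < 4}       # A: die roll < 4
b = {w for w in omega if w % 2 == 0}  # B: even roll

# P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert p(a | b) == p(a) + p(b) - p(a & b)
print(p(a | b))  # 5/6
```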

11
Q

Syntax for Propositions?

A
  • Propositional or Boolean random variables
    e.g. Cavity (do I have a cavity?)
  • Discrete random variables
    e.g. Weather is one of ⟨sunny, rain, cloudy, snow⟩
    Weather = rain is a proposition
  • Continuous random variables
    e.g. Temp = 21.6 or Temp < 22.0
12
Q

How to find the probability for continuous variables?

A

Use a probability density function; the standard example is the Gaussian:

P(x) = (1 / (σ√(2π))) · e^( -(x - μ)² / (2σ²) )

where μ is the mean and σ is the standard deviation.
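The density can be sketched straight from the formula (the function and parameter names mu/sigma are mine):

```python
import math

def gaussian_pdf(x, mu, sigma):
    # p(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)**2 / (2 * sigma**2))
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The density peaks at the mean with value 1 / (sigma * sqrt(2*pi)).
print(round(gaussian_pdf(0.0, 0.0, 1.0), 4))  # 0.3989

# Crude Riemann-sum check that the density integrates to about 1.
dx = 0.01
total = sum(gaussian_pdf(i * dx, 0.0, 1.0) * dx for i in range(-1000, 1000))
print(round(total, 3))  # 1.0
```

Note that a density gives probabilities only over intervals; the value at a single point like Temp = 21.6 is a density, not a probability.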

13
Q

Does new evidence always change a conditional probability?

A

No, not necessarily: it may be irrelevant.

e.g. P(cavity | toothache, carltonWins) = P(cavity | toothache) = 0.8

The new evidence is valid but uninformative, so the posterior is unchanged.

14
Q

P(A | B) = ?

A

P(A ∩ B) / P(B)

Note the product rule: P(A ∩ B) = P(A | B)P(B) = P(B | A)P(A)

Dividing P(B | A)P(A) by P(B) gives Bayes' rule:
P(A | B) = P(B | A)P(A) / P(B)
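The definition, product rule, and Bayes' rule can all be checked on a small joint distribution (the numbers below are made up):

```python
# Hypothetical joint distribution over two Boolean variables A and B.
p_joint = {(True, True): 0.12, (True, False): 0.08,
           (False, True): 0.28, (False, False): 0.52}

p_a = sum(p for (a, b), p in p_joint.items() if a)   # P(A) = 0.2
p_b = sum(p for (a, b), p in p_joint.items() if b)   # P(B) = 0.4
p_ab = p_joint[(True, True)]                          # P(A ∩ B) = 0.12

p_a_given_b = p_ab / p_b   # definition: P(A|B) = P(A ∩ B) / P(B)
p_b_given_a = p_ab / p_a

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
assert abs(p_a_given_b - p_b_given_a * p_a / p_b) < 1e-12
print(round(p_a_given_b, 3))  # 0.3
```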

15
Q

Inference by enumeration?

A

It’s the full joint distribution table: each entry is the probability of one complete assignment, and all entries add up to 1.

Any proposition’s probability is the sum of the entries where it holds.

Page 17 - 20 Lecture 8
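A sketch of inference by enumeration, using the joint distribution from the textbook's dentist example (an assumption; the lecture's numbers may differ, since it apparently gives P(cavity | toothache) = 0.8):

```python
# Full joint distribution over (Toothache, Catch, Cavity), textbook numbers.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.016,
    (True,  False, True):  0.012, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, True,  False): 0.144,
    (False, False, True):  0.008, (False, False, False): 0.576,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9  # everything adds up to 1

def p(pred):
    # Probability of a proposition: sum the joint entries where it holds.
    return sum(pr for w, pr in joint.items() if pred(*w))

p_toothache = p(lambda t, c, cav: t)
p_cavity_and_toothache = p(lambda t, c, cav: cav and t)
print(round(p_cavity_and_toothache / p_toothache, 3))  # 0.6
```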

16
Q

WTF is the normalization constant? [Follow-up on Page 21 - 22 Lecture 8]

A

P(Cavity | toothache) = α P(Cavity, toothache)

α is just 1/P(toothache): it rescales the relevant joint entries so the posterior sums to 1, which lets us avoid computing P(toothache) explicitly.
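Concretely, α falls out of summing the unnormalized entries (the joint numbers are again the textbook table, an assumption):

```python
# Unnormalized posterior over Cavity given toothache, read off a joint table.
unnormalized = {
    True:  0.108 + 0.012,  # P(cavity ∧ toothache)
    False: 0.016 + 0.064,  # P(¬cavity ∧ toothache)
}

# α = 1 / P(toothache); multiplying by it makes the entries sum to 1,
# so P(toothache) never has to be computed as a separate step.
alpha = 1.0 / sum(unnormalized.values())
posterior = {cav: alpha * q for cav, q in unnormalized.items()}
assert abs(sum(posterior.values()) - 1.0) < 1e-12
print(round(posterior[True], 3))  # 0.6
```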

17
Q

Conditions for A and B to be independent?

A

A and B are independent iff
P(A|B) = P(A) or
P(B|A) = P(B) or
P(A,B) = P(A)P(B)
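A sanity check of the third condition with a die and a coin tossed together, which are physically independent (the setup is mine):

```python
from fractions import Fraction

# Joint sample space: fair die rolled together with a fair coin.
omega = {(d, c): Fraction(1, 12) for d in range(1, 7) for c in ("H", "T")}

def p(pred):
    return sum(pr for w, pr in omega.items() if pred(*w))

p_a = p(lambda d, c: d % 2 == 1)                 # A: die is odd
p_b = p(lambda d, c: c == "H")                   # B: coin is heads
p_ab = p(lambda d, c: d % 2 == 1 and c == "H")   # A ∧ B

assert p_ab == p_a * p_b  # P(A,B) = P(A)P(B), so A and B are independent
print(p_ab)  # 1/4
```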

18
Q

What is conditional independence useful for?

Elaborate on conditional independence with explanation with toothache, cavity example

A

Conditional independence is useful for simplifying probability calculations.

If I have a cavity, the probability that the probe catches in it doesn’t depend on whether I have a toothache:
(1) P(catch|toothache,cavity) = P(catch|cavity)

The same independence holds if I haven’t got a cavity:
(2) P(catch|toothache,¬cavity) = P(catch|¬cavity)

Catch is conditionally independent of Toothache given Cavity:
P(Catch|Toothache,Cavity) = P(Catch|Cavity)
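One way to see identity (1): build a joint where Toothache and Catch each depend only on Cavity, then check it numerically (the conditional probabilities are made up):

```python
# Hypothetical conditional tables: both symptoms depend only on Cavity.
p_cavity = {True: 0.2, False: 0.8}
p_toothache_given = {True: 0.6, False: 0.1}  # P(toothache | Cavity)
p_catch_given = {True: 0.9, False: 0.2}      # P(catch | Cavity)

# Joint assembled by the chain rule with the independence assumption baked in.
joint = {}
for cav in (True, False):
    for t in (True, False):
        for c in (True, False):
            pt = p_toothache_given[cav] if t else 1 - p_toothache_given[cav]
            pc = p_catch_given[cav] if c else 1 - p_catch_given[cav]
            joint[(t, c, cav)] = p_cavity[cav] * pt * pc

def p(pred):
    return sum(pr for w, pr in joint.items() if pred(*w))

# P(catch | toothache, cavity) == P(catch | cavity): once Cavity is known,
# Toothache carries no extra information about Catch.
lhs = p(lambda t, c, cav: c and t and cav) / p(lambda t, c, cav: t and cav)
rhs = p(lambda t, c, cav: c and cav) / p(lambda t, c, cav: cav)
assert abs(lhs - rhs) < 1e-9
print(round(lhs, 3))  # 0.9
```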

19
Q

How to write out joint distribution using chain rule of P(Toothache, Catch, Cavity)?
[Follow-up]

A

Page 25 Lecture 8

P(Toothache,Catch,Cavity)
= P(Toothache|Catch,Cavity)P(Catch,Cavity)
= P(Toothache|Catch,Cavity)P(Catch|Cavity)P(Cavity)
= P(Toothache|Cavity)P(Catch|Cavity)P(Cavity)

20
Q

Bayes Rule?

A

P(A|B) = P(B|A)P(A) / P(B)

or in distribution form

P(Y|X) = α P(X|Y)P(Y)

α = the normalization constant, i.e. 1/P(X), which makes the distribution sum to 1. [Follow-up]

21
Q

Bayes Rule and conditional independence? Is it possible?

A

Yes: this is essentially the Naïve Bayes model:

P(Cause, Effect₁, …, Effectₙ) = P(Cause) ∏ᵢ P(Effectᵢ | Cause)

Remember, Naïve Bayes assumes all the effects are conditionally independent given the cause.
If the effects are actually correlated with each other, its estimates degrade.

P(Cavity|toothache∧catch)
= αP(toothache∧catch|Cavity)P(Cavity)
= αP(toothache|Cavity)P(catch|Cavity)P(Cavity)
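The two-step derivation above as a runnable sketch (the conditional tables are made-up illustration numbers):

```python
# Naïve Bayes posterior P(Cavity | toothache ∧ catch) from made-up tables.
p_cavity = {True: 0.2, False: 0.8}
p_toothache_given = {True: 0.6, False: 0.1}  # P(toothache | Cavity)
p_catch_given = {True: 0.9, False: 0.2}      # P(catch | Cavity)

# Unnormalized score for each value of Cavity:
# P(Cavity) * P(toothache | Cavity) * P(catch | Cavity)
scores = {cav: p_cavity[cav] * p_toothache_given[cav] * p_catch_given[cav]
          for cav in (True, False)}

alpha = 1.0 / sum(scores.values())  # normalization constant
posterior = {cav: alpha * s for cav, s in scores.items()}
print(round(posterior[True], 3))  # 0.871
```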