Probabilistic Reasoning Flashcards

(28 cards)

1
Q

sample space

A

the set of all possible outcomes of a random experiment

2
Q

random variable

A

a variable whose value is determined by the outcome of a random experiment

3
Q

event

A

any subset of points in a sample space

4
Q

probability density function

A

for continuous random variables, the distribution is expressed implicitly through a probability density function that returns the relative likelihood of an outcome being close to a given value

5
Q

probability distribution function

A

for discrete random variables, the distribution is expressed explicitly through a probability distribution function (also called a probability mass function) that returns the probability of an outcome being exactly a given value

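The contrast between the two cards above can be made concrete with scipy.stats; a minimal sketch, where the particular distributions (a standard normal and a Binomial(10, 0.5)) are arbitrary choices for illustration:

```python
from scipy.stats import norm, binom

# Continuous case: the density at a point is not itself a probability;
# integrating the density over an interval gives one.
density_at_zero = norm.pdf(0.0)               # N(0,1) density at x = 0, ~0.399
p_near_zero = norm.cdf(0.1) - norm.cdf(-0.1)  # P(-0.1 < X < 0.1)

# Discrete case: the mass function returns an actual probability.
p_three_heads = binom.pmf(3, n=10, p=0.5)     # P(exactly 3 heads in 10 flips)

print(density_at_zero, p_near_zero, p_three_heads)
```
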
6
Q

likelihood

A

the joint density of the observed data, viewed as a function of the model parameters

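A small sketch of this definition, assuming the data are coin flips under a Bernoulli model (the observations and the grid of p values below are made up):

```python
# Likelihood of observed coin flips (1 = heads) as a function of p,
# the probability of heads under a Bernoulli model.
data = [1, 0, 1, 1, 0, 1]   # hypothetical observations

def likelihood(p, data):
    # Joint probability of the data, viewed as a function of p.
    result = 1.0
    for x in data:
        result *= p if x == 1 else (1 - p)
    return result

# The maximum over p sits at the sample mean, 4/6 here.
for p in (0.3, 0.5, 4 / 6, 0.9):
    print(f"L({p:.3f}) = {likelihood(p, data):.5f}")
```
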
7
Q

joint distribution

A

distribution function over 2 or more random variables

8
Q

P(a or b)

A

P(a) + P(b) - P(a,b)

9
Q

P(a and b)

A

P(a|b) * P(b) in general; this reduces to P(a) * P(b) when a and b are independent

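A brute-force check of the two formulas above on one roll of a fair die; the events are arbitrary examples, chosen to be dependent so that the general product rule is visible:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}   # sample space: one roll of a fair die
a = {2, 4, 6}                # event a: the roll is even
b = {4, 5, 6}                # event b: the roll is at least 4

def prob(event):
    return Fraction(len(event), len(omega))

# Inclusion-exclusion: P(a or b) = P(a) + P(b) - P(a, b)
assert prob(a | b) == prob(a) + prob(b) - prob(a & b)

# General product rule: P(a, b) = P(a | b) * P(b).
# P(a) * P(b) alone would be wrong here, because a and b are dependent.
p_a_given_b = Fraction(len(a & b), len(b))
assert prob(a & b) == p_a_given_b * prob(b)
```
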
10
Q

conditional probability and formula

A

the probability of an event occurring given that another event has already occurred
P(a|b) = P(a and b) / P(b)

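A worked example of the formula, using made-up joint probabilities over two binary variables:

```python
# P(a | b) = P(a, b) / P(b), on hypothetical joint probabilities.
p_joint = {
    ("rain", "cloudy"):    0.20,
    ("rain", "clear"):     0.05,
    ("no rain", "cloudy"): 0.30,
    ("no rain", "clear"):  0.45,
}

# Marginal P(cloudy) sums the joint over the other variable.
p_cloudy = sum(p for (_, sky), p in p_joint.items() if sky == "cloudy")

# P(rain | cloudy) = 0.20 / 0.50 = 0.4
print(p_joint[("rain", "cloudy")] / p_cloudy)
```
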
11
Q

what is Bayes' rule, and how is it derived?

A

P(a|b) = P(b|a)P(a) / P(b)

start with
P(b|a) = P(a,b)/P(a) and P(a|b) = P(a,b)/P(b)
rearrange the first to P(a,b) = P(b|a)P(a), then substitute into the second

12
Q

what is the posterior part?

A

P(cause | effect): the probability of the hypothesis given some evidence

13
Q

what is the likelihood part?

A

P(effect | cause): the likelihood that the effect will occur if the cause is true

14
Q

what is the prior belief part?

A

P(cause), in the numerator: the prior belief in some cause

15
Q

what is the evidence part?

A

P(effect), in the denominator: the probability of the evidence across all possible causes

16
Q

why is Bayes' rule helpful?

A

it lets a quantity that is hard to measure be computed from quantities that are easy to measure

17
Q

BR is an _____ to

A

update to a prior belief given new information

18
Q

example of a Bayes' rule classifier

A

determine whether a patient has a disease (D) given a positive test (T)
P(D|T) = P(T|D)P(D) / P(T)
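A quick computation of this card's formula; the prevalence, sensitivity, and false-positive rate below are hypothetical numbers chosen for illustration:

```python
# Bayes rule for the disease-test example; all numbers are made up.
p_d = 0.01              # prior P(D): disease prevalence
p_t_given_d = 0.95      # likelihood P(T|D): test sensitivity
p_t_given_not_d = 0.05  # P(T|not D): false-positive rate

# Evidence P(T), summed over both possible causes.
p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)

# Posterior P(D|T) = P(T|D) P(D) / P(T)
print(p_d * p_t_given_d / p_t)  # ~0.161: still unlikely despite the + test
```
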

19
Q

joint probability distribution

A

represents the probability of different events occurring together

20
Q

curse of dimensionality

A

as the number of variables increases, the size of the JPD grows exponentially:
with n binary variables, there are 2^n combinations to consider
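Making the growth concrete, under the assumption that every variable is binary:

```python
# A full joint distribution over n binary variables has 2**n entries.
for n in (5, 10, 20, 30):
    print(f"{n} variables -> {2**n:,} joint probabilities to specify")
```
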

21
Q

independence

A

two events are independent if the occurrence of one does not affect the probability of the other

22
Q

independence rules

A

P(A,B) = P(A) * P(B)
formally, A and B are independent if P(A|B) = P(A)
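A sketch verifying both characterizations on two fair dice (a textbook-style example, not from the deck):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # all 36 ordered two-die rolls
a = {w for w in omega if w[0] == 6}           # event A: first die shows 6
b = {w for w in omega if w[1] <= 2}           # event B: second die is 1 or 2

def prob(event):
    return Fraction(len(event), len(omega))

assert prob(a & b) == prob(a) * prob(b)         # P(A,B) = P(A)P(B), both 1/18
assert Fraction(len(a & b), len(b)) == prob(a)  # P(A|B) = P(A)
```
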

23
Q

independence in terms of conditional probability

A

knowing that event B happened doesn't affect the probability of A

24
Q

events are conditionally independent given a third event if…

A

P(X,Y|Z) = P(X|Z) P(Y|Z)
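A toy common-cause model (numbers invented) showing the distinction: given Z, X and Y factorize exactly as above, yet marginally they are dependent:

```python
# Z is a common cause of X and Y; they are conditionally independent given Z.
p_z1 = 0.5
p_x1_given_z = {0: 0.1, 1: 0.9}  # P(X=1 | Z=z)
p_y1_given_z = {0: 0.1, 1: 0.9}  # P(Y=1 | Z=z)

def p_z(z):
    return p_z1 if z == 1 else 1 - p_z1

# Marginal P(X=1, Y=1) and P(X=1), summing out Z.
p_xy = sum(p_z(z) * p_x1_given_z[z] * p_y1_given_z[z] for z in (0, 1))
p_x = sum(p_z(z) * p_x1_given_z[z] for z in (0, 1))

# 0.41 vs 0.25: X and Y are dependent unless we condition on Z.
print(p_xy, p_x * p_x)
```
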

25
Q

how does the naive bayes classifier simplify computation?

A

by assuming all features are conditionally independent of each other given the class, each feature independently contributes to the likelihood of the class; this decreases the complexity of the computation by turning it into a series of independent likelihood calculations, one per feature

26
Q

naive bayes classifier bias and variance is

A

high bias, low variance (the strong independence assumption biases the model, while its small number of parameters keeps variance low)

27
Q

naive bayes classifier assumption and real life?

A

the assumption is not always true in real life, as features are often correlated/dependent; this can lead to suboptimal classifications when there are strong dependencies between features (e.g. cough and fever are not independent and may both stem from the same underlying cause)

28
Q

how to do spam/ham using NBC

A

1) calculate priors: P(spam) and P(not spam)
2) calculate likelihoods: for each feature (word), calculate P(word|spam) = count of the word in spam / total count of words in spam, and likewise P(word|not spam)
3) use Bayes to compute the posterior: for a new email, calculate the probability of it being spam/not spam: P(spam | word1, word2, ...) ∝ P(spam) * P(w1|spam) * P(w2|spam) * ...
4) decision: pick the class with the higher posterior (see the sketch below)
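A minimal end-to-end sketch of the four steps on a made-up toy corpus; Laplace smoothing (alpha=1) is added here as a common practical fix, since the card's raw-count formula gives zero probability to unseen words:

```python
from collections import Counter

# Toy training set (invented); each email is a word list plus a label.
train = [
    (["win", "money", "now"], "spam"),
    (["cheap", "money", "win"], "spam"),
    (["meeting", "tomorrow", "agenda"], "ham"),
    (["lunch", "tomorrow"], "ham"),
]

# 1) Priors P(class) from label counts.
labels = Counter(label for _, label in train)
total = sum(labels.values())

# 2) Likelihoods P(word | class) from word counts per class.
word_counts = {c: Counter() for c in labels}
for words, label in train:
    word_counts[label].update(words)
vocab = {w for words, _ in train for w in words}

def p_word(word, c, alpha=1):
    # Laplace smoothing; the card's raw-count formula is the alpha=0 case.
    return (word_counts[c][word] + alpha) / (
        sum(word_counts[c].values()) + alpha * len(vocab)
    )

# 3) Unnormalized posterior: P(c | w1, w2, ...) ∝ P(c) * Π P(wi | c)
def score(words, c):
    s = labels[c] / total
    for w in words:
        s *= p_word(w, c)
    return s

# 4) Decision: the class with the higher posterior wins.
email = ["win", "money", "tomorrow"]
print(max(labels, key=lambda c: score(email, c)))  # -> 'spam'
```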