Lecture 1 Flashcards

(28 cards)

1
Q

What is Bayesian statistics about?

A
  • using our existing knowledge, our prior beliefs, to make sense of the world
  • continuously updating our beliefs with data
  • learning from past experiences and data to form our posterior beliefs
2
Q

How to choose a prior?

A

based on theoretical considerations

3
Q

What does a heavily weighted prior entail?

A
  • the data do not shift our beliefs
  • for example: a prior on a guessing parameter (participants should not perform worse than chance level)
4
Q

What does a prior weighted equally with the data entail?

A
  • this is often used in replication research
  • there, you combine prior knowledge from the original study with replication data to establish an effect
5
Q

What does a prior weighted less than the data entail?

A
  • the data really shift our beliefs
  • this is the most common situation in research
6
Q

What differentiates the frequentist approach from the Bayesian approach in terms of the prior?

A

frequentist analysis gives NO voice to our prior knowledge -> a purely data-driven approach

7
Q

How is the concept of probability understood in the frequentist way?

A

the long-run relative frequency of a repeatable event (hence "frequentist")

8
Q

How is the concept of probability understood in the Bayesian way?

A

a measure of the relative plausibility of an event

9
Q

How are data treated in the frequentist approach?

A

the data alone drive the analysis

10
Q

How are data treated in the Bayesian approach?

A

data should be weighted against our prior beliefs

11
Q

How is the research question (RQ) formulated in the frequentist approach?

A

If the hypothesis is not correct, what are the chances I would have observed these data or more extreme ones?

12
Q

How is the RQ formulated in the Bayesian approach?

A

In light of these data, what is the probability that the hypothesis is correct?

13
Q

Summarize Bayesian knowledge building

A
  • We construct our posterior knowledge by balancing information from our data with our prior knowledge
  • As more data come in → we refine our knowledge, and most of the time the influence of our original prior fades into the background (see the sketch below)
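A minimal sketch in Python of how the prior fades, assuming a hypothetical Beta(5, 5) prior on a coin's heads probability and conjugate Beta-Binomial updating (all numbers are made up for illustration):

    # Beta-Binomial updating: with a Beta(a, b) prior and k heads in n flips,
    # the posterior is Beta(a + k, b + n - k), with mean (a + k) / (a + b + n).
    prior_a, prior_b = 5, 5  # hypothetical prior, centered on 0.5

    for k, n in [(7, 10), (70, 100), (700, 1000)]:
        post_mean = (prior_a + k) / (prior_a + prior_b + n)
        print(n, round(post_mean, 3))  # 0.6, 0.682, 0.698 -> approaches the data's 0.7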
14
Q

How can we represent randomness in variables?

A

using probability models

15
Q

When is a probability model valid?

A

1) it accounts for all possible events
2) it assigns a probability to each event
3) these probabilities sum to 1 (a quick check is sketched below)

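A quick Python check of these three conditions, using a hypothetical fair-coin model:

    # A valid probability model: all possible events, a probability for each,
    # and the probabilities sum to 1.
    coin_model = {"heads": 0.5, "tails": 0.5}  # hypothetical fair coin

    assert all(0.0 <= p <= 1.0 for p in coin_model.values())  # each event has a probability
    assert abs(sum(coin_model.values()) - 1.0) < 1e-9          # probabilities sum to 1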
16
Q

How many models do we need in a Bayesian analysis?

A

3!
- prior probability model
- model for the data (watch out -> the likelihood function is not a probability model)
- posterior probability model

17
Q

What is the unconditional probability of A?

A

P(A) measures the probability of observing A, without any knowledge of B

18
Q

What is the conditional probability of A given B?

A

P(A|B) measures the probability of observing A in light of the information that B occurred

19
Q

Is the order of conditioning important?

A

Yes!
P(A | B) ≠ P(B | A)
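A tiny illustration with made-up counts, showing how far apart the two conditional probabilities can be:

    # Hypothetical population: A = "is a cat", B = "has four legs"
    n_A_and_B = 9   # four-legged cats
    n_A = 10        # all cats
    n_B = 90        # all four-legged animals

    print(n_A_and_B / n_B)  # P(A | B) = 0.10
    print(n_A_and_B / n_A)  # P(B | A) = 0.90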

20
Q

When are A and B independent?

A

if and only if the occurrence of B does not tell us anything about the occurrence of A
P(A | B) = P(A)

21
Q

What is the difference between conditional probability and likelihood?

A

the numerical values are the same, but the philosophical sense differs

conditional probability
P(A|B) - used to describe the probability of observing A (= data), given that B is true

likelihood
L(B|A) - used to describe the probability of the observed data A, as a function of the different possible values of B

22
Q

Is the likelihood function a probability model?

A

no! because likelihoods do NOT add up to 1
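A minimal sketch of the last two cards' point, using a binomial model for a hypothetical 7 heads in 10 coin flips: the same formula gives the conditional probability of the data, but read as a function of the parameter it is a likelihood, and its values do not sum to 1:

    from math import comb

    # Binomial likelihood of k = 7 heads in n = 10 flips,
    # viewed as a function of the heads probability theta.
    def likelihood(theta, k=7, n=10):
        return comb(n, k) * theta**k * (1 - theta) ** (n - k)

    thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
    values = [likelihood(t) for t in thetas]
    print(values)       # peaks near theta = 0.7
    print(sum(values))  # about 0.45 -> likelihoods do NOT sum to 1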

23
Q

What is marginal probability?

A

the probability of a single event occurring, without consideration of any other events.

It is derived from a joint probability distribution and represents the chance of an event happening in isolation.

So it is a sum of joint probabilities!

24
Q

How to calculate a joint probability?

A

For events A and B, the joint probability of A ∩ B is calculated by weighting the conditional probability of A given B by the marginal probability of B:

P(A ∩ B) = P(A|B) P(B)
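The same weighting in code, with made-up numbers:

    # P(A and B) = P(A | B) * P(B)
    p_B = 0.3
    p_A_given_B = 0.5
    print(p_A_given_B * p_B)  # P(A and B) = 0.15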

25
Q

How to calculate the joint probability of independent events?

A

P(A ∩ B) = P(A) P(B)
26
Q

What is the law of total probability?

A

Let A and B be events with P(B) > 0 and P(Bᶜ) > 0. Then:
P(A) = P(B) × P(A | B) + P(Bᶜ) × P(A | Bᶜ)
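A worked example with hypothetical disease-testing numbers (B = has the disease, A = tests positive):

    # P(A) = P(B) * P(A|B) + P(B^c) * P(A|B^c)
    p_B = 0.01           # prevalence, P(B)
    p_A_given_B = 0.95   # true-positive rate, P(A | B)
    p_A_given_Bc = 0.05  # false-positive rate, P(A | B^c)

    p_A = p_B * p_A_given_B + (1 - p_B) * p_A_given_Bc
    print(round(p_A, 3))  # 0.059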
27
Q

Bayes' rule

A

posterior = (prior × likelihood) / normalizing constant
28
Q

What is the normalizing constant?

A

P(A) - our marginal probability. Its function is to ensure that the posterior probabilities sum to 1.
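A sketch tying cards 26-28 together, reusing the hypothetical testing numbers above: the posterior is the prior times the likelihood, divided by the normalizing constant P(A):

    # Bayes' rule: P(B | A) = P(A | B) * P(B) / P(A)
    prior = 0.01        # P(B)
    likelihood = 0.95   # P(A | B)
    marginal = 0.059    # P(A), the normalizing constant from card 26

    posterior = prior * likelihood / marginal
    print(round(posterior, 3))  # 0.161 = P(B | A)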