Lecture 1 Flashcards
(28 cards)
What is Bayesian statistics about?
- using our existing knowledge, our prior beliefs, to make sense of the world
- continuously updating our beliefs with data
- learning from past experiences and data to form our posterior beliefs
How to choose a prior?
based on theoretical considerations
What does it mean when the prior is weighted heavily?
- the data do not shift our beliefs much
- for example: a prior on a guessing parameter (participants should not perform worse than chance level)
What does it mean when the prior is weighted equally with the data?
- it is often used in replication research
- where you combine prior knowledge from the original study with replication data to establish an effect
What does it mean when the prior is weighted less than the data?
- the data really shift our beliefs
- this is the most common situation in research
What differentiates the frequentist approach from the Bayesian approach in terms of the prior?
a frequentist analysis gives NO voice to our prior knowledge → a purely data-driven approach
How is the concept of probability understood in the frequentist way?
the long-run relative frequency of a repeatable event (hence "frequentist")
How is the concept of probability understood in the Bayesian way?
a measure of the relative plausibility of an event
How are data treated in the frequentist way?
the data alone drive the analysis
How are data treated in the Bayesian way?
the data should be weighed against our prior beliefs
How is the RQ formulated in the frequentist approach?
If the hypothesis is not correct, what are the chances I would have observed these data or more extreme ones?
How is the RQ formulated in the Bayesian approach?
In light of these data, what is the probability that the hypothesis is correct?
Summarize Bayesian knowledge building
- We construct our posterior knowledge by balancing information from our data with our prior knowledge
- As more data come in → we refine our knowledge, and most of the time the influence of our original prior fades into the background
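A minimal sketch of this updating in Python (a hypothetical beta-binomial example, not from the lecture): the same prior is combined with increasingly large datasets, and the posterior mean drifts toward the data proportion.

```python
# Hypothetical beta-binomial example: a Beta(a, b) prior on a success
# probability is updated with binomial data (conjugate update).
a, b = 10, 10  # fairly informative prior centered on 0.5

# Same prior, increasingly large datasets with 70% successes.
for n, successes in [(10, 7), (100, 70), (1000, 700)]:
    post_a, post_b = a + successes, b + (n - successes)
    print(f"n={n:4d}: posterior mean = {post_a / (post_a + post_b):.3f}")
# Prints 0.567, 0.667, 0.696 -> approaches the data proportion 0.7;
# the influence of the original prior fades into the background.
```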
How can we represent randomness in variables?
using probability models
When is a probability model valid?
1) accounts for all possible events
2) assigns probabilities to each event
3) these probabilities sum to 1
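As a quick illustration (a toy die model, my own example), conditions 2) and 3) can be checked directly in Python; condition 1) holds by construction because every face is listed:

```python
# Toy probability model for a fair six-sided die.
model = {face: 1/6 for face in range(1, 7)}  # condition 1: all events listed

assert all(p >= 0 for p in model.values())   # condition 2: each event gets a probability
assert abs(sum(model.values()) - 1) < 1e-9   # condition 3: probabilities sum to 1
```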
How many models do we need in a Bayesian analysis?
3!
- prior probability model
- model for data (watch out → the likelihood function is not a probability model)
- posterior probability model
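A minimal sketch of all three models on a discrete grid (a hypothetical coin-bias example, not from the lecture):

```python
from math import comb

# Candidate values for a coin's heads probability theta.
thetas = [i / 10 for i in range(11)]

# 1) Prior probability model: uniform over the candidates.
prior = [1 / len(thetas)] * len(thetas)

# 2) Model for the data: binomial; evaluated at the observed
#    data (7 heads in 10 flips) it acts as the likelihood.
k, n = 7, 10
likelihood = [comb(n, k) * t**k * (1 - t)**(n - k) for t in thetas]

# 3) Posterior probability model: prior * likelihood, renormalized.
unnorm = [p * l for p, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]  # sums to 1 again
```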
What is the unconditional probability of A?
P(A) measures the probability of observing A, without any knowledge of B
What is the conditional probability of A given B?
P(A|B) measures the probability of observing A in light of information that B occurred
Is the order of conditioning important?
Yes!
P(A | B) ≠ P(B | A)
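Some made-up numbers to make the asymmetry concrete; Bayes' rule links the two directions:

```python
# Suppose P(B) = 0.10, P(A) = 0.50, and P(A|B) = 0.90 (hypothetical values).
p_b, p_a, p_a_given_b = 0.10, 0.50, 0.90

# Bayes' rule: P(B|A) = P(A|B) * P(B) / P(A)
p_b_given_a = p_a_given_b * p_b / p_a
print(p_a_given_b, p_b_given_a)  # 0.9 vs 0.18 -> clearly not equal
```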
When are A and B independent?
if and only if the occurrence of B does not tell us anything about the occurrence of A
P(A | B) = P(A)
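A small check on a made-up joint distribution that is independent by construction:

```python
import math

# Joint distribution over binary A and B built so that P(A ∩ B) = P(A) P(B).
p_a, p_b = 0.4, 0.3
joint = {(a, b): (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
         for a in (True, False) for b in (True, False)}

# P(A|B) = P(A ∩ B) / P(B), with P(B) recovered by summing over A.
p_a_given_b = joint[(True, True)] / (joint[(True, True)] + joint[(False, True)])
print(math.isclose(p_a_given_b, p_a))  # True: knowing B tells us nothing about A
```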
What is the difference between conditional probability and likelihood?
the numerical values are the same, but the philosophical sense differs
conditional probability
P(A|B) - used to describe the probability of observing A (= the data) given that B is true
likelihood
L(B|A) - used to describe the probability of the observed data A as a function of different possible values of B
Is the likelihood function a probability model?
no! because likelihoods do NOT add up to 1 when summed across the possible values of B
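A quick numerical illustration (the same hypothetical binomial setting as above): summing the likelihood over candidate parameter values gives something far from 1.

```python
from math import comb

# Likelihood of observing 7 heads in 10 flips, as a function of theta.
k, n = 7, 10
thetas = [0.0, 0.25, 0.5, 0.75, 1.0]
likelihoods = [comb(n, k) * t**k * (1 - t)**(n - k) for t in thetas]

print(sum(likelihoods))  # ~0.37 here, not 1 -> not a probability model
```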
What is marginal probability?
the probability of a single event occurring, without consideration of any other events.
It is derived from a joint probability distribution and represents the probability of an event happening in isolation.
So it is a sum of joint probabilities!
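A sketch with a made-up joint distribution over two binary events:

```python
# Hypothetical joint distribution over binary events A and B.
joint = {(True, True): 0.12, (True, False): 0.28,
         (False, True): 0.18, (False, False): 0.42}

# Marginal P(A): sum the joint probabilities over all values of B.
p_a = sum(p for (a, _), p in joint.items() if a)
print(p_a)  # ~0.4
```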
How to calculate joint probability?
For events A and B, the joint probability P(A∩B) is calculated by weighting the conditional probability of A given B by the marginal probability of B:
P(A∩B) = P(A|B) P(B)
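The same formula as a short worked example with made-up numbers:

```python
p_b = 0.3          # marginal probability of B (hypothetical)
p_a_given_b = 0.4  # conditional probability of A given B (hypothetical)

p_a_and_b = p_a_given_b * p_b  # P(A ∩ B) = P(A|B) P(B)
print(p_a_and_b)   # ~0.12
```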