Week 10 Flashcards

1
Q

Lecture 28

A

Bayesian inference

2
Q

Difference between frequentist and Bayesian

A

Comes down to probability

Freq: probability is the limiting relative frequency as the number of experiments goes to infinity (assumes you can repeat a large number of identical trials); REPEATABILITY is key

Bayesian: degree of belief about an event

3
Q

Prove Bayes theorem

A

Here P(A,B) denotes P(A ∩ B)
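The proof itself did not survive the export; a standard derivation from the definition of conditional probability (consistent with the P(A,B) = P(A ∩ B) note) is:

```latex
P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad
P(B \mid A) = \frac{P(A \cap B)}{P(A)}
\;\Rightarrow\; P(A \cap B) = P(B \mid A)\,P(A)
\;\Rightarrow\; P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.
```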

4
Q

What to do if marginal probability is not available but needed

A
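The answer is blank in the export; a standard answer (assuming it matches the lecture) is to compute the marginal via the law of total probability, i.e. integrate the joint over the parameter:

```latex
m(x) = \int_{\Theta} f(x \mid \theta)\,\pi(\theta)\,d\theta,
\qquad \text{or } P(B) = \sum_i P(B \mid A_i)\,P(A_i) \text{ in the discrete case.}
```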
5
Q

Def Bayesian parametric model

A
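The answer is blank; the usual definition (from standard references rather than the lecture) is a parametric sampling model together with a prior on the parameter space:

```latex
X \sim f(x \mid \theta), \qquad \theta \sim \pi(\theta), \qquad \theta \in \Theta .
```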
6
Q

Change in notation from freq to Bayesian

A

Since θ is now a RV, the sampling density is written f(x|θ): a conditional distribution of the data given θ (rather than the frequentist f(x; θ) with θ a fixed unknown constant)

7
Q

Interpretation of prior dist

A

Represents the uncertainty about the (true value of the) parameter θ

8
Q

Multivariate prior

A
9
Q

The prior has to be

A

Fully specified: the hyperparameters (the parameters of the prior) must be given explicit values

10
Q

Actualisation principle

A

Aka the Bayes update

1) start from the prior dist
2) collect data
3) obtain the posterior dist
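The three steps above can be sketched in code. This uses the conjugate Beta-Binomial pair as a concrete example; the hyperparameters a=1, b=1 and the data (7 successes in 10 trials) are made-up illustrative values, not from the lecture.

```python
# Sketch of the actualisation principle (prior -> data -> posterior),
# illustrated with the conjugate Beta-Binomial pair.

def beta_binomial_update(a, b, successes, trials):
    """Posterior Beta hyperparameters after observing binomial data."""
    # 1) prior:     theta ~ Beta(a, b)
    # 2) data:      x successes in n Bernoulli(theta) trials
    # 3) posterior: theta | x ~ Beta(a + x, b + n - x)
    return a + successes, b + trials - successes

a_post, b_post = beta_binomial_update(a=1, b=1, successes=7, trials=10)
print(a_post, b_post)                        # 8 4
print(round(a_post / (a_post + b_post), 3))  # posterior mean: 0.667
```

Note how the data enter only through the counts: the posterior can then serve as the prior for the next batch of data, which is exactly the update cycle the card describes.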

11
Q

Def posterior dist

A
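The answer is blank; the standard definition (the conditional distribution of θ given the data) is:

```latex
\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{m(x)},
\qquad m(x) = \int_{\Theta} f(x \mid \theta)\,\pi(\theta)\,d\theta .
```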
12
Q

Posterior distribution analog in frequentist

A

Likelihood function

13
Q

Prove posterior dist

A
14
Q

Derive posterior dist using Bayes theorem

A
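The derivation is missing from the export; a sketch using Bayes' theorem with θ treated as a random variable:

```latex
\pi(\theta \mid x)
= \frac{f(x, \theta)}{m(x)}
= \frac{f(x \mid \theta)\,\pi(\theta)}{\int_{\Theta} f(x \mid \theta')\,\pi(\theta')\,d\theta'}
\;\propto\; f(x \mid \theta)\,\pi(\theta).
```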
15
Q

Lecture 29

A

Started around 2 cards ago

16
Q

Beta binomial model

A
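The answer is blank; the standard Beta-Binomial model (the canonical conjugate example) is:

```latex
\theta \sim \mathrm{Beta}(a, b), \qquad
x \mid \theta \sim \mathrm{Bin}(n, \theta)
\;\Rightarrow\; \theta \mid x \sim \mathrm{Beta}(a + x,\; b + n - x).
```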
17
Q

Def conjugate priors

A

A family F of probability distributions on Θ is said to be conjugate (or closed under sampling) for a likelihood function f(x|θ) if, for every prior π ∈ F, the posterior distribution π(θ|x) also belongs to F

18
Q

Lecture 30

A

Jeffreys prior

19
Q

(Im)proper prior?

A
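The answer is blank; the standard definition: a prior π is proper if it integrates to 1 (a genuine probability distribution), and improper if

```latex
\int_{\Theta} \pi(\theta)\,d\theta = \infty,
```

e.g. the flat prior π(θ) ∝ 1 on Θ = ℝ.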
20
Q

(Im)proper posterior

A
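The answer is blank; by analogy with the prior, the posterior is proper when its normalising constant is finite:

```latex
m(x) = \int_{\Theta} f(x \mid \theta)\,\pi(\theta)\,d\theta < \infty,
```

and improper when m(x) = ∞, in which case π(θ|x) is not a valid probability distribution.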
21
Q

Relate (im)proper priors to posteriors

A

All proper priors lead to proper posteriors (by Bayes' theorem)

Some improper priors lead to proper posteriors; some lead to improper posteriors

22
Q

Why do we care about improper priors

A

Nice theoretical properties, e.g. invariance under reparameterisation (as with Jeffreys prior)

23
Q

Jeffreys prior

A
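The answer is blank; the standard definition (based on the Fisher information I(θ)) is:

```latex
\pi_J(\theta) \;\propto\; \sqrt{I(\theta)},
\qquad
I(\theta) = \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial \theta}
\log f(X \mid \theta)\right)^{2}\right],
```

which is invariant under reparameterisation (the property mentioned two cards above) and is often improper.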