Week 11 Flashcards

1
Q

Lecture 31

A

Bayesian point estimators

2
Q

Steps in Bayesian point estimation

A

Estimating the parameter h(θ) ∈ R^q, for θ ∈ R^p:

1) Define a loss function: L(θ, δ) = the cost of using δ when the parameter to be estimated is θ.
(In the frequentist setting this would be a constant; in the Bayesian setting it is a random variable, since θ is.)

2) As L is a random variable it can't be minimised directly, so define the conditional risk:
R(δ) = E_π(θ|x)[L(θ, δ)] = ∫ L(θ, δ) π(θ|x) dθ

3) Point estimator: θ̂_B = argmin_δ R(δ)
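
As a concrete illustration of steps 2 and 3 (my sketch, not the lecture's example; the Beta(3, 5) posterior and quadratic loss are illustrative assumptions), θ̂_B can be approximated by minimising a Monte Carlo estimate of R(δ) over a grid:

```python
# Sketch: approximate the Bayes estimator by minimising Monte Carlo
# posterior risk. Beta(3, 5) posterior and quadratic loss are assumed.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.beta(3, 5, size=100_000)  # draws from pi(theta | x)

def posterior_risk(delta):
    """Monte Carlo estimate of R(delta) = E[L(theta, delta) | x]."""
    return np.mean((delta - theta) ** 2)  # quadratic loss

grid = np.linspace(0, 1, 1001)
theta_B = grid[np.argmin([posterior_risk(d) for d in grid])]

print(theta_B, theta.mean())  # argmin matches the posterior mean, 3/8
```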

3
Q

Def quadratic loss

A

L(θ, δ) = (δ − h(θ))² in the scalar case; more generally, ||δ − h(θ)||² for h(θ) ∈ R^q.

4
Q

Maximum a posteriori

A

MAP is defined as the mode of the posterior distribution.
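
A minimal sketch (assuming a Beta(3, 5) posterior for illustration): find the MAP by maximising the log-posterior on a grid.

```python
# Sketch: MAP = mode of the posterior, located by a grid search over
# the log-posterior. Beta(3, 5) is an assumed, illustrative posterior.
import numpy as np
from scipy.stats import beta

grid = np.linspace(0.001, 0.999, 999)
theta_MAP = grid[np.argmax(beta.logpdf(grid, 3, 5))]

print(theta_MAP)  # ~ (3 - 1) / (3 + 5 - 2) = 1/3, the Beta(3, 5) mode
```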

5
Q

Absolute loss

A

L(θ, δ) = |δ − h(θ)|

6
Q

Construct Bayes’ estimator from quadratic loss

A

Under quadratic loss, R(δ) = E[(δ − h(θ))² | x]; expanding, differentiating in δ, and setting to zero gives the posterior mean, θ̂_B = E[h(θ) | x] (worked derivation below).
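
A worked version of the derivation, reconstructed here (standard result, not copied from the lecture notes), in the scalar case h(θ) = θ:

```latex
\begin{align*}
R(\delta) &= \mathbb{E}_{\pi(\theta \mid x)}\!\left[(\delta - \theta)^2\right]
           = \delta^2 - 2\delta\,\mathbb{E}[\theta \mid x] + \mathbb{E}[\theta^2 \mid x] \\
R'(\delta) &= 2\delta - 2\,\mathbb{E}[\theta \mid x] = 0
  \;\Longrightarrow\; \hat{\theta}_B = \mathbb{E}[\theta \mid x]
\end{align*}
% R''(\delta) = 2 > 0, so this stationary point is a minimum.
```
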
7
Q

Summary table of Bayes’ point estimators

A

Loss L(θ, δ) → Bayes estimator θ̂_B (numerical sketch below):
- quadratic (δ − h(θ))² → posterior mean
- absolute |δ − h(θ)| → posterior median
- 0–1 loss → posterior mode (MAP)
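
All three can be read off one posterior; a sketch assuming a Beta(3, 5) posterior for illustration:

```python
# Sketch: the three standard Bayes point estimators from an assumed
# Beta(3, 5) posterior (mean and median from draws, mode from a grid).
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
draws = rng.beta(3, 5, size=100_000)

post_mean = draws.mean()        # quadratic loss
post_median = np.median(draws)  # absolute loss
grid = np.linspace(0.001, 0.999, 999)
post_mode = grid[np.argmax(beta.pdf(grid, 3, 5))]  # 0-1 loss (MAP)

print(post_mean, post_median, post_mode)
```
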
8
Q

When n -> inf, Bayes estimates go?

A

They converge to the maximum likelihood estimate: as n -> inf the data dominate and the influence of the prior vanishes.

9
Q

As n -> inf, posterior variance?

A

-> 0

Posterior dist concentrates around a point
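
A sketch illustrating this card and the previous one (assumed Beta(1, 1) prior with Bernoulli(0.3) data, conjugate update): the posterior mean approaches the truth while the posterior variance shrinks to 0.

```python
# Sketch: Beta-Bernoulli conjugate model. Posterior is
# Beta(1 + sum(x), 1 + n - sum(x)); its variance -> 0 as n grows.
import numpy as np

rng = np.random.default_rng(0)
for n in [10, 100, 1_000, 10_000]:
    x = rng.binomial(1, 0.3, size=n)         # Bernoulli(0.3) data
    a, b = 1 + x.sum(), 1 + n - x.sum()      # conjugate update
    post_mean = a / (a + b)
    post_var = a * b / ((a + b) ** 2 * (a + b + 1))
    print(n, round(post_mean, 4), post_var)  # mean -> 0.3, var -> 0
```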

10
Q

The last written page of lecture 31

A

Is not examinable

11
Q

Def the prior predictive dist

A

p(x) = ∫ f(x | θ) π(θ) dθ

The marginal distribution of the data before any are observed: the likelihood integrated against the prior.
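
A Monte Carlo sketch of the definition (assumed Beta(1, 1) prior and Binomial(10, θ) likelihood, not the lecture's example): average the likelihood over prior draws.

```python
# Sketch: prior predictive p(x) = integral of f(x | theta) pi(theta) dtheta,
# estimated by averaging the likelihood over draws from the prior.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
theta = rng.beta(1, 1, size=100_000)  # draws from the Beta(1, 1) prior

p_x = binom.pmf(4, 10, theta).mean()  # Monte Carlo estimate of p(x = 4)
print(p_x)  # uniform prior => Beta-Binomial: p(x) = 1/11 for each x
```
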
12
Q

General distinction between (1) prior prediction and (2) posterior prediction

A

(1) No data seen yet: predict using only the prior guess about θ.

(2) A marginal conditional: predict given the observed data, with θ integrated out.

In general, integrate the sampling density with respect to the prior/posterior.

13
Q

How to use kernel

A

When you find the kernel of a known distribution under an integral, the integral equals the inverse of that distribution's normalisation constant, because the full (normalised) density integrates to 1.
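
A standard worked example of this (not from the lecture): recognising a Gamma(a, b) kernel under an integral.

```latex
% The Gamma(a, b) density integrates to 1:
%   \int_0^\infty \frac{b^a}{\Gamma(a)}\,\theta^{a-1} e^{-b\theta}\,d\theta = 1
% so the bare kernel integrates to the inverse normalising constant:
\[
  \int_0^\infty \theta^{a-1} e^{-b\theta}\, d\theta = \frac{\Gamma(a)}{b^a}
\]
```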

14
Q

Lecture 32 ‘a longer example’, non-informative priors

A

Not examinable

15
Q

Posterior predictive dist

A

p(x̃ | x) = ∫ f(x̃ | θ) π(θ | x) dθ

The distribution of a new observation x̃ given the data x: the likelihood integrated against the posterior.
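
A sampling sketch (assumed Beta(3, 5) posterior and Binomial(10, θ) likelihood, for illustration): draw θ from the posterior, then a new observation from the model.

```python
# Sketch: sample from the posterior predictive by composition:
# theta ~ pi(theta | x), then x_new ~ f(x_new | theta).
import numpy as np

rng = np.random.default_rng(0)
theta = rng.beta(3, 5, size=100_000)  # assumed posterior draws
x_new = rng.binomial(10, theta)       # one x_new per theta draw

pmf = np.bincount(x_new, minlength=11) / x_new.size
print(pmf)  # empirical posterior predictive over {0, ..., 10}
```
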
16
Q

Lecture 33

A

Also not examinable (I think?)