Week 9 (Bayesian LR & Kernels) Flashcards

(16 cards)

1
Q

Define a conjugate prior over W

A
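The answer image is missing from this export; a standard formulation (in the Bishop-style notation used elsewhere in the deck) is:

```latex
% A prior p(w) is conjugate to the likelihood if the resulting
% posterior p(w | t) has the same functional form as the prior.
% The Gaussian likelihood is the exponential of a quadratic in w,
% so the conjugate prior over w is Gaussian:
p(\mathbf{w}) = \mathcal{N}(\mathbf{w} \mid \mathbf{m}_0, \mathbf{S}_0)
```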
2
Q

Common choice of prior & resulting posterior mean and variance

A
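The answer image is missing; a standard sketch (assuming prior precision α and noise precision β, with design matrix Φ) is:

```latex
% Common choice: zero-mean isotropic Gaussian prior
p(\mathbf{w} \mid \alpha) = \mathcal{N}(\mathbf{w} \mid \mathbf{0}, \alpha^{-1}\mathbf{I})
% Resulting Gaussian posterior p(w | t) = N(w | m_N, S_N) with
\mathbf{m}_N = \beta\, \mathbf{S}_N \boldsymbol{\Phi}^\top \mathbf{t},
\qquad
\mathbf{S}_N^{-1} = \alpha \mathbf{I} + \beta\, \boldsymbol{\Phi}^\top \boldsymbol{\Phi}
```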
3
Q

Diagrams of Bayesian linear regression likelihood, posterior and data space

A
4
Q

Form of the predictive distribution

A

We average over all possible w, weighted by the posterior:

p(t|w, β) is the likelihood of the new target
p(w|t, α, β) is the posterior, which provides the weighting
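The missing equation, in the standard form for this model (assuming a Gaussian posterior N(m_N, S_N) over w):

```latex
p(t \mid \mathbf{t}, \alpha, \beta)
  = \int p(t \mid \mathbf{w}, \beta)\, p(\mathbf{w} \mid \mathbf{t}, \alpha, \beta)\, d\mathbf{w}
% which is Gaussian:
  = \mathcal{N}\!\left(t \mid \mathbf{m}_N^\top \boldsymbol{\phi}(\mathbf{x}),\ \sigma_N^2(\mathbf{x})\right),
\qquad
\sigma_N^2(\mathbf{x}) = \frac{1}{\beta}
  + \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{S}_N\, \boldsymbol{\phi}(\mathbf{x})
```

The variance has two parts: 1/β is the noise on the data, and the second term is the uncertainty in w, which shrinks as more data arrives.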

5
Q

Graphs of predictive distributions for sinusoidal data with Gaussian basis functions

A
6
Q

What is the equivalent kernel

A

We can write the mean of the predictive distribution as a weighted sum of the training target values
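Written out (standard form, assuming the posterior mean m_N = βS_NΦᵀt from the earlier card):

```latex
y(\mathbf{x}, \mathbf{m}_N)
  = \mathbf{m}_N^\top \boldsymbol{\phi}(\mathbf{x})
  = \sum_{n=1}^{N} k(\mathbf{x}, \mathbf{x}_n)\, t_n,
\qquad
k(\mathbf{x}, \mathbf{x}')
  = \beta\, \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{S}_N\, \boldsymbol{\phi}(\mathbf{x}')
```

k(x, x') is the equivalent kernel: the prediction at x is a weighted combination of the observed targets, with weights given by the kernel.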

7
Q

Properties of equivalent kernel

A

Even non-local basis functions have local equivalent kernels (see image)

The kernel relates to the covariance of the predictions

Normalisation: the kernel weights over the training points sum to one

Inner product: the kernel can be written as an inner product of feature vectors
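A small numerical sketch of the normalisation property, using a hypothetical toy setup (Gaussian bumps plus a bias basis function, with a weak prior so the fit to constant data is nearly exact):

```python
import numpy as np

# Toy training inputs and Gaussian basis functions (illustrative choices)
x_train = np.linspace(0, 1, 50)
centres = np.linspace(0, 1, 12)
s = 0.1  # basis-function width

def phi(x):
    """Design row(s): bias term + Gaussian bumps."""
    x = np.atleast_1d(x)
    g = np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * s ** 2))
    return np.hstack([np.ones((len(x), 1)), g])

alpha, beta = 1e-3, 25.0          # weak prior precision, noise precision
Phi = phi(x_train)                # N x M design matrix
S_N = np.linalg.inv(alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi)

def k(x, xp):
    """Equivalent kernel k(x, x') = beta * phi(x)^T S_N phi(x')."""
    return beta * (phi(x) @ S_N @ phi(xp).T)

# Normalisation: kernel weights over the training set sum to ~1
weights = k(np.array([0.5]), x_train)   # shape (1, N)
print(weights.sum())                     # close to 1
```

Summing the kernel weights is the same as predicting at x with all targets set to 1, which the bias basis function fits almost perfectly when α is small, so the sum is close to one.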

8
Q

Show the kernel as a covariance function

A
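The answer image is missing; the standard identity (using the posterior covariance S_N and the equivalent kernel k from the previous cards) is:

```latex
\operatorname{cov}\!\left[y(\mathbf{x}),\, y(\mathbf{x}')\right]
  = \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{S}_N\, \boldsymbol{\phi}(\mathbf{x}')
  = \beta^{-1}\, k(\mathbf{x}, \mathbf{x}')
```

So nearby points have highly correlated predictions, and the correlation falls off with distance in the same way as the kernel.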
9
Q

Kernel as a covariance function

A
10
Q

Normalization and inner product for kernel

A
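The answer image is missing; the two properties in their standard form (assuming the definitions from the earlier kernel cards) are:

```latex
% Normalisation (for all x):
\sum_{n=1}^{N} k(\mathbf{x}, \mathbf{x}_n) = 1
% Inner-product form:
k(\mathbf{x}, \mathbf{z}) = \boldsymbol{\psi}(\mathbf{x})^\top \boldsymbol{\psi}(\mathbf{z}),
\qquad
\boldsymbol{\psi}(\mathbf{x}) = \beta^{1/2}\, \mathbf{S}_N^{1/2}\, \boldsymbol{\phi}(\mathbf{x})
```

Note the kernel weights sum to one but can be negative, so this is not a probability distribution over the training points.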
11
Q

Model evidence

A
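The answer image is missing; a standard sketch of the definition is:

```latex
% The model evidence (marginal likelihood) for model M_i
% integrates the likelihood over the prior on the parameters:
p(\mathcal{D} \mid \mathcal{M}_i)
  = \int p(\mathcal{D} \mid \mathbf{w}, \mathcal{M}_i)\,
         p(\mathbf{w} \mid \mathcal{M}_i)\, d\mathbf{w}
```

It is the probability the model assigns to the observed data after averaging out the parameters, and it is what Bayes' theorem uses to compare models.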
12
Q

Predictive mixture distribution

A

Having computed p(Mᵢ | D), the posterior probability of model i, the predictive distribution is a mixture of the L models' predictive distributions, weighted by these posteriors.
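In symbols (standard form for a mixture over L models):

```latex
p(t \mid \mathbf{x}, \mathcal{D})
  = \sum_{i=1}^{L} p(t \mid \mathbf{x}, \mathcal{M}_i, \mathcal{D})\,
                   p(\mathcal{M}_i \mid \mathcal{D})
```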
13
Q

Calculating model evidence/marginal likelihood

A
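The answer image is missing; for the Bayesian linear regression model of this deck, the standard closed-form log evidence (a sketch, assuming M basis functions and N data points) is:

```latex
\ln p(\mathbf{t} \mid \alpha, \beta)
  = \frac{M}{2}\ln\alpha + \frac{N}{2}\ln\beta
    - E(\mathbf{m}_N) - \frac{1}{2}\ln|\mathbf{A}| - \frac{N}{2}\ln(2\pi)
% where
E(\mathbf{m}_N)
  = \frac{\beta}{2}\,\|\mathbf{t} - \boldsymbol{\Phi}\mathbf{m}_N\|^2
    + \frac{\alpha}{2}\,\mathbf{m}_N^\top \mathbf{m}_N,
\qquad
\mathbf{A} = \alpha\mathbf{I} + \beta\,\boldsymbol{\Phi}^\top\boldsymbol{\Phi}
```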
14
Q

Approximating model evidence

A
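The answer image is missing. If this card refers to the usual crude approximation of the evidence integral (a point estimate at the most probable parameters plus an Occam penalty for model complexity), a common BIC-style sketch is:

```latex
\ln p(\mathcal{D})
  \approx \ln p(\mathcal{D} \mid \mathbf{w}_{\mathrm{MAP}})
          - \frac{M}{2}\ln N
```

The first term rewards fit; the second penalises the number of parameters M as the data set size N grows.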
15
Q

Full Bayesian predictive distribution

A

Also known as empirical Bayes / type-2 maximum likelihood / generalised maximum likelihood / the evidence approximation
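In symbols: the fully Bayesian predictive distribution also integrates over the hyperparameters, and the evidence approximation replaces that integral with a point estimate at the mode:

```latex
p(t \mid \mathbf{t})
  = \iiint p(t \mid \mathbf{w}, \beta)\,
           p(\mathbf{w} \mid \mathbf{t}, \alpha, \beta)\,
           p(\alpha, \beta \mid \mathbf{t})\,
           d\mathbf{w}\, d\alpha\, d\beta
  \;\simeq\; p(t \mid \mathbf{t}, \hat{\alpha}, \hat{\beta})
```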

16
Q

Computing empirical Bayes

A

Here the prior p(α, β) is assumed to be flat, so
p(α, β | t) is proportional to p(t | α, β), and maximising the posterior over (α, β) reduces to maximising the marginal likelihood
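The standard re-estimation equations for maximising the evidence (a sketch, assuming λᵢ are the eigenvalues of βΦᵀΦ and γ is the effective number of well-determined parameters) are:

```latex
\gamma = \sum_{i} \frac{\lambda_i}{\alpha + \lambda_i},
\qquad
\alpha = \frac{\gamma}{\mathbf{m}_N^\top \mathbf{m}_N},
\qquad
\frac{1}{\beta} = \frac{1}{N - \gamma}
  \sum_{n=1}^{N} \left( t_n - \mathbf{m}_N^\top \boldsymbol{\phi}(\mathbf{x}_n) \right)^2
```

These are iterated to convergence, since m_N, S_N and γ themselves depend on α and β.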