Week 5 (GMM) Flashcards

(11 cards)

1
Q

Combining simplex models to form GMM

A
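
One standard way to fill this in (Bishop-style notation assumed): a GMM is a convex combination of K Gaussian densities, with mixing coefficients constrained to the probability simplex so that the combination is still a valid density.

\[
p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k),
\qquad 0 \le \pi_k \le 1, \qquad \sum_{k=1}^{K} \pi_k = 1.
\]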
2
Q

Log likelihood for GMM

A
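
A sketch of the standard expression (i.i.d. data X = {x_1, ..., x_N} assumed):

\[
\ln p(\mathbf{X} \mid \boldsymbol{\pi}, \boldsymbol{\mu}, \boldsymbol{\Sigma})
= \sum_{n=1}^{N} \ln \left\{ \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k) \right\}.
\]

The sum over k sits inside the logarithm, so there is no closed-form maximum; this is what motivates EM.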
3
Q

Latent variable representation for GMM

A
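
The standard construction, sketched: introduce a one-hot latent vector z per data point, with z_k ∈ {0, 1} and Σ_k z_k = 1, then

\[
p(\mathbf{z}) = \prod_{k=1}^{K} \pi_k^{z_k},
\qquad
p(\mathbf{x} \mid \mathbf{z}) = \prod_{k=1}^{K} \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)^{z_k},
\]

so that marginalising over z recovers the mixture:

\[
p(\mathbf{x}) = \sum_{\mathbf{z}} p(\mathbf{z}) \, p(\mathbf{x} \mid \mathbf{z})
= \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k).
\]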
4
Q

Form responsibility for GMM

A
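
A sketch of the standard definition: the responsibility is the posterior probability, via Bayes' theorem, that component k generated data point x_n:

\[
\gamma(z_{nk}) \equiv p(z_{nk} = 1 \mid \mathbf{x}_n)
= \frac{\pi_k \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)}
       {\sum_{j=1}^{K} \pi_j \, \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_j, \boldsymbol{\Sigma}_j)}.
\]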
5
Q

Derive MLE for μ for GMM

A
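
A sketch of the standard derivation: set the derivative of the log likelihood with respect to μ_k to zero,

\[
0 = -\sum_{n=1}^{N} \gamma(z_{nk}) \, \boldsymbol{\Sigma}_k^{-1} (\mathbf{x}_n - \boldsymbol{\mu}_k),
\]

and solve (multiplying through by Σ_k) to obtain a responsibility-weighted mean:

\[
\boldsymbol{\mu}_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, \mathbf{x}_n,
\qquad N_k \equiv \sum_{n=1}^{N} \gamma(z_{nk}),
\]

where N_k is the effective number of points assigned to component k.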
6
Q

Derive MLE for Σ for GMM

A
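
Similarly, setting the derivative with respect to Σ_k to zero gives a responsibility-weighted covariance (standard result, sketched):

\[
\boldsymbol{\Sigma}_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk})
(\mathbf{x}_n - \boldsymbol{\mu}_k)(\mathbf{x}_n - \boldsymbol{\mu}_k)^{\mathsf{T}}.
\]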
7
Q

MLE for mixing term for GMM

A
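
Sketch: maximise the log likelihood subject to Σ_k π_k = 1 with a Lagrange multiplier λ; this gives λ = -N and

\[
\pi_k = \frac{N_k}{N},
\]

i.e. the mixing coefficient is the fraction of the total responsibility that component k takes for the data.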
8
Q

Put together all MLE for GMM

A
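
Collected together (standard summary, assuming the notation above):

\[
N_k = \sum_{n=1}^{N} \gamma(z_{nk}), \qquad
\boldsymbol{\mu}_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, \mathbf{x}_n, \qquad
\boldsymbol{\Sigma}_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk})
(\mathbf{x}_n - \boldsymbol{\mu}_k)(\mathbf{x}_n - \boldsymbol{\mu}_k)^{\mathsf{T}}, \qquad
\pi_k = \frac{N_k}{N}.
\]

These are not closed-form solutions, because the responsibilities γ(z_nk) themselves depend on the parameters; iterating between the two is exactly the EM algorithm.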
9
Q

EM algorithm for GMM

A

Initialise μ_k, Σ_k, π_k, then repeat until converged:
1) E step: calculate the responsibility γ(z_nk) for each pairwise combination of n and k, using the current parameter values.
2) M step: plug these responsibilities into the MLE formulas from the previous cards.
3) Update the parameter values (μ_k, Σ_k, π_k) with these MLEs, then check the log likelihood for convergence.
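
This maps directly onto code; below is a minimal runnable sketch in NumPy/SciPy (the function name em_gmm and all variable names are illustrative, not from the course):

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iters=100):
    # X: (N, D) data matrix; K: number of mixture components.
    N, D = X.shape
    rng = np.random.default_rng(0)
    # Initialise: K distinct data points as means, identity covariances, uniform mixing.
    mu = X[rng.choice(N, size=K, replace=False)].astype(float)
    Sigma = np.stack([np.eye(D) for _ in range(K)])
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iters):
        # E step: gamma[n, k] is proportional to pi_k * N(x_n | mu_k, Sigma_k).
        gamma = np.stack(
            [pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k]) for k in range(K)],
            axis=1,
        )
        gamma /= gamma.sum(axis=1, keepdims=True)  # normalise over components
        # M step: closed-form MLE updates given the responsibilities.
        Nk = gamma.sum(axis=0)                     # effective points per component
        mu = (gamma.T @ X) / Nk[:, None]           # responsibility-weighted means
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k]
        pi = Nk / N                                # mixing coefficients
    return pi, mu, Sigma

In practice one would monitor the log likelihood for convergence rather than running a fixed number of iterations, and guard against the singularity problem from the next card.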

10
Q

Problems with GMM

A

Singularity - a mixture component can collapse onto a single data point: if its mean sits exactly on x_n and its covariance shrinks toward zero, that component's likelihood contribution, and hence the log likelihood, diverges to infinity.

Identifiability - the MLE solution in a K-component mixture has K! equivalent solutions due to permutation symmetry of the component labels.
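
A one-line illustration of the singularity (assuming a spherical covariance σ²I for the collapsing component):

\[
\mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k = \mathbf{x}_n, \sigma^2 \mathbf{I})
= \frac{1}{(2\pi)^{D/2} \sigma^{D}} \to \infty
\quad \text{as } \sigma \to 0.
\]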

11
Q

Crucial difference between EM and Variational inference

A

In VI the parameters are treated as stochastic (latent) variables, so the parameter vector θ no longer appears in the ELBO decomposition, whereas it does appear in the corresponding decomposition for EM.
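
Side by side (Bishop-style decompositions assumed, with Z collecting all latent quantities; in VI this includes the parameters):

\[
\text{EM:} \quad \ln p(\mathbf{X} \mid \boldsymbol{\theta})
= \mathcal{L}(q, \boldsymbol{\theta}) + \mathrm{KL}\big(q(\mathbf{Z}) \,\|\, p(\mathbf{Z} \mid \mathbf{X}, \boldsymbol{\theta})\big),
\qquad
\text{VI:} \quad \ln p(\mathbf{X})
= \mathcal{L}(q) + \mathrm{KL}\big(q(\mathbf{Z}) \,\|\, p(\mathbf{Z} \mid \mathbf{X})\big).
\]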
