Lecture 13: NMF Flashcards

1
Q

PCA

A

Components are orthogonal
Sparse PCA: components are orthogonal and sparse
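
A minimal scikit-learn sketch of the contrast; the random data and parameter values are made up for illustration and are not from the lecture:

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

# Made-up data: 100 samples, 10 features
rng = np.random.RandomState(0)
X = rng.rand(100, 10)

# PCA: components are orthogonal (their Gram matrix is ~identity)
pca = PCA(n_components=3).fit(X)
print(np.round(pca.components_ @ pca.components_.T, 3))

# Sparse PCA: the alpha penalty pushes many component entries to exactly zero
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
print((spca.components_ == 0).mean())  # fraction of zero entries
```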

2
Q

NMF

A

Latent representation and latent features are non-negative
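
A minimal sketch, assuming scikit-learn's NMF on made-up non-negative data, showing that both factors come out non-negative:

```python
import numpy as np
from sklearn.decomposition import NMF

# Made-up non-negative data (NMF requires X >= 0)
rng = np.random.RandomState(0)
X = rng.rand(100, 10)

nmf = NMF(n_components=3, init="nndsvd", max_iter=500, random_state=0)
W = nmf.fit_transform(X)   # latent representation, shape (100, 3)
H = nmf.components_        # latent features, shape (3, 10)

# Both factors are non-negative, and X is approximated by W @ H
print((W >= 0).all(), (H >= 0).all())
```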

3
Q

Advantages of NMF

A

Easier to interpret
No cancellation effects as in PCA
No sign ambiguity
Can learn an overcomplete representation
Can be used for soft clustering

4
Q

Downsides of NMF

A

Only applies to non-negative data
Interpretability is hit or miss
Non-convex optimization
Slow on large datasets
Components are not orthogonal

5
Q

Manifold learning

A

Allows more complex transformations than linear methods
Better visualizations (usually 2 dimensions or fewer)
Gives a new representation of the training data only, not of test data
Good for EDA

6
Q

t-SNE

A

Starts from a random 2D embedding
Points that are close stay close; points that are far apart stay far apart
Puts more emphasis on points that are close by (preserves local structure)
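
A minimal sketch, assuming scikit-learn's TSNE and the digits dataset as an illustrative example (not from the lecture); note there is only fit_transform, so no embedding for new test points:

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)

# Starts from a random 2D embedding and iteratively moves points so that
# neighbors stay close; emphasis is on preserving local structure
tsne = TSNE(n_components=2, perplexity=30, init="random", random_state=0)
X_2d = tsne.fit_transform(X)   # only fit_transform: no transform() for test data

print(X_2d.shape)  # (1797, 2)
```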

7
Q

Outlier detection: parametric

A

Elliptic envelope: fits a Gaussian model with a robust covariance matrix and mean

Only works if the Gaussian assumption is reasonable
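
A minimal sketch, assuming scikit-learn's EllipticEnvelope and synthetic data with a few injected outliers; the contamination value is an arbitrary choice:

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

# Mostly Gaussian data with a few injected outliers (illustrative only)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(95, 2), rng.uniform(-8, 8, size=(5, 2))])

# Fits a robust mean and covariance matrix; assumes inliers are roughly Gaussian
ee = EllipticEnvelope(contamination=0.05, random_state=0).fit(X)
labels = ee.predict(X)        # +1 = inlier, -1 = outlier
print((labels == -1).sum())   # number of points flagged as outliers
```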

8
Q

Kernel density

A

Non-parametric model
Need to tune the kernel bandwidth
Does not work well in high dimensions
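
A minimal sketch, assuming scikit-learn's KernelDensity; the bandwidth and the 5% threshold are arbitrary choices that would need tuning:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# Synthetic data with a few injected outliers (illustrative only)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(95, 2), rng.uniform(-8, 8, size=(5, 2))])

# Non-parametric density estimate; bandwidth must be tuned (e.g. by cross-validation)
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(X)
log_density = kde.score_samples(X)

# Flag the lowest-density 5% of points as outliers (threshold is arbitrary)
threshold = np.percentile(log_density, 5)
print((log_density < threshold).sum())
```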

9
Q

One-class SVM

A

Uses a Gaussian kernel to cover the data
Selects support vectors
Need to select gamma
Specify the expected outlier ratio
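
A minimal sketch, assuming scikit-learn's OneClassSVM; the gamma and nu values (nu being roughly the expected outlier ratio) are arbitrary illustrative choices:

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic data with a few injected outliers (illustrative only)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(95, 2), rng.uniform(-8, 8, size=(5, 2))])

# Gaussian (RBF) kernel covers the data; gamma sets the kernel width,
# nu is roughly the expected fraction of outliers
ocsvm = OneClassSVM(kernel="rbf", gamma=0.1, nu=0.05).fit(X)
labels = ocsvm.predict(X)             # +1 = inlier, -1 = outlier
print(len(ocsvm.support_vectors_))    # support vectors selected by the model
print((labels == -1).sum())
```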

10
Q

Isolation tree

A

Outliers are easier to isolate from the rest (they need fewer random splits)
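
A minimal sketch, assuming scikit-learn's IsolationForest (an ensemble of isolation trees); the contamination value is an arbitrary choice:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic data with a few injected outliers (illustrative only)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(95, 2), rng.uniform(-8, 8, size=(5, 2))])

# Random axis-aligned splits isolate outliers after fewer splits than inliers,
# which gives outliers a higher anomaly score
iso = IsolationForest(n_estimators=100, contamination=0.05, random_state=0).fit(X)
labels = iso.predict(X)       # +1 = inlier, -1 = outlier
print((labels == -1).sum())
```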
