{ "@context": "https://schema.org", "@type": "Organization", "name": "Brainscape", "url": "https://www.brainscape.com/", "logo": "https://www.brainscape.com/pks/images/cms/public-views/shared/Brainscape-logo-c4e172b280b4616f7fda.svg", "sameAs": [ "https://www.facebook.com/Brainscape", "https://x.com/brainscape", "https://www.linkedin.com/company/brainscape", "https://www.instagram.com/brainscape/", "https://www.tiktok.com/@brainscapeu", "https://www.pinterest.com/brainscape/", "https://www.youtube.com/@BrainscapeNY" ], "contactPoint": { "@type": "ContactPoint", "telephone": "(929) 334-4005", "contactType": "customer service", "availableLanguage": ["English"] }, "founder": { "@type": "Person", "name": "Andrew Cohen" }, "description": "Brainscape’s spaced repetition system is proven to DOUBLE learning results! Find, make, and study flashcards online or in our mobile app. Serious learners only.", "address": { "@type": "PostalAddress", "streetAddress": "159 W 25th St, Ste 517", "addressLocality": "New York", "addressRegion": "NY", "postalCode": "10001", "addressCountry": "USA" } }

Lecture 17 Flashcards

(14 cards)

1
Q

PCA: what is it? (pros/cons)

A

Reduces dimension by combining the feature variables in a specific way, retaining the most valuable parts of all of the feature variables. Each of the “new variables” after PCA is uncorrelated with the others, so strongly correlated features can be collapsed into fewer dimensions.
Cons: loss of interpretability

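A minimal NumPy sketch (illustrative, not from the lecture) showing that the new variables produced by PCA are uncorrelated:

    import numpy as np

    rng = np.random.default_rng(0)
    x1 = rng.normal(size=500)
    x2 = 2 * x1 + 0.1 * rng.normal(size=500)   # strongly correlated with x1
    X = np.column_stack([x1, x2])
    Xc = X - X.mean(axis=0)                    # center the features

    lam, U = np.linalg.eigh(Xc.T @ Xc)         # eigenvectors of the covariance-like matrix
    Z = Xc @ U                                 # the "new variables"
    print(np.corrcoef(Z, rowvar=False)[0, 1])  # ~0: the new variables are uncorrelated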
2
Q

PCA: overall approach?

A

Find the directions of maximum variance in a high-dimensional dataset (n dimensions) and project it onto a subspace of smaller dimension (k dimensions, with k < n), while retaining most of the information.

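A compact sketch of that projection in NumPy (the helper name pca_project is illustrative, not from the lecture):

    import numpy as np

    def pca_project(X, k):
        """Project the rows of X onto the top-k variance directions (k < n)."""
        Xc = X - X.mean(axis=0)
        lam, U = np.linalg.eigh(Xc.T @ Xc)  # eigenvalues in ascending order
        return Xc @ U[:, ::-1][:, :k]       # keep the k directions of maximum variance

    X = np.random.default_rng(1).normal(size=(100, 5))  # n = 5 dimensions
    print(pca_project(X, 2).shape)                      # (100, 2): projected onto k = 2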
3
Q

PCA equations

A

A = X.T@X (n×n covariance matrix, assuming X is mean-centered)
Diagonalize A = UDU^{-1} (D: λ1, …, λn; U: u1, …, un)
New features X* = XU, such that X*.T@X* = D
The first 2 columns of X* (the principal components) explain most of the variance (λ1/λtotal + λ2/λtotal, where λtotal = λ1 + … + λn)

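The recipe above in NumPy (a sketch assuming X is mean-centered; np.linalg.eigh is used since A is symmetric):

    import numpy as np

    X = np.random.default_rng(2).normal(size=(200, 4))
    X = X - X.mean(axis=0)            # center so X.T @ X plays the role of A

    A = X.T @ X                       # n x n
    lam, U = np.linalg.eigh(A)        # A = U D U^{-1}
    lam, U = lam[::-1], U[:, ::-1]    # sort eigenvalues in decreasing order

    X_star = X @ U
    print(np.allclose(X_star.T @ X_star, np.diag(lam)))  # True: X*.T @ X* = D
    print((lam[0] + lam[1]) / lam.sum())                 # variance explained by first 2 PCs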
4
Q

Diagonalization

A

An n×n matrix A with n linearly independent eigenvectors u1, …, un can be factored as A = UDU^{-1}, where the columns of U are the normalized linearly independent eigenvectors and D is diagonal with the eigenvalues.
If A has fewer than n linearly independent eigenvectors, it is defective and not diagonalizable.

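A quick NumPy check (illustrative example matrix):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])        # eigenvalues 5 and 2: diagonalizable
    lam, U = np.linalg.eig(A)         # columns of U: normalized eigenvectors
    print(np.allclose(A, U @ np.diag(lam) @ np.linalg.inv(U)))  # True: A = U D U^{-1}
    # A defective matrix such as [[1, 1], [0, 1]] has only one linearly
    # independent eigenvector, so no such factorization exists.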
5
Q

n×n symmetric matrix A with n distinct eigenvalues?

A

Then A is diagonalizable. (Warning: the converse fails; if A is diagonalizable, its eigenvalues are not necessarily distinct.)

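A small NumPy illustration of both directions:

    import numpy as np

    S = np.array([[2.0, 1.0],
                  [1.0, 2.0]])        # symmetric; eigenvalues 3 and 1 are distinct
    lam, U = np.linalg.eigh(S)
    print(np.allclose(S, U @ np.diag(lam) @ U.T))  # True: diagonalizable, U orthogonal
    # Converse caution: np.eye(2) is diagonalizable (it is already diagonal),
    # yet its eigenvalues are 1 and 1, not distinct.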
6
Q

Identity matrix diagonalization

A
Iv = λv with λ = 1 for every vector v, so every nonzero vector is an eigenvector.
I = UDU^{-1} with all eigenvalues equal to 1 and u1 = (1 0 ... 0), u2 = (0 1 0 ... 0), ...
7
Q

SVD

A

Factorization of an m×n matrix: A = UΣV.T, where U is m×m orthogonal, V is n×n orthogonal, and Σ is m×n diagonal with σ1 ≥ σ2 ≥ … ≥ 0.

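Verifying the shapes and the factorization with np.linalg.svd (illustrative random matrix):

    import numpy as np

    A = np.random.default_rng(3).normal(size=(5, 3))  # m = 5, n = 3
    U, s, Vt = np.linalg.svd(A)            # U: 5x5 orthogonal, Vt: 3x3 orthogonal
    Sigma = np.zeros((5, 3))
    Sigma[:3, :3] = np.diag(s)             # m x n "diagonal", σ1 >= σ2 >= ...
    print(np.allclose(A, U @ Sigma @ Vt))  # True: A = U Σ V.T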
8
Q

SVD, V?

A

A.T@A = V@Σ^2@V.T

Columns of V are the eigenvectors of A.T@A (orthonormal: vi·vj = 0 for i ≠ j, hence linearly independent; they are the right singular vectors)

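A NumPy check of this identity (the eigenvectors match the SVD's V up to sign):

    import numpy as np

    A = np.random.default_rng(4).normal(size=(5, 3))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    print(np.allclose(A.T @ A, Vt.T @ np.diag(s**2) @ Vt))  # True: A.T A = V Σ² V.T
    lam = np.linalg.eigvalsh(A.T @ A)                       # ascending eigenvalues
    print(np.allclose(np.sort(s**2), lam))                  # eigenvalues of A.T A are σi²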
9
Q

SVD, A@A.T eigenvectors?

A

A@A.T = U@Σ^2@U.T

Columns of U are eigenvectors of A@A.T (left singular vectors)

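And the mirror check for U (illustrative random matrix):

    import numpy as np

    A = np.random.default_rng(5).normal(size=(4, 2))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    for i in range(len(s)):
        lhs = A @ A.T @ U[:, i]                      # apply A A.T to column i of U
        print(np.allclose(lhs, s[i]**2 * U[:, i]))   # True: eigenvector with eigenvalue σi²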
10
Q

SVD computation

A

1) Form A.T@A
2) V, D = eigen(A.T@A) (columns of V are the right singular vectors; D holds the eigenvalues)
3) σi = √λi, i.e. Σ = D^{1/2} (the singular values, in decreasing order)
4) U = A@V@Σ^{-1} (columns are the left singular vectors; requires σi > 0)

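The four steps in NumPy (a sketch; the helper name svd_via_eigen is illustrative, and it assumes all σi > 0 so Σ^{-1} exists):

    import numpy as np

    def svd_via_eigen(A):
        lam, V = np.linalg.eigh(A.T @ A)   # steps 1-2: eigendecomposition of A.T A
        order = np.argsort(lam)[::-1]      # sort eigenpairs by decreasing eigenvalue
        lam, V = lam[order], V[:, order]
        s = np.sqrt(lam)                   # step 3: σi = √λi
        U = A @ V / s                      # step 4: U = A V Σ^{-1} (divides column i by σi)
        return U, s, V

    A = np.random.default_rng(6).normal(size=(5, 3))
    U, s, V = svd_via_eigen(A)
    print(np.allclose(A, U @ np.diag(s) @ V.T))  # True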
11
Q

A.T@A eigenvalues

A

Non-negative, because A.T@A is symmetric and positive semi-definite (v.T@(A.T@A)@v = ‖Av‖² ≥ 0), so the square roots σi = √λi always exist.

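Numerical sanity check (illustrative):

    import numpy as np

    A = np.random.default_rng(7).normal(size=(4, 3))
    lam = np.linalg.eigvalsh(A.T @ A)   # real, since A.T A is symmetric
    print((lam >= -1e-12).all())        # True: all eigenvalues are (numerically) >= 0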
12
Q

Is an orthogonal matrix singular?

A

No: an orthogonal matrix X is never singular, since X^{-1} = X.T always exists.

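For example (a rotation matrix, chosen for illustration):

    import numpy as np

    Q = np.array([[0.0, -1.0],
                  [1.0,  0.0]])                # orthogonal (a 90° rotation)
    print(np.allclose(np.linalg.inv(Q), Q.T))  # True: Q^{-1} = Q.T, so Q is not singular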
13
Q

Can Σ have zeros on its diagonal?

A

Yes, singular values can be zero; this happens exactly when A is rank-deficient.

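For example, a rank-1 matrix (illustrative):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])                 # rank 1
    print(np.linalg.svd(A, compute_uv=False))  # [5. 0.]: one singular value is zero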
14
Q

Singular values of A?

A

The squared singular values of A are the eigenvalues of A.T@A (σi = √λi).
