SVD Flashcards
(24 cards)
What problem did the Netflix Prize aim to solve?
Predict user preferences based on historical ratings using collaborative filtering.
What technique underlies many recommendation systems?
Matrix factorization using SVD (Singular Value Decomposition).
What does matrix factorization aim to do?
Break a matrix into structured components that capture underlying patterns.
What is the general form of the SVD for a matrix X?
X = S V Dᵀ
What do S, V, and D represent in the SVD?
S: left singular vectors, V: singular values (diagonal), D: right singular vectors.
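A minimal NumPy sketch of this factorization (NumPy names the factors U, the singular values, and Vᵀ; they play the roles of S, V, and Dᵀ here):

```python
import numpy as np

# Small illustrative matrix; any real rectangular matrix works.
X = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

S, v, Dt = np.linalg.svd(X, full_matrices=False)
V = np.diag(v)          # singular values on the diagonal
X_rebuilt = S @ V @ Dt  # X = S V Dᵀ

assert np.allclose(X, X_rebuilt)
```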
What is the rank of a matrix in terms of SVD?
The number of non-zero singular values.
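A sketch of this definition on a deliberately rank-deficient matrix (second column is twice the first, so only one singular value is non-zero):

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

sigma = np.linalg.svd(X, compute_uv=False)
rank = int(np.sum(sigma > 1e-10))  # count non-zero singular values (with a tolerance)
# agrees with np.linalg.matrix_rank(X)
```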
What does linear dependence between columns mean?
At least one column can be written as a linear combination of the others.
What happens to singular values for a rank-deficient matrix?
Some singular values are zero and can be dropped in approximations.
What does a truncated SVD (rank-k) give you?
The best rank-k approximation to the matrix under Frobenius norm.
What does the Eckart–Young theorem state?
Truncated SVD gives the optimal low-rank approximation under Frobenius norm.
What is the Frobenius norm used for in SVD?
To measure reconstruction error between original and approximated matrix.
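The last three cards can be checked numerically. A sketch, assuming NumPy's svd (its U and Vᵀ play the roles of S and Dᵀ): by Eckart–Young, the Frobenius error of the best rank-k approximation equals the square root of the sum of the squared dropped singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))
S, v, Dt = np.linalg.svd(X, full_matrices=False)

k = 2
Xk = S[:, :k] @ np.diag(v[:k]) @ Dt[:k, :]   # truncated (rank-k) SVD
err = np.linalg.norm(X - Xk, "fro")          # reconstruction error

# Eckart–Young: error = sqrt(sum of squared dropped singular values)
assert np.isclose(err, np.sqrt(np.sum(v[k:] ** 2)))
```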
What does each term σᵢ·uᵢ·vᵢᵀ in the SVD represent?
A rank-1 matrix capturing one mode of variation in the data.
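The rank-1 layers can be summed explicitly; a sketch of the same reconstruction written term by term:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))
S, v, Dt = np.linalg.svd(X, full_matrices=False)

# X is the sum of rank-1 layers sigma_i * u_i * v_i^T
layers = [v[i] * np.outer(S[:, i], Dt[i, :]) for i in range(len(v))]
assert np.allclose(X, sum(layers))
```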
What happens to image storage using SVD compression?
It reduces storage size by keeping only top-k singular components.
How do you choose k for SVD image compression?
Use the elbow method or energy preservation (e.g., keep enough components to retain 99% of the total squared singular values).
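A sketch of the energy-preservation rule (here "energy" is taken as the squared singular values; the 99% threshold is just an example):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 30))  # stand-in for a grayscale image
sigma = np.linalg.svd(X, compute_uv=False)

# Cumulative fraction of energy (squared singular values) captured by the top components.
energy = np.cumsum(sigma ** 2) / np.sum(sigma ** 2)

# Smallest k retaining at least 99% of the energy.
k = int(np.searchsorted(energy, 0.99)) + 1
```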
In recommender systems, what does the matrix X represent?
A user-item ratings matrix, often sparse and incomplete.
How does SVD help with recommendation tasks?
By uncovering latent factors and predicting missing entries via low-rank reconstruction.
What are the steps for SVD-based recommendation?
Factorize, truncate, reconstruct, and recommend top items based on scores.
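The steps above can be sketched on a tiny hypothetical ratings matrix (0 marks a missing rating; real systems handle missing entries more carefully, e.g. via iterative matrix completion):

```python
import numpy as np

# Hypothetical 4-user x 3-item ratings matrix.
R = np.array([[5.0, 3.0, 0.0],
              [4.0, 0.0, 1.0],
              [1.0, 1.0, 5.0],
              [0.0, 1.0, 4.0]])

# Factorize and truncate to k latent factors...
S, v, Dt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = S[:, :k] @ np.diag(v[:k]) @ Dt[:k, :]  # ...reconstruct...

# ...and recommend: rank a user's unseen items by predicted score.
user = 0
unseen = np.where(R[user] == 0)[0]
recommended = unseen[np.argsort(-R_hat[user, unseen])]
```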
What do latent dimensions in recommender systems represent?
Hidden features like genre preference or viewing style.
How is SVD connected to PCA?
PCA is the SVD of a mean-centered data matrix.
What does XᵀX equal in terms of SVD?
XᵀX = D V² Dᵀ, the eigendecomposition of XᵀX; when X is mean-centered, XᵀX / (n − 1) is the covariance matrix.
What do the columns of D in PCA represent?
Principal components (eigenvectors of the covariance matrix).
What do the singular values in PCA correspond to?
The square roots of the eigenvalues of XᵀX; dividing σᵢ² by n − 1 gives the eigenvalues of the covariance matrix.
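A quick numerical check of the PCA connection (a sketch; assumes the usual 1/(n − 1) covariance convention):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 4))
Xc = X - X.mean(axis=0)  # mean-center, as PCA requires

# SVD of the centered matrix...
_, sigma, Dt = np.linalg.svd(Xc, full_matrices=False)

# ...versus eigendecomposition of the covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending

# sigma_i^2 / (n - 1) equals the i-th covariance eigenvalue.
assert np.allclose(sigma ** 2 / (len(Xc) - 1), eigvals)
```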
Why is SVD more general than eigen-decomposition?
It applies to any rectangular matrix, whereas eigendecomposition requires a square (diagonalizable) matrix.
What does each σᵢ (singular value) tell us?
The strength or importance of the corresponding mode of variation.