Linear algebra Flashcards
(29 cards)
What does a vector represent in linear algebra?
An ordered list of values that can represent data points, directions, or positions.
What space does a vector with n elements live in?
It lives in ℝⁿ, the n-dimensional real coordinate space.
What does a matrix represent in ML?
A table of numbers used to store datasets or transformations between vector spaces.
What does A ∈ ℝ^{m×n} mean for a matrix?
The matrix A has m rows (samples) and n columns (features).
What does a matrix transpose do?
It flips the matrix over its diagonal, switching rows with columns.
What is the result of transposing a matrix A ∈ ℝ^{m×n}?
A^T ∈ ℝ^{n×m}
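A quick sketch of the transpose in plain Python, storing the matrix as nested lists (an illustrative snippet, not from any particular library):

```python
A = [[1, 2, 3],
     [4, 5, 6]]          # A is 2x3 (m=2 rows, n=3 columns)

# zip(*A) groups the columns of A, so the transpose swaps rows and columns
A_T = [list(row) for row in zip(*A)]

print(A_T)  # [[1, 4], [2, 5], [3, 6]] -- now 3x2
```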
What does a vector norm measure?
The length or size of a vector.
What is the formula for the ℓ₂ (Euclidean) norm?
‖v‖₂ = sqrt(v₁² + v₂² + … + vₙ²)
What is the formula for the ℓ₁ (Manhattan) norm?
‖v‖₁ = |v₁| + |v₂| + … + |vₙ|
What does the ℓ∞ norm return?
The largest absolute value of the vector’s components.
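The three norm formulas above can be computed directly; a minimal sketch with a sample vector chosen for illustration:

```python
import math

v = [3, -4, 12]

l2 = math.sqrt(sum(x * x for x in v))  # Euclidean: sqrt(9 + 16 + 144) = 13
l1 = sum(abs(x) for x in v)            # Manhattan: 3 + 4 + 12 = 19
linf = max(abs(x) for x in v)          # largest absolute component: 12

print(l2, l1, linf)  # 13.0 19 12
```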
What is a tensor?
A generalization of vectors and matrices to higher dimensions.
What is the shape of an image tensor with batch size N, channels C, height H, width W?
ℝ^{N×C×H×W}
What happens during matrix-vector multiplication?
Each output is a dot product between a matrix row and the input vector.
When is matrix-matrix multiplication defined?
When the number of columns in A matches the number of rows in B.
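Both multiplication rules above can be sketched in a few lines of plain Python (`matvec` and `matmul` are hypothetical helper names, not a real library API):

```python
def matvec(A, v):
    # each output entry is the dot product of a row of A with v
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matmul(A, B):
    # defined only when cols(A) == rows(B)
    assert len(A[0]) == len(B)
    B_cols = list(zip(*B))  # columns of B
    return [[sum(a * b for a, b in zip(row, col)) for col in B_cols]
            for row in A]

A = [[1, 2], [3, 4]]
v = [5, 6]
I = [[1, 0], [0, 1]]       # identity matrix

print(matvec(A, v))  # [17, 39]
print(matmul(A, I))  # [[1, 2], [3, 4]] -- multiplying by I leaves A unchanged
```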
What does a matrix inverse do?
It undoes the effect of a matrix transformation, such that A·A⁻¹ = I.
When is a matrix A invertible?
When A is square and has full rank (equivalently, det(A) ≠ 0); then A⁻¹ exists with AA⁻¹ = A⁻¹A = I.
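For a 2×2 matrix the inverse has a closed form; a minimal sketch (the helper name `inverse_2x2` is illustrative, and the singular case is when the determinant is zero):

```python
def inverse_2x2(A):
    # closed-form 2x2 inverse: (1/det) * [[d, -b], [-c, a]]
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is singular (not invertible)"
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[4, 7], [2, 6]]       # det = 4*6 - 7*2 = 10
A_inv = inverse_2x2(A)
# Multiplying A by A_inv recovers the identity (up to floating-point error).
```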
What is a change of basis?
Expressing a vector in a new coordinate system using a transformation matrix.
What does it mean for two vectors to be orthogonal?
Their dot product is zero; they are perpendicular in space.
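A one-line check of orthogonality via the dot product (sample vectors chosen for illustration):

```python
u = [1, 2]
w = [-2, 1]

dot = sum(a * b for a, b in zip(u, w))  # 1*(-2) + 2*1
print(dot)  # 0 -- u and w are orthogonal (perpendicular)
```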
What is an orthogonal matrix?
A square matrix whose transpose equals its inverse: AᵀA = AAᵀ = I.
What is the geometric interpretation of orthogonal matrices?
They preserve length and angles (e.g., rotations).
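A 2D rotation matrix is a standard example of an orthogonal matrix; a sketch verifying that it preserves vector length (angle and vector chosen for illustration):

```python
import math

theta = math.pi / 3
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]   # rotation by 60 degrees

v = [3.0, 4.0]
Rv = [sum(r * x for r, x in zip(row, v)) for row in R]

def norm(u):
    return math.sqrt(sum(x * x for x in u))

print(norm(v), norm(Rv))  # both ≈ 5.0 -- the rotation preserves length
```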
What is an eigenvector?
A nonzero vector that is only scaled (not rotated off its line) by a matrix transformation; its direction is preserved or exactly reversed.
What is an eigenvalue?
The scalar that an eigenvector is scaled by when multiplied by the matrix.
What equation defines eigenvectors and eigenvalues?
A·v = λ·v
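The defining equation can be checked numerically; a minimal sketch using a diagonal matrix, whose eigenvalues are simply its diagonal entries:

```python
A = [[2, 0],
     [0, 3]]   # diagonal matrix: eigenvalues are 2 and 3
v = [1, 0]     # eigenvector belonging to eigenvalue 2
lam = 2

Av = [sum(a * x for a, x in zip(row, v)) for row in A]  # A·v
lam_v = [lam * x for x in v]                            # λ·v

print(Av == lam_v)  # True -- A·v = λ·v holds
```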
Why are eigenvalues and eigenvectors important in ML?
They help in dimensionality reduction and understanding transformations (e.g., PCA).