Linear Algebra Flashcards

(29 cards)

1
Q

What does a vector represent in linear algebra?

A

An ordered list of values that can represent data points, directions, or positions.

2
Q

What space does a vector with n elements live in?

A

It lives in ℝⁿ, the n-dimensional real coordinate space.
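
As a quick sketch of the first two cards, here is a vector in ℝ³ in NumPy (the values are made up for illustration):

```python
import numpy as np

# A vector with n = 3 elements is a point (or direction) in R^3.
v = np.array([1.0, -2.0, 0.5])

print(v.shape)  # (3,) -> a single axis holding 3 real values
```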

3
Q

What does a matrix represent in ML?

A

A rectangular array of numbers used to store datasets or to represent linear transformations between vector spaces.

4
Q

What does A ∈ ℝ^{m×n} mean for a matrix?

A

A is a real-valued matrix with m rows and n columns (in ML, typically m samples and n features).

5
Q

What does a matrix transpose do?

A

It flips the matrix over its main diagonal, turning rows into columns: (A^T)ᵢⱼ = Aⱼᵢ.

6
Q

What is the result of transposing a matrix A ∈ ℝ^{m×n}?

A

A^T ∈ ℝ^{n×m}
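
A small NumPy illustration of both transpose cards, using an arbitrary 2×3 matrix:

```python
import numpy as np

# A is in R^{2x3}; its transpose lives in R^{3x2}.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)    # (2, 3)
print(A.T.shape)  # (3, 2)
print(A.T)        # rows and columns swapped: [[1 4] [2 5] [3 6]]
```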

7
Q

What does a vector norm measure?

A

The length or size of a vector.

8
Q

What is the formula for the ℓ₂ (Euclidean) norm?

A

‖v‖₂ = √(v₁² + v₂² + … + vₙ²)

9
Q

What is the formula for the ℓ₁ (Manhattan) norm?

A

‖v‖₁ = |v₁| + |v₂| + … + |vₙ|

10
Q

What does the ℓ∞ norm return?

A

‖v‖∞ = maxᵢ |vᵢ|, the largest absolute value among the vector’s components.
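
One sketch covering all three norm cards, via `np.linalg.norm` (the vector is arbitrary):

```python
import numpy as np

v = np.array([3.0, -4.0, 1.0])

print(np.linalg.norm(v))              # l2: sqrt(9 + 16 + 1) ~ 5.10
print(np.linalg.norm(v, ord=1))       # l1: |3| + |-4| + |1| = 8
print(np.linalg.norm(v, ord=np.inf))  # l-inf: max absolute component = 4
```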

11
Q

What is a tensor?

A

A generalization of vectors and matrices to arrays with an arbitrary number of axes.

12
Q

What is the shape of an image tensor with batch size N, channels C, height H, width W?

A

ℝ^{N×C×H×W}
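
N×C×H×W is the channels-first layout used by PyTorch, among others; some libraries put channels last (N×H×W×C) instead. A NumPy sketch with made-up sizes:

```python
import numpy as np

# Hypothetical batch: 8 RGB images, each 32 x 32 pixels.
N, C, H, W = 8, 3, 32, 32
batch = np.zeros((N, C, H, W))

print(batch.shape)  # (8, 3, 32, 32)
```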

13
Q

What happens during matrix-vector multiplication?

A

Each output entry is the dot product of one row of the matrix with the input vector.

14
Q

When is matrix-matrix multiplication defined?

A

When the number of columns in A matches the number of rows in B.
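
A sketch of both multiplication cards; the matrices and vector here are arbitrary:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # 2 x 3
x = np.array([1.0, 0.0, -1.0])    # vector in R^3

# Matrix-vector: each output entry is one row of A dotted with x.
print(A @ x)               # [-2. -2.]
print(A[0] @ x, A[1] @ x)  # the same entries, computed row by row

# Matrix-matrix: defined because A has 3 columns and B has 3 rows.
B = np.ones((3, 4))
print((A @ B).shape)       # (2, 4)
```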

15
Q

What does a matrix inverse do?

A

It undoes the effect of the matrix’s transformation: A·A⁻¹ = A⁻¹·A = I.

16
Q

When is a matrix A invertible?

A

When A is square and full rank (equivalently, det(A) ≠ 0); then A⁻¹ exists with AA⁻¹ = A⁻¹A = I.
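
A sketch with an arbitrary invertible 2×2 matrix, plus a singular one for contrast:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # square, det = 5 != 0, so invertible

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True: A A^-1 = I

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # det = 0: singular; np.linalg.inv(S) raises LinAlgError
```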

17
Q

What is a change of basis?

A

Expressing a vector in a new coordinate system using a transformation matrix.

18
Q

What does it mean for two vectors to be orthogonal?

A

Their dot product is zero; they are perpendicular in space.

19
Q

What is an orthogonal matrix?

A

A square matrix whose transpose equals its inverse: A^T A = A A^T = I.

20
Q

What is the geometric interpretation of orthogonal matrices?

A

They preserve length and angles (e.g., rotations).
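
A sketch using a 2-D rotation, the standard example of an orthogonal matrix:

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by 45 degrees

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T is Q's inverse

v = np.array([3.0, 4.0])
print(np.linalg.norm(v), np.linalg.norm(Q @ v))  # both 5.0: length preserved
```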

21
Q

What is an eigenvector?

A

A nonzero vector that the matrix only scales, so the line it spans is unchanged by the transformation.

22
Q

What is an eigenvalue?

A

The scalar that an eigenvector is scaled by when multiplied by the matrix.

23
Q

What equation defines eigenvectors and eigenvalues?

A

Av = λv, where v ≠ 0 is the eigenvector and λ is the corresponding eigenvalue.

24
Q

Why are eigenvalues and eigenvectors important in ML?

A

They help in dimensionality reduction and understanding transformations (e.g., PCA).
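
A sketch that checks the defining equation Av = λv on an arbitrary symmetric 2×2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are the eigenvectors
for lam, v in zip(eigvals, eigvecs.T):
    print(lam, np.allclose(A @ v, lam * v))  # True for each (lambda, v) pair
```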

25
Q

What is the covariance matrix used for?

A

To describe how features co-vary in a dataset.

26
Q

What is the formula for a covariance matrix from data X?

A

S = (1 / (n - 1)) Σᵢ (Xᵢ - X̄)(Xᵢ - X̄)^T, where X̄ is the mean of the samples Xᵢ.

27
Q

What do the diagonal entries of the covariance matrix represent?

A

The variance of each feature.
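
A sketch computing the covariance matrix from made-up data and checking it against `np.cov`:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # 100 samples, 3 features (synthetic data)

# Sample covariance with the 1/(n - 1) factor from the formula above.
Xc = X - X.mean(axis=0)        # center each feature
S = Xc.T @ Xc / (X.shape[0] - 1)

print(np.allclose(S, np.cov(X, rowvar=False)))  # True
print(np.diag(S))  # the diagonal: per-feature variances
```
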
28
Q

What does a dot product tell you?

A

How aligned two vectors are; it measures the projection of one vector onto the other.

29
Q

When is the dot product of two vectors zero?

A

When the vectors are orthogonal (perpendicular).
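
A closing sketch for the last two cards, with arbitrary vectors:

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
c = np.array([0.0, 2.0])

print(a @ b)  # 1.0: positive, so a and b are partly aligned
print(a @ c)  # 0.0: a and c are orthogonal (perpendicular)

# The scalar projection of b onto a is (a . b) / ||a||.
print((a @ b) / np.linalg.norm(a))  # 1.0
```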