Linear Algebra Flashcards

1
Q

Define convex

A

A function is convex if the line segment joining any two points on its graph lies on or above the graph

2
Q

Singular matrix

A

A square matrix whose inverse doesn’t exist (determinant = 0; not full rank)

3
Q

Multi-variate Gaussian

A

f(x) = 1/sqrt[ (2 pi)^d |Sigma| ] exp( -(1/2) (x - m).T Sigma^-1 (x - m) )
where m is the mean vector and Sigma is the d x d covariance matrix
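A minimal NumPy sketch of this density (the helper name `mvn_pdf` is ours, not from any library):

```python
import numpy as np

def mvn_pdf(x, m, cov):
    """Density of a d-dimensional Gaussian N(m, cov) at point x."""
    d = m.shape[0]
    diff = x - m
    norm_const = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    # quadratic form (x - m).T Sigma^-1 (x - m), via solve instead of an explicit inverse
    quad = diff @ np.linalg.solve(cov, diff)
    return norm_const * np.exp(-0.5 * quad)

# sanity check against the 1-D standard normal: pdf(0) = 1 / sqrt(2 pi)
val = mvn_pdf(np.array([0.0]), np.array([0.0]), np.array([[1.0]]))
```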

4
Q

LU Decomposition

A

A = L U (or P A = L U with a permutation matrix P for pivoting), where L is lower triangular with unit diagonal and U is upper triangular. Used to solve linear systems and compute determinants efficiently.
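A small sketch of Doolittle elimination without pivoting (assumes non-zero pivots; the function name is our own):

```python
import numpy as np

def lu_decompose(A):
    """Doolittle LU factorization without pivoting (assumes non-zero pivots)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # elimination multiplier
            U[i, :] -= L[i, k] * U[k, :]  # zero out the entry below the pivot
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu_decompose(A)
```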
5
Q

QR Decomposition

A

A = Q R, where Q has orthonormal columns (Q.T Q = I) and R is upper triangular. Commonly used for least-squares problems and eigenvalue algorithms.
6
Q

Singular Value Decomposition

A

A [m x n] = U [m x m] S [m x n] V.T [n x n]
U and V are orthogonal (unitary in the complex case); S is diagonal with non-negative singular values
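A NumPy check of the factorization (np.linalg.svd returns the singular values as a vector, so we embed them into an m x n matrix here):

```python
import numpy as np

A = np.array([[3.0, 0.0, 2.0], [0.0, 1.0, 0.0]])  # 2 x 3
U, s, Vt = np.linalg.svd(A, full_matrices=True)   # s = singular values
S = np.zeros(A.shape)
np.fill_diagonal(S, s)  # place singular values on the diagonal of the m x n matrix S
```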

7
Q

Eigendecomposition

A

A = Q E Q^-1 for a square matrix A
Columns of Q = eigenvectors
Diagonal of E = eigenvalues
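A NumPy sketch, reconstructing A from its eigendecomposition (the example matrix is our own, chosen to have real distinct eigenvalues):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # eigenvalues 5 and 2
w, Q = np.linalg.eig(A)                 # w = eigenvalues, columns of Q = eigenvectors
E = np.diag(w)
A_rebuilt = Q @ E @ np.linalg.inv(Q)
```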

8
Q

Eigen values and vectors of a symmetric matrix

A

Eigenvalues are real
Eigenvectors are orthogonal (can be chosen orthonormal)

9
Q

Unitary Matrix

A

Conjugate transpose = inverse

10
Q

Positive definite vs positive semi-definite matrices

A

A symmetric matrix A is positive definite if
z.T A z > 0 for every non-zero vector z

Positive semi-definite: z.T A z >= 0 for every z
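One way to check this in NumPy, via the eigenvalues of the symmetric matrix (the helper name and example matrices are ours):

```python
import numpy as np

def is_positive_definite(A, tol=1e-10):
    """Check positive definiteness of a symmetric matrix via its eigenvalues."""
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

A_pd  = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 1 and 3 -> positive definite
A_psd = np.array([[1.0, 1.0], [1.0, 1.0]])  # eigenvalues 0 and 2 -> only semi-definite
```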

11
Q

How to know if a matrix is invertible

A

Determinant is non-zero; equivalently, no eigenvalue is zero (the matrix has full rank)

12
Q

SVD and Rank of matrix

A

Rank of matrix = # of non-zero singular values
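A NumPy check of this fact on a rank-deficient example (the tolerance 1e-10 is an arbitrary cutoff of ours):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # second column = 2 * first -> rank 1
s = np.linalg.svd(A, compute_uv=False)              # singular values only
rank = int(np.sum(s > 1e-10))                       # count non-negligible singular values
```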

13
Q

When does Ax = b have a unique solution? (hint ranks)

A

When rank[A] = rank[A| b] = n
where A is m x n, b is m x 1
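The rank test can be sketched in NumPy as follows (the classifier function and its return strings are our own):

```python
import numpy as np

def solution_type(A, b):
    """Classify Ax = b using the rank criterion on A and the augmented matrix [A|b]."""
    n = A.shape[1]
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rA != rAb:
        return "no solution"
    return "unique" if rA == n else "infinitely many"

A = np.array([[1.0, 1.0], [0.0, 1.0]])  # full rank, n = 2
b = np.array([3.0, 1.0])
```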

14
Q

Dot product vs Cross product

A

Dot product yields a scalar: A . B = ||A|| ||B|| cos alpha
Cross product yields a vector perpendicular to both A and B, with magnitude ||A x B|| = ||A|| ||B|| sin alpha
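A quick NumPy illustration on the standard basis vectors (perpendicular, so the dot product is 0 and the cross product is the third axis):

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

d = np.dot(a, b)    # scalar: ||a|| ||b|| cos(90 deg) = 0
c = np.cross(a, b)  # vector perpendicular to both a and b
```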

15
Q

inverse of a 3x3 matrix

A

A^-1 = 1/|A| adj(A)
adj(A) = transpose of the cofactor matrix
cofactor of a_ij = (-1)^(i + j) det(matrix with row i and column j removed)
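A sketch of this cofactor/adjugate formula in NumPy, checked against np.linalg.inv (the function name and example matrix are ours; this works for any invertible square matrix, not just 3x3):

```python
import numpy as np

def inverse_via_adjugate(A):
    """Matrix inverse from the cofactor/adjugate formula."""
    n = A.shape[0]
    cof = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)  # drop row i, col j
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return cof.T / np.linalg.det(A)  # adjugate = transpose of the cofactor matrix

A = np.array([[2.0, 0.0, 1.0], [1.0, 1.0, 0.0], [0.0, 2.0, 1.0]])
A_inv = inverse_via_adjugate(A)
```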

16
Q

How to diagonalize a matrix?

A

To diagonalize matrix ‘A’, find its eigenvectors and let P be the matrix with the eigenvectors as columns. If A has n linearly independent eigenvectors (n = # of rows/cols), it can be diagonalized as:
D = P^{-1} A P
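A NumPy sketch of this recipe (the example matrix is ours; it has distinct eigenvalues 3 and -1, so it is diagonalizable):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 1.0]])  # eigenvalues 3 and -1
w, P = np.linalg.eig(A)                 # columns of P = eigenvectors
D = np.linalg.inv(P) @ A @ P            # should equal diag(w)
```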

17
Q

Measures (L0, L1, L2, L-inf)

A

L0 = # of non-zero elements (not a true norm)
L1 = sum of absolute values (Manhattan)
L2 = Euclidean norm (distance to the origin)
L-inf = max absolute value
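All four measures on one example vector in NumPy (np.linalg.norm covers L1, L2, and L-inf; L0 is just a non-zero count):

```python
import numpy as np

v = np.array([3.0, -4.0, 0.0])

l0 = np.count_nonzero(v)              # 2 non-zero entries
l1 = np.linalg.norm(v, ord=1)         # |3| + |-4| + |0| = 7
l2 = np.linalg.norm(v)                # sqrt(9 + 16) = 5
linf = np.linalg.norm(v, ord=np.inf)  # max(|3|, |4|, |0|) = 4
```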

18
Q

How to test if a matrix is positive definite?
Determinant test
Pivot test

A

All eigenvalues are positive

Determinant test (Sylvester’s criterion):
All leading principal minors > 0
i.e.,
every sub-matrix growing from the single top-left element (adding one row and column at a time, up to the full matrix) has a determinant > 0

Pivot test:
Reduce to an upper triangular matrix by elimination; if all pivots (diagonal elements) are > 0 –> positive definite
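The determinant test can be sketched directly in NumPy (the function name and 2x2 examples are ours):

```python
import numpy as np

def passes_determinant_test(A):
    """Sylvester's criterion: every leading principal minor must be positive."""
    n = A.shape[0]
    return all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, n + 1))

A_pd  = np.array([[2.0, -1.0], [-1.0, 2.0]])  # minors: 2, 3  -> positive definite
A_not = np.array([[1.0, 2.0], [2.0, 1.0]])    # minors: 1, -3 -> not positive definite
```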

19
Q

Properties of eigen values and eigen vectors for:
symmetric matrices

A

Symmetric matrix: A.T = A
Real eigenvalues
Orthogonal eigenvectors

20
Q

Properties of eigen values and eigen vectors for:
positive definite matrices

A

eigen values real and > 0
orthogonal eigen vectors

21
Q

Properties of eigen values and eigen vectors for: orthogonal matrix

A

Orthogonal matrix: Q Q.T = I
| eigenvalue | = 1 (eigenvalues may be complex)
Orthogonal eigenvectors

22
Q

Why is an orthogonal matrix computationally preferable?

A

Because its inverse is just its transpose (Q^-1 = Q.T), so no explicit inversion is needed; multiplying by it also preserves vector norms, which is numerically stable

23
Q

Determinant and eigenvalues

A

det(A) = product of the eigenvalues of A
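A quick NumPy check of this identity (the example matrix is ours, with eigenvalues 2 and 3):

```python
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])  # triangular: eigenvalues 2 and 3, det = 6
eigvals = np.linalg.eigvals(A)
det_from_eigs = np.prod(eigvals)        # 2 * 3 = 6
```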

24
Q

Interpretation of a determinant

A

The factor by which the linear transformation scales volume (a negative determinant also flips orientation)