Lecture 20 Flashcards

1
Q

Data fitting, y=Ax in general ∈ ?

A

y = Ax ∈ range(A). The residual r = b - Ax is orthogonal to all columns of A, since y is the point of range(A) closest to b (in general b ∉ range(A)). Indeed, A.T@r = 0 => A.T@A@x = A.T@b (the normal equations).
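A minimal numpy sketch (my own example, not from the lecture) checking that the least-squares residual is orthogonal to the columns of A:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))       # tall matrix: b is generally not in range(A)
b = rng.standard_normal(10)
x = np.linalg.solve(A.T @ A, A.T @ b)  # normal equations
r = b - A @ x                          # residual
print(A.T @ r)                         # ~ zeros: r orthogonal to every column of A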

2
Q

Condition number for normal equations

A

cond(A.T@A) = (cond(A))^2, so forming the normal equations squares (worsens) the conditioning of the problem. Solution: use the SVD instead (even though Cholesky is less costly!).
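A quick illustration (example data assumed) of the squaring of the condition number:

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5)) @ np.diag([1, 1, 1, 1, 1e-4])  # ill-conditioned columns
print(np.linalg.cond(A))        # large
print(np.linalg.cond(A.T @ A))  # roughly the square of the above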

3
Q

SVD to solve a square system Ax=b? (A nxn)

A

A = UΣV.T

1) Solve Σ@y = U.T@b (where y = V.T@x): yi = (ui.b)/σi
2) Compute x = V@y = y1.v1 + … + yn.vn, i.e. x = Σ_{i=1}^n yi.vi
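A sketch of the two-step SVD solve on assumed random data:

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)
U, S, Vt = np.linalg.svd(A)
y = (U.T @ b) / S             # step 1: yi = (ui.b)/σi
x = Vt.T @ y                  # step 2: x = Σ yi.vi
print(np.allclose(A @ x, b))  # True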

4
Q

Shapes of U, Σ, V in the reduced SVD of a 10x14 matrix A?

A

U 10x10
Σ 10x10
V 14x10
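A shape check with numpy (note that numpy returns Vt, so V is Vt.T):

import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((10, 14))
U, S, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, S.shape, Vt.shape)  # (10, 10) (10,) (10, 14), i.e. V is 14x10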

5
Q

Reduced-SVD solution of the full-rank least-squares problem

A

A = Ur@Σr@V.T with all σi ≠ 0 (full rank).
Substituting into the normal equations A.T@A@x = A.T@b
gives x = V@Σr^{-1}@Ur.T@b (the unique solution).
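A sketch (assumed data) comparing this formula against np.linalg.lstsq:

import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))   # full column rank with probability 1
b = rng.standard_normal(8)
U, S, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / S)        # x = V @ Σr^{-1} @ Ur.T @ b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True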

6
Q

Reduced-SVD solution of the rank-deficient least-squares problem

A

rank(A) < n. Find the x minimizing ||Ax-b||_2 that also has minimal ||x||_2 (the minimum-norm solution).
A = Ur@Σr@V.T
1) Solve Σr@y = Ur.T@b with y = V.T@x: yi = (ui.b)/σi if σi ≠ 0, yi = 0 otherwise.
2) x = V@y = Σ_{σi≠0} yi.vi = Σ_{σi≠0} (ui.b)/σi . vi, that is x = V@Σr^+@Ur.T@b (Σr^+ inverts only the nonzero σi).
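A sketch (my own rank-deficient example) of building Σr^+ by inverting only the nonzero σi:

import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((8, 2))
A = np.column_stack([B, B[:, 0] + B[:, 1]])      # 3 columns, rank 2
b = rng.standard_normal(8)
U, S, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * S[0]  # treat tiny σi as zero
Sinv = np.zeros_like(S)
Sinv[S > tol] = 1.0 / S[S > tol]                 # Σr^+
x = Vt.T @ (Sinv * (U.T @ b))
print(np.allclose(x, np.linalg.pinv(A) @ b))     # minimum-norm solution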

7
Q

Cost of solving min_x ||b-Ax||_2^2 (A = UΣV.T already calculated!)

A

O(mn). Indeed, x = V@Σr^+@Ur.T@b: forming Ur.T@b costs O(mn), scaling by Σr^+ costs O(n), and multiplying by V costs O(n^2); since m ≥ n, the O(mn) term dominates.

8
Q

Rank(A) necessary to solve least squares?

A

Rank(A) must be at least the number of coefficients being fit, i.e. A must have full column rank; otherwise the problem is rank-deficient and the fit gives bad results.
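A check (hypothetical example) with a cubic fit, which has 4 coefficients:

import numpy as np

x_bad = np.array([1.0, 1.0, 2.0, 2.0])  # only 2 distinct points
x_ok = np.array([1.0, 2.0, 3.0, 4.0])   # 4 distinct points
for x in (x_bad, x_ok):
    A = np.vander(x, 4)                 # columns x^3, x^2, x, 1
    print(np.linalg.matrix_rank(A))     # 2, then 4: only the second can fit a cubic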

9
Q

Python eigenvalues, eigenvectors

A

eigvals, eigvecs = np.linalg.eigh(A)

Note: eigh is for symmetric/Hermitian A (eigenvalues returned in ascending order); use np.linalg.eig for general matrices.
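A minimal usage check (example matrix assumed):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric, as eigh requires
eigvals, eigvecs = np.linalg.eigh(A)  # ascending eigenvalues
print(eigvals)                        # [1. 3.]
print(np.allclose(A @ eigvecs, eigvecs * eigvals))  # True: columns are eigenvectors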

10
Q

Python svd

A

U, S, Vt = np.linalg.svd(A, full_matrices=False)

full_matrices=False gives the reduced SVD. Be careful: Vt (not V) is returned, and S is a 1D array of singular values (use np.diag to turn it into a matrix).
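A reconstruction check (assumed data) showing why np.diag is needed:

import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((10, 14))
U, S, Vt = np.linalg.svd(A, full_matrices=False)
print(S.shape)                              # (10,): 1D array of σi
print(np.allclose(A, U @ np.diag(S) @ Vt))  # True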

11
Q

Python least squares

A

solution, residuals, rank, sigma = np.linalg.lstsq(A, b, rcond=None)

sigma holds the singular values of A; pass rcond=None to use machine-precision-based rank detection (and avoid a FutureWarning).
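A minimal usage sketch (my own data):

import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
solution, residuals, rank, sigma = np.linalg.lstsq(A, b, rcond=None)
print(solution, rank)  # fitted coefficients, rank(A) == 2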

12
Q

Python linear/poly reg

A

from scipy import stats
slope, intercept, rvalue, pvalue, stderr = stats.linregress(x, y)
cubic, quadratic, linear, intercept = np.polyfit(x, y, 3)
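A sanity check (toy data assumed) on exact linear data:

import numpy as np
from scipy import stats

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1
slope, intercept, rvalue, pvalue, stderr = stats.linregress(x, y)
print(slope, intercept)     # 2.0 1.0
print(np.polyfit(x, y, 1))  # [2. 1.]: coefficients ordered highest degree first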

13
Q

Cholesky factorization, why?

A

Because the normal-equations matrix A.T@A is symmetric positive definite (when A has full column rank), the normal equations can be solved by computing and using a Cholesky factorization.
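A sketch (assumed data) of the Cholesky route, checked against lstsq:

import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(7)
A = rng.standard_normal((10, 3))  # full column rank => A.T@A is SPD
b = rng.standard_normal(10)
c, low = cho_factor(A.T @ A)      # Cholesky factorization of A.T@A
x = cho_solve((c, low), A.T @ b)  # solve the normal equations
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True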

14
Q

Solve non-linear least-squares

A

Use steepest descent on f(x) = ½||r(x)||_2^2; its gradient is ∇f(x) = J(x).T@r(x), where J is the Jacobian of the residual r, so each step moves along -J.T@r.
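A toy steepest-descent sketch (model, data, and step size all assumed, not from the lecture) for fitting y ≈ c0·exp(c1·t):

import numpy as np

t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)  # synthetic data, true c = (2.0, -1.5)
c = np.array([1.0, 0.0])    # initial guess
alpha = 0.01                # fixed step size, chosen by hand
for _ in range(5000):
    r = c[0] * np.exp(c[1] * t) - y                     # residual r(c)
    J = np.column_stack([np.exp(c[1] * t),
                         c[0] * t * np.exp(c[1] * t)])  # Jacobian of r
    c -= alpha * (J.T @ r)  # step along -∇f = -J.T@r
print(c)                    # ≈ [2.0, -1.5]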
