Exam 3 Flashcards

1
Q

5.1 Eigenvector Definition

A

a nonzero vector x such that Ax = λx for some scalar λ.

2
Q

5.1 eigenvalue

A

λ is an eigenvalue of A if there is a nontrivial solution x of Ax = λx

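A quick numerical illustration of these two definitions; a minimal sketch assuming NumPy is available, with A as a made-up example matrix.

```python
import numpy as np

# A hypothetical 2x2 example matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]          # a nonzero vector x
    # Check the defining equation Ax = lambda * x.
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)                  # the eigenvalues of A (here 5 and 2)
```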
3
Q

5.1 Theorem 1

A

eigenvalues of a triangular matrix are the entries on its main diagonal

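A sketch of Theorem 1 in NumPy (assumed available); the triangular matrix T below is a made-up example.

```python
import numpy as np

# A hypothetical upper triangular matrix.
T = np.array([[3.0, 6.0, -8.0],
              [0.0, 0.0,  6.0],
              [0.0, 0.0,  2.0]])

# Its eigenvalues should be exactly the diagonal entries 3, 0, 2.
eigenvalues = np.linalg.eigvals(T)
assert np.allclose(np.sort(eigenvalues), np.sort(np.diag(T)))
```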
4
Q

5.1 Theorem 2

A

If v1, …, vr are eigenvectors that correspond to distinct eigenvalues of an nxn matrix A, then the set {v1, …, vr} is linearly independent

5
Q

5.2 Invertible Matrix Theorem: A is invertible if and only if __ is __ an eigenvalue of A

A

A is invertible if and only if the number 0 is not an eigenvalue of A

6
Q

What does det AB equal

A

(det A)(det B)

7
Q

What does det A^T equal

A

det A

8
Q

Does row replacement change the determinant

A

No, row replacement does not change the determinant. (A row interchange is what changes the sign of the determinant.)

9
Q

How does row scaling affect the determinant

A

Scaling a row by a scalar k multiplies the determinant by k

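The determinant facts on cards 6-9 can be checked numerically; a minimal sketch assuming NumPy, with A and B as arbitrary example matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))
B = rng.random((3, 3))

# det(AB) = (det A)(det B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# det(A^T) = det(A)
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# Row replacement (adding a multiple of one row to another) leaves det unchanged.
R = A.copy()
R[1] += 2.5 * R[0]
assert np.isclose(np.linalg.det(R), np.linalg.det(A))

# Scaling one row by k multiplies the determinant by k.
S = A.copy()
S[2] *= 7.0
assert np.isclose(np.linalg.det(S), 7.0 * np.linalg.det(A))
```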
10
Q

multiplicity

A

the number of times an eigenvalue occurs as a root of the characteristic polynomial (its algebraic multiplicity)

11
Q

Similarity

A

A is similar to B if there is an invertible matrix P such that P^-1 A P = B (equivalently, A = PBP^-1)

12
Q

5.3 A matrix is diagonalizable if

A

A is similar to a diagonal matrix

13
Q

5.3 diagonalization theorem

A

An nxn matrix A is diagonalizable if and only if A has n linearly independent eigenvectors (enough to form a basis of R^n).

14
Q

5.3 A = PDP^-1, with D a diagonal matrix, if and only if what

A

the columns of P are n linearly independent eigenvectors of A; in that case the diagonal entries of D are the corresponding eigenvalues

15
Q

5.3 Theorem 6

A

An nxn matrix with n distinct eigenvalues is diagonalizable

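A sketch of diagonalization (cards 11-15) assuming NumPy; the matrix below is a made-up example with two distinct eigenvalues, so Theorem 6 guarantees it is diagonalizable.

```python
import numpy as np

# Hypothetical example with distinct eigenvalues 5 and 3.
A = np.array([[ 7.0, 2.0],
              [-4.0, 1.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P: linearly independent eigenvectors of A
D = np.diag(eigenvalues)            # corresponding eigenvalues on the diagonal

# A = P D P^-1, so A is similar to the diagonal matrix D.
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Equivalently, P^-1 A P = D.
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```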
16
Q

5.4 Theorem 8: Diagonal Matrix Representation

A

Suppose A = PDP^-1, where D is a diagonal nxn matrix. If B is the basis for R^n formed from the columns of P, then D is the B-matrix for the transformation x -> Ax

17
Q

6.1 Inner Product/ Dot Product

A

For u = [u1, …, un] and v = [v1, …, vn]: u*v = u1v1 + u2v2 + … + unvn

18
Q

6.1 length/norm of a vector

A

||v|| = sqrt(v*v) = sqrt(v1^2 + … + vn^2)

19
Q

6.1 ||cv|| = ?

A

|c| ||v||

20
Q

6.1 unit vector

A

vector whose length is 1

21
Q

6.1 How to find the distance between u and v

A

dist(u,v) = ||u-v||

22
Q

6.1 Orthogonal

A

Two vectors u and v are orthogonal if u*v = 0
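A sketch of these 6.1 definitions (cards 17-22) assuming NumPy; u and v are made-up example vectors, chosen so that u*v = 0.

```python
import numpy as np

u = np.array([3.0, -4.0, 0.0])
v = np.array([4.0,  3.0, 5.0])

# Inner (dot) product: u1*v1 + ... + un*vn
print(u @ v)                                            # 0.0, so u and v are orthogonal

# Length/norm: ||u|| = sqrt(u.u)
assert np.isclose(np.linalg.norm(u), np.sqrt(u @ u))    # 5.0 here

# ||c u|| = |c| ||u||
c = -2.0
assert np.isclose(np.linalg.norm(c * u), abs(c) * np.linalg.norm(u))

# Unit vector: u divided by its length has length 1.
u_hat = u / np.linalg.norm(u)
assert np.isclose(np.linalg.norm(u_hat), 1.0)

# Distance between u and v: dist(u, v) = ||u - v||
print(np.linalg.norm(u - v))
```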

23
Q

6.1 A vector x is orthogonal to a subspace W if and only if what?

A

x is orthogonal to every vector in a set that spans W.

x*v = 0 for every vector v in that spanning set

24
Q

6.1 Theorem 3

A

The orthogonal complement of the row space of A is the null space of A. The orthogonal complement of the column space of A is the null space of A^T

25
Q

6.2 Theorem 4

A

If S is an orthogonal set of nonzero vectors, then S is linearly independent and a basis for the subspace spanned by S

26
Q

6.2 Orthogonal Basis

A

A basis for W that is also an orthogonal set

27
Q

6.2 Theorem 5: For each y in W, the weights in the linear combination y = c1u1 + … + cpup are given by ______

A

cj = (y*uj)/(uj*uj)
where j = 1, …, p
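A sketch of Theorem 5 assuming NumPy; u1 and u2 below are a made-up orthogonal basis of a plane W in R^3, and y is built inside W so the weights are known in advance.

```python
import numpy as np

u1 = np.array([1.0,  1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert np.isclose(u1 @ u2, 0.0)            # {u1, u2} is an orthogonal set

y = 2.0 * u1 - 3.0 * u2                    # some y in W = Span{u1, u2}

# cj = (y*uj)/(uj*uj)
c1 = (y @ u1) / (u1 @ u1)
c2 = (y @ u2) / (u2 @ u2)

assert np.isclose(c1, 2.0) and np.isclose(c2, -3.0)
assert np.allclose(y, c1 * u1 + c2 * u2)
```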

28
Q

6.2 Theorem 6: An mxn matrix U has orthonormal columns if and only if ____

A

U^T*U = I (the identity matrix)

29
Q

6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then ||Ux|| = ?

A

||x||

30
Q

6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then (Ux)*(Uy) = ?

A

x*y

31
Q

6.2 Theorem 7: Let U be an mxn matrix with orthonormal columns, and let x and y be in R^n. Then
(Ux)*(Uy) = 0 if and only if ____

A

x*y = 0
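A sketch of Theorems 6 and 7 (cards 28-31) assuming NumPy; U below is a made-up 3x2 matrix with orthonormal columns.

```python
import numpy as np

U = np.array([[1/np.sqrt(2),  2/3],
              [1/np.sqrt(2), -2/3],
              [0.0,           1/3]])

# Theorem 6: U^T*U = I
assert np.allclose(U.T @ U, np.eye(2))

x = np.array([2.0, -1.0])
y = np.array([0.5,  3.0])

# Theorem 7: U preserves lengths and dot products (hence orthogonality).
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
assert np.isclose((U @ x) @ (U @ y), x @ y)
```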

32
Q

6.3 The orthogonal Decomposition Theorem

A

Let W be a subspace of R^n. Then each y in R^n can be written in the form y = yhat + z

where yhat is in W and z is perpendicular to W. If {u1, …, up} is an orthogonal basis of W, then:
yhat = ((y*u1)/(u1*u1))u1 + … + ((y*up)/(up*up))up

and z = y - yhat

33
Q

6.3 If y is in W = Span{u1…up}, then ?

A

proj_W y = y

34
Q

6.3 The best Approximation Theorem

A

Let W be a subspace of R^n, let y be a vector in R^n, and let yhat be the orthogonal projection of y onto W.

Then yhat is the closest point in W to y:
||y - yhat|| < ||y - v||
for all v in W distinct from yhat

35
Q

6.3 Theorem 10: If {u1, …, up} is an orthonormal basis for a subspace W of R^n, then what?

A

proj_W y = (y*u1)u1 + … + (y*up)up

36
Q

6.3 Theorem 10: If U = [u1 … up], then what?

A

proj_W y = UU^T y for all y in R^n
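A sketch of the Orthogonal Decomposition Theorem and Theorem 10 (cards 32-36) assuming NumPy; u1, u2 are a made-up orthogonal basis of W and y is an arbitrary vector in R^3.

```python
import numpy as np

u1 = np.array([ 2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0,  1.0])
assert np.isclose(u1 @ u2, 0.0)             # {u1, u2} is an orthogonal basis of W

y = np.array([1.0, 2.0, 3.0])

# yhat = proj_W y, from the formula on card 32.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

# z is perpendicular to W (orthogonal to every basis vector of W).
assert np.isclose(z @ u1, 0.0) and np.isclose(z @ u2, 0.0)

# Theorem 10: with orthonormal columns U, proj_W y = U U^T y.
U = np.column_stack([u1 / np.linalg.norm(u1), u2 / np.linalg.norm(u2)])
assert np.allclose(U @ U.T @ y, y_hat)
```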

37
Q

4.1 subspace: 3 properties

A

contains the zero vector
is closed under addition
is closed under scalar multiplication