Chapter 6 Flashcards

1
Q

Definition: dot product

A

For vectors u, v in R^n, the inner product (or dot product) of u and v is the scalar u · v = u^T v, where u and v are viewed as n x 1 matrices.
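As a quick numerical illustration (a minimal NumPy sketch; the vectors are made-up example values):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

# Dot product as a scalar: u . v = u^T v
print(np.dot(u, v))   # 1*4 + 2*(-1) + 3*2 = 8.0
print(u @ v)          # same result with the @ operator
```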

2
Q

Theorem 6.1 (Properties of the Dot Product)

A

Let u, v, w be vectors in R^n and let c be a real number. Then:

a) u · v = v · u
b) (u + v) · w = u · w + v · w
c) (cu) · v = c(u · v) = u · (cv)
d) u · u >= 0, and u · u = 0 if and only if u is the zero vector
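These properties can be spot-checked numerically (a sketch with arbitrary example vectors; np.isclose absorbs floating-point round-off):

```python
import numpy as np

u, v, w = np.array([1., 2.]), np.array([3., -1.]), np.array([0., 5.])
c = 2.5

assert np.isclose(u @ v, v @ u)                  # (a) commutativity
assert np.isclose((u + v) @ w, u @ w + v @ w)    # (b) distributes over addition
assert np.isclose((c * u) @ v, c * (u @ v))      # (c) scalars factor out
assert u @ u >= 0                                # (d) nonnegativity
assert np.isclose(np.zeros(2) @ np.zeros(2), 0)  # (d) zero vector gives 0
```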

3
Q

Definition: distance between u and v

A

For vectors u, v in R^n, the distance between u and v, written dist(u, v), is given by dist(u, v) = ||u - v||
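For example (NumPy sketch with made-up values):

```python
import numpy as np

u = np.array([7.0, 1.0])
v = np.array([3.0, 4.0])

# dist(u, v) = ||u - v||
print(np.linalg.norm(u - v))   # sqrt(4^2 + (-3)^2) = 5.0
```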

4
Q

Definition: orthogonal vectors

A

Two vectors u and v are called orthogonal (u is perpendicular to v) if u · v = 0

5
Q

Theorem 6.2 (The Pythagorean Theorem)

A

Two vectors u and v are orthogonal if and only if

|| u + v ||^2 = ||u||^2 + ||v||^2
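A quick numerical check of the theorem for one orthogonal pair (example values only):

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([0.0, 4.0])
assert np.isclose(u @ v, 0)    # u and v are orthogonal

lhs = np.linalg.norm(u + v) ** 2                        # ||u + v||^2 = 25
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2   # 9 + 16 = 25
assert np.isclose(lhs, rhs)
```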

6
Q

Orthogonal Complement

A

Let W be a subspace of R^n and let z be an element of R^n. The vector z is orthogonal to W if z is orthogonal to every vector in W. The orthogonal complement of W, written W perp, is the set of all vectors in R^n orthogonal to W.

7
Q

Theorem 6.3

A

For any m x n matrix A, (Row A) perp = Nul A and (Col A) perp = Nul A^T
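One way to see this numerically is with SciPy's null_space, which returns an orthonormal basis for Nul A (a sketch with an example matrix):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # rank 1, so Nul A is 2-dimensional

N = null_space(A)              # columns form a basis for Nul A
# Every row of A is orthogonal to every basis vector of Nul A,
# illustrating (Row A) perp = Nul A:
print(np.allclose(A @ N, 0))   # True
```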

8
Q

Other facts about W perp

A

For a subspace W of R^n:

1: x is an element of W perp if and only if x is orthogonal to every element in a basis for W
2: W perp is a subspace of R^n

9
Q

Definition: orthogonal set

A

A set {u1, … , up} in R^n is orthogonal if ui · uj = 0 whenever i is not equal to j.
* An orthogonal set of unit vectors is called an orthonormal set.

10
Q

Theorem 6.4

A

If S = {u1, … , up} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent.

11
Q

Orthogonal basis

A

An orthogonal basis for a subspace W of R^n is a basis that is also an orthogonal set.

12
Q

Theorem 6.5

A

Let {u1, … , up} be an orthogonal basis for a subspace W of R^n. For each vector y in W, the weights in the linear combination
y = c1u1 + … + cpup are given by cj = (y · uj)/(uj · uj) for j = 1, … , p
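A sketch in NumPy with a made-up orthogonal basis for W = R^2, recovering the weights:

```python
import numpy as np

# Orthogonal (not orthonormal) basis for W = R^2, example values
u1 = np.array([2.0, 1.0])
u2 = np.array([-1.0, 2.0])
assert np.isclose(u1 @ u2, 0)

y = np.array([3.0, 7.0])
c1 = (y @ u1) / (u1 @ u1)   # weight on u1
c2 = (y @ u2) / (u2 @ u2)   # weight on u2
print(np.allclose(c1 * u1 + c2 * u2, y))   # True: y = c1 u1 + c2 u2
```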

13
Q

Theorem 6.6

A

An m x n matrix U has orthonormal columns if and only if U^T U = I
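A numerical illustration (example values); note that the theorem is about U^T U, not U U^T:

```python
import numpy as np

# A 3 x 2 matrix whose columns are orthonormal (example values)
u1 = np.array([1., 1., 0.]) / np.sqrt(2)
u2 = np.array([1., -1., 0.]) / np.sqrt(2)
U = np.column_stack([u1, u2])

print(np.allclose(U.T @ U, np.eye(2)))   # True: orthonormal columns
print(np.allclose(U @ U.T, np.eye(3)))   # False: U U^T != I unless U is square
```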

14
Q

Theorem 6.8 The Orthogonal Decomposition Theorem

A

Let W be a subspace of R^n. Then each vector y in R^n can be written uniquely as y = y hat + z, where y hat is an element of W and z is an element of W perp. In fact, if {u1, … , up} is an orthogonal basis for W, then

1: y hat = projection of y onto W = ((y · u1)/(u1 · u1))u1 + … + ((y · up)/(up · up))up and
2: z = y - y hat
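A sketch in NumPy, using a made-up orthogonal basis for a plane W in R^3:

```python
import numpy as np

# Orthogonal basis {u1, u2} for a plane W in R^3 (example values)
u1 = np.array([1., 1., 0.])
u2 = np.array([1., -1., 0.])
y  = np.array([2., 3., 5.])

# y hat = projection of y onto W, built term by term from the basis
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat                                             # component in W perp

print(y_hat)                                              # [2. 3. 0.]
print(np.isclose(z @ u1, 0) and np.isclose(z @ u2, 0))    # True: z in W perp
```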

15
Q

Theorem 6.9 The Best Approximation Theorem

A

Let W be a subspace of R^n, let y be a vector in R^n, and let y hat be the projection of y onto W. Then y hat is the closest point in W to y, in the sense that
|| y - y hat || < || y - v ||
for all v in W such that v is not equal to y hat.
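A numerical spot-check (example values; y hat computed with the Theorem 6.8 formula):

```python
import numpy as np

rng = np.random.default_rng(0)

# W = the xy-plane in R^3, with orthogonal basis {u1, u2} (example)
u1, u2 = np.array([1., 0., 0.]), np.array([0., 1., 0.])
y = np.array([2., 3., 5.])
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2

# No other point of W comes closer to y than y hat:
for _ in range(5):
    a, b = rng.normal(size=2)
    v = a * u1 + b * u2            # a random vector in W
    assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - v)
```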

16
Q

Theorem 6.10

A

If {u1, … , up} is an orthonormal basis for a subspace W of R^n and y is a vector in R^n, then
the projection of y onto W = (y · u1)u1 + … + (y · up)up = U U^T y,
where U = [ u1 … up ].
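Both forms can be compared directly in NumPy (example orthonormal basis):

```python
import numpy as np

# Orthonormal basis for W packed as the columns of U (example values)
u1 = np.array([1., 1., 0.]) / np.sqrt(2)
u2 = np.array([1., -1., 0.]) / np.sqrt(2)
U = np.column_stack([u1, u2])

y = np.array([2., 3., 5.])
proj_sum    = (y @ u1) * u1 + (y @ u2) * u2   # term-by-term formula
proj_matrix = U @ U.T @ y                     # matrix form U U^T y
print(np.allclose(proj_sum, proj_matrix))     # True
```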

17
Q

Theorem 6.11 The Gram-Schmidt Process

A

Given a basis {x1, … , xp} for a subspace W of R^n, define
v1 = x1
v2 = x2 - ((x2 · v1)/(v1 · v1))v1
v3 = x3 - ((x3 · v1)/(v1 · v1))v1 - ((x3 · v2)/(v2 · v2))v2
…
vp = xp - ((xp · v1)/(v1 · v1))v1 - ((xp · v2)/(v2 · v2))v2 - … - ((xp · v(p-1))/(v(p-1) · v(p-1)))v(p-1)
Then {v1, … , vp} is an orthogonal basis for W.
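A minimal NumPy implementation of the process (the function name gram_schmidt and the test data are illustrative):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X (assumed linearly independent).

    Returns a matrix whose columns are an orthogonal basis for Col X,
    following the formulas of Theorem 6.11.
    """
    V = []
    for x in X.T:                          # process x1, ..., xp in order
        v = x.copy()
        for u in V:                        # subtract projections onto v1, ..., v(k-1)
            v -= (x @ u) / (u @ u) * u
        V.append(v)
    return np.column_stack(V)

X = np.array([[1., 1.],
              [1., 0.],
              [0., 1.]])
V = gram_schmidt(X)
print(np.isclose(V[:, 0] @ V[:, 1], 0))    # True: columns are orthogonal
```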

18
Q

Least Squares Theorem

A

The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations A^T A x = A^T b. If the columns of A are linearly independent, then A^T A is invertible and there is a unique least-squares solution, namely
x hat = (A^T A)^-1 A^T b
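A sketch in NumPy comparing the normal-equations solution with np.linalg.lstsq (example data):

```python
import numpy as np

# Overdetermined system Ax = b with no exact solution (example data)
A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 2., 2.])

# Normal equations: A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)

# Same answer from NumPy's built-in least-squares routine
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ls))   # True
```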