Chapter 2 Flashcards

1
Q

in the transformation y = Ax, what is A?

A

the (coefficient) matrix of the transformation

2
Q

identity matrix

A

denoted by I sub n, it’s the n by n matrix whose main diagonal consists of 1s and whose other entries are 0s; it’s the matrix of the identity transformation

3
Q

what makes a linear transformation linear?

A

T(v + w) = T(v) + T(w) for all vectors v and w in R^m
T(kv) = kT(v) for all vectors v in R^m and all scalars k
(T(0) = 0 then follows by taking k = 0)
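a quick numpy sanity check of both properties (the matrix and vectors here are made up):

```python
import numpy as np

# a hypothetical 2x2 matrix standing in for the transformation T(x) = Ax
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def T(x):
    return A @ x

v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
k = 4.0

# additivity: T(v + w) = T(v) + T(w)
assert np.allclose(T(v + w), T(v) + T(w))
# homogeneity: T(kv) = kT(v)
assert np.allclose(T(k * v), k * T(v))
# consequence: T(0) = 0
assert np.allclose(T(np.zeros(2)), np.zeros(2))
```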

4
Q

distribution vector

A

a vector organized such that its components add up to 1 and are all positive or 0

5
Q

transition/stochastic matrix

A

a square matrix organized such that all of its columns are distribution vectors - all of its entries are positive or zero and every column sums to 1.
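checking both conditions with a made-up 2-state transition matrix:

```python
import numpy as np

# a hypothetical 2-state transition matrix: each column is a distribution vector
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

assert np.all(P >= 0)                   # entries positive or zero
assert np.allclose(P.sum(axis=0), 1.0)  # every column sums to 1

# multiplying a distribution vector by P yields another distribution vector
x = np.array([0.5, 0.5])
y = P @ x
assert np.all(y >= 0) and np.isclose(y.sum(), 1.0)
```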

6
Q

scaling

A

a linear transformation that changes the length of a vector. For any positive constant k, the matrix

[k 0
0 k]

defines a scaling by k. When k > 1, it’s a dilation. When 0 < k < 1, it’s a contraction.
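a small numpy check that [k 0; 0 k] scales lengths by k (the vector is made up):

```python
import numpy as np

def scaling(k):
    # [k 0; 0 k] scales every vector by k
    return k * np.eye(2)

v = np.array([3.0, 4.0])  # length 5
assert np.isclose(np.linalg.norm(scaling(2.0) @ v), 10.0)  # dilation
assert np.isclose(np.linalg.norm(scaling(0.5) @ v), 2.5)   # contraction
```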

7
Q

orthogonal projection

definition

A

the shadow that the vector casts on some line L running through the origin if a light were shined perpendicular to that line, denoted by projL(x).

8
Q

orthogonal projection

components

A

the vector x is the sum of that shadow and a component perpendicular to L that completes the triangle

x = projL(x) + perpendicular(x)

9
Q

orthogonal projection
equation and
transformation matrix

A

if w is a nonzero vector parallel to L, then
projL(x) = [(x * w)/(w * w)]w

if u is a unit vector in R^2 parallel to L, then
projL(x) = (x * u) * u.

The transformation T(x) = projL(x) is linear with matrix 
P = 1/(w1^2 + w2^2) *
[w1^2  w1w2
w1w2  w2^2] =
[u1^2 u1u2
u1u2 u2^2]
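verifying that the formula and the matrix agree, with a made-up direction w = [3, 4]:

```python
import numpy as np

# hypothetical direction vector for the line L
w = np.array([3.0, 4.0])
u = w / np.linalg.norm(w)          # unit vector along L

# matrix form: P = 1/(w1^2 + w2^2) * [[w1^2, w1w2], [w1w2, w2^2]]
P = np.outer(w, w) / (w @ w)

x = np.array([1.0, 2.0])
# formula form: projL(x) = [(x . w)/(w . w)] w = (x . u) u
proj = ((x @ w) / (w @ w)) * w
assert np.allclose(P @ x, proj)
assert np.allclose(proj, (x @ u) * u)
# projecting twice changes nothing: P^2 = P
assert np.allclose(P @ P, P)
```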
10
Q

reflection

A

denoted by refL(x), is the reflection about some line L running through the origin.

x = (a component parallel to L) + (a perpendicular component connecting that parallel component to x)
so
refL(x) = (that parallel component) - (that perpendicular component)

matrix is
[a b
b -a] where a^2 + b^2 = 1. Any matrix of this form represents a reflection about some line.

11
Q

formula relating reflections with orthogonal projections

A

refL(x) = 2projL(x) - x = 2(x * u)u - x
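a quick check of refL(x) = 2(x * u)u - x, reflecting about the (made-up) line y = x:

```python
import numpy as np

# unit vector along the 45-degree line y = x
u = np.array([1.0, 1.0]) / np.sqrt(2)
x = np.array([2.0, 0.0])

refl = 2 * (x @ u) * u - x          # refL(x) = 2 projL(x) - x
# reflecting [2, 0] about y = x swaps the coordinates
assert np.allclose(refl, [0.0, 2.0])
```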

12
Q

orthogonal projections and reflections in space

A

if V is a plane through the origin, L is the line through the origin perpendicular to V, and u is a unit vector along L, then
projV(x) = x - projL(x) = x - (x * u) * u
refL(x) = projL(x) - projV(x) = 2projL(x) - x = 2(x * u) * u - x
refV(x) = projV(x) - projL(x) = -refL(x) = x - 2(x * u) * u
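all three formulas checked at once, using the (made-up) example where V is the xy-plane and L is the z-axis:

```python
import numpy as np

u = np.array([0.0, 0.0, 1.0])   # unit vector along L, normal to the xy-plane
x = np.array([1.0, 2.0, 3.0])

proj_L = (x @ u) * u             # component along the line L
proj_V = x - proj_L              # projV(x) = x - projL(x)
ref_V = x - 2 * (x @ u) * u      # refV(x) = x - 2(x . u)u

assert np.allclose(proj_L, [0, 0, 3])
assert np.allclose(proj_V, [1, 2, 0])
assert np.allclose(ref_V, [1, 2, -3])
assert np.allclose(proj_V - proj_L, ref_V)   # refV(x) = projV(x) - projL(x)
```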

13
Q

rotation

A

transformation that rotates a vector x through a fixed angle Z in the counterclockwise direction.

The matrix in R^2 through an angle Z is
[ cosZ -sinZ
sinZ cosZ ]

it’s of the form
[a -b
b a] where a^2 + b^2 = 1.

Any matrix of this form represents a rotation.
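a numpy check with a made-up angle: a quarter turn sends [1, 0] to [0, 1], and since a^2 + b^2 = cos^2 Z + sin^2 Z = 1, lengths are preserved:

```python
import numpy as np

def rotation(theta):
    # [[cosZ, -sinZ], [sinZ, cosZ]]
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

R = rotation(np.pi / 2)                      # quarter turn counterclockwise
assert np.allclose(R @ [1.0, 0.0], [0.0, 1.0])

v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))  # length preserved
```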

14
Q

rotations combined with a scaling

A

if r and Z are the polar coordinates of the vector [a b], then
[a -b
b a]
represents a rotation through Z combined with a scaling by r.

[a
b] =
[r cos Z
r sinZ ].

r = sqrt(a^2 + b^2)

so Z = arccos[a / sqrt(a^2 + b^2)] and so forth
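a made-up example with (a, b) = (1, 1): rotation by pi/4 combined with scaling by sqrt(2). (here arctan2 is used to recover Z instead of the arccos formula above; both give the same angle in this quadrant.)

```python
import numpy as np

a, b = 1.0, 1.0
M = np.array([[a, -b], [b, a]])

r = np.hypot(a, b)        # r = sqrt(a^2 + b^2)
theta = np.arctan2(b, a)  # polar angle of [a, b]
assert np.isclose(r, np.sqrt(2))
assert np.isclose(theta, np.pi / 4)

# M scales every length by r
v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(M @ v), r * np.linalg.norm(v))
```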

15
Q

shear

A

transformation that slides points parallel to one axis by an amount proportional to their coordinate along the other axis. A horizontal shear moves points horizontally in proportion to how high up they are; a vertical shear moves points vertically in proportion to how far along the horizontal axis they are.

the matrix of a horizontal shear is of the form
[1 k
0 1]
and vertical,
[1 0
k 1], 

where k is an arbitrary constant giving the strength of the shear - each point slides by k times its coordinate along the other axis.
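a small check of the horizontal shear with a made-up k: points on the x-axis stay fixed, and a point at height 2 slides sideways by 2k:

```python
import numpy as np

k = 0.5                               # hypothetical shear strength
H = np.array([[1.0, k],
              [0.0, 1.0]])            # horizontal shear

assert np.allclose(H @ [1.0, 0.0], [1.0, 0.0])        # x-axis is fixed
assert np.allclose(H @ [0.0, 2.0], [k * 2.0, 2.0])    # slides by k * height
```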

16
Q

matrix multiplication by column

A

BA = B[v1 v2 v3 … vm], where each vi is a column of A

= [Bv1 Bv2 … Bvm]
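checking the column rule against numpy's own matrix product (matrices made up):

```python
import numpy as np

B = np.array([[1.0, 2.0], [3.0, 4.0]])
A = np.array([[5.0, 6.0], [7.0, 8.0]])

# jth column of BA is B times the jth column of A
by_columns = np.column_stack([B @ A[:, j] for j in range(A.shape[1])])
assert np.allclose(by_columns, B @ A)
```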

17
Q

matrix multiplication by the entries of the matrices

A

the ijth entry of BA is the dot product of the ith row of B with the jth column of A.

18
Q

five properties of matrix multiplication

A
  • it’s associative
  • it’s usually noncommutative
  • A(C + D) = AC + AD and (A + B)C = AC + BC
  • (kA)B = A(kB) = k(AB)
  • A(identity matrix) = (identity matrix)A = A
19
Q

invertibility and rref

A

an n x n matrix A is invertible if (and only if) rref(A) = I sub n

or, equivalently, rank(A) = n

A^-1A = I sub n = AA^-1
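a numpy check of the rank criterion and of A^-1A = I (matrices made up; numpy has no rref, so the equivalent rank test is used):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])   # rank 2, so invertible
B = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1, so not invertible

assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(B) == 1

Ainv = np.linalg.inv(A)
assert np.allclose(Ainv @ A, np.eye(2))   # A^-1 A = I sub n
assert np.allclose(A @ Ainv, np.eye(2))   # = A A^-1
```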

20
Q

invertibility and linear systems

A

If a square matrix A is invertible, then the system Ax = b has the unique solution x = A^-1b. If A is noninvertible then the system Ax = b has infinitely many solutions or none.

when b = 0, the system always has x = 0 as a solution; if A is invertible it’s the only one, and otherwise there are infinitely many.

21
Q

the inverse of a product of matrices

A

(BA)^-1 = A^-1B^-1 (note that the order reverses)
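checking the reversed order with made-up matrices:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

# (BA)^-1 = A^-1 B^-1
lhs = np.linalg.inv(B @ A)
rhs = np.linalg.inv(A) @ np.linalg.inv(B)
assert np.allclose(lhs, rhs)
```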

22
Q

what follows from BA = I sub n

A

A and B are both invertible
A^-1 = B and B^-1 = A
AB = I sub n

23
Q

inverse and determinant of a 2 x 2 matrix

A

A =
[a b
c d]

is invertible iff
ad - bc != 0

if it is invertible,
A^-1 = 1/(ad - bc) *
[d -b
-c a]
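checking the formula against numpy's inverse for made-up entries:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b], [c, d]])

det = a * d - b * c
assert det != 0                                    # invertible

# A^-1 = 1/(ad - bc) * [[d, -b], [-c, a]]
Ainv = (1 / det) * np.array([[d, -b], [-c, a]])
assert np.allclose(Ainv, np.linalg.inv(A))
assert np.allclose(A @ Ainv, np.eye(2))
```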

24
Q

determinant of a 2 x 2 matrix

A

A =
[a b
c d]

ad - bc is the determinant of A, written det(A)

also,
|det A| = (length of [a c]) * (length of [b d]) * |sin Z|,
where Z is the angle from the column [a c] to the column [b d].

|det A| is the area of the parallelogram spanned by those two columns of A.

If those two columns are parallel, det A = 0
If 0 < Z < pi, det A > 0.
If -pi < Z < 0, det A < 0.
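checking the area interpretation numerically with made-up columns:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
v1, v2 = A[:, 0], A[:, 1]

det = np.linalg.det(A)
# |det A| = |v1| * |v2| * |sin Z| = area of the parallelogram they span
cos_Z = (v1 @ v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
area = np.linalg.norm(v1) * np.linalg.norm(v2) * np.sqrt(1 - cos_Z**2)
assert np.isclose(abs(det), area)

# parallel columns give determinant 0
assert np.isclose(np.linalg.det(np.array([[1.0, 2.0], [2.0, 4.0]])), 0.0)
```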