NOTES Flashcards

(94 cards)

1
Q

VECTOR SPACE

A
  • ∃0∈V : 0+u=u, ∀u∈V
  • u+v=v+u, ∀u,v∈V
  • u+(v+w) = (u+v)+w, ∀u,v,w∈V
  • ∀u∈V ∃z∈V : u+z=0
  • 1.u=u, ∀u∈V
  • a.(b.u) = (ab).u, ∀u∈V, ∀a,b∈F
  • a.(u+v) = (a.u)+(a.v), ∀u,v∈V, ∀a∈F
  • (a+b).u = (a.u)+(b.u), ∀u∈V, ∀a,b∈F
2
Q

V “zero vector space”

A

V={0}

3
Q

U is subspace of V (def.)

A

if U is subset of V that is a vector space with same vector addition and scalar multiplication as in V

4
Q

U⊆V subspace of V

A

U nonempty (0∈U) and cu+v ∈U whenever u,v ∈U and c ∈F

5
Q

LINEAR COMBINATION of v1,…,vr ∈V

A

the expression c1v1 + … + crvr
for given c1,…,cr ∈F

6
Q

span(v1,..,vr∈V)

A

set of all linear combinations of v1,…,vr
:= {c1v1+…+crvr|c1,…,cr∈F} ⊆V

7
Q

v∈span(v1,…,vn)

A

v = c1v1+…+cnvn for some scalars c1,…,cn∈F
8
Q

span(∅)

A

= {0}

9
Q

PROPERTIES OF SPAN (thm.1.4.9,10)

A

let U,W ⊆ V
1. span U subspace of V
2. U ⊆ span U
3. U = span U ⟺ U is subspace of V
4. span(spanU) = span U
5. U ⊆W ⟹ span U ⊆ span W

10
Q

spanning set of V

A

v1,…,vr∈V such that every vector in V can be written as a linear comb. of v1,…,vr, i.e.
span(v1,…,vr) = V

11
Q

span(U ∪ W) when U,W subspaces of V

A

= U + W
is subspace of V

12
Q

DIRECT SUM

A

the sum U+W is direct (written U⊕W) whenever U,W subspaces of V and U ∩ W = {0}
equivalently: each vector in U+W can be expressed as a unique sum of a vector in U and a vector in W

13
Q

v1,…,vr LINEARLY DEPENDENT

A

if there are scalars c1,…,cr ∈F not all zero such that
c1v1+…+crvr=0

14
Q

v1,…,vr LINEARLY INDEPENDENT

A

not linearly dependent, i.e.
c1v1+…+crvr=0 ⟹ c1=…=cr=0

15
Q

Thm.1.6.11

A

v1,…,vr linearly independent
then a1v1+…+arvr = b1v1+…+brvr
⟺ ai=bi each i=1,…,r

16
Q

BASIS of vector space V

A

v1,…,vr∈V
if linearly independent and span(v1,…,vr)=V

17
Q

MATRIX BASIS

A

Let A = [a1 … an] ∈ Mn(F) be invertible. Then a1,…,an is a basis for Fn.

18
Q

REPLACEMENT LEMMA

A
  • suppose β=u1,…,ur spans the non-zero vector space V
  • let nonzero v∈V be written v = Σciui (sum from i=1 to r)
    THEN
  • cj ≠ 0 for some j∈{1,…,r}
  • cj ≠ 0 ⟹ v,u1,…,u(^)j,…,ur (*) spans V (u(^)j means uj is omitted)
  • β basis for V and cj ≠ 0 ⟹ (*) is a basis for V
  • r>=2, β basis for V and v∉span{u1,…,uk} ⟹ there is j∈{k+1,k+2,…,r} such that v,u1,…,uk,uk+1,…,u(^)j,…,ur is a basis for V
19
Q

DIMENSION V

A
  • {v1,…,vn} is basis of V
    ⟹ V dimension is n
  • V={0} ⟹ dimension 0
20
Q

FINITE DIMENSIONAL

A
  • V has dimension n for some non-negative integer n
    ⟹ the dimension is denoted dim V
21
Q

INFINITE DIMENSIONAL

A
  • V not finite dimensional
22
Q

DIMENSION MATRIX

A

Let A = [a1 … an] ∈ Mmxn(F) and β=a1,…,an in Fm.
then dim span β = dim col A = rank A

23
Q

dim(U ∩ W) + dim (U + W) =

A

dim U + dim W

24
Q

β-BASIS REPRESENTATION FUNCTION

A
  • the function [.]_β :V->Fn defined by [u]_β = [c1…cn]^T
    where β = v1,…,vn is basis for finite-dim. V and u∈V is any vector written as (unique) linear comb. u= c1v1+…+cnvn
  • c1,…,cn are “coordinates of u” w.r.t. basis β.
  • [u]_β is “β-coordinate vector of u”
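A small numerical sketch (not part of the deck; numpy, with a made-up basis of R^3): stacking the basis vectors as columns of a matrix B, the β-coordinate vector [u]_β is the solution of Bc = u.

```python
import numpy as np

# basis beta = v1, v2, v3 of R^3, stacked as columns of B (example data)
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
u = np.array([2.0, 3.0, 1.0])

# [u]_beta solves B @ c = u, since u = c1*v1 + c2*v2 + c3*v3
c = np.linalg.solve(B, u)
print(c)                            # beta-coordinate vector of u
print(np.allclose(B @ c, u))        # reconstructs u -> True
```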
25
LINEAR TRANSFORMATION T:V->W
T(cu+v) = cT(u) +T(v) ∀c∈F,∀u,v∈V
26
LINEAR OPERATOR
V=W
27
SET OF LINEAR TRANSFORMATIONS/OPERATORS
L(V,W)/L(V)
28
LINEAR TRANSFORMATION INDUCED BY A
T_A : Fn -> Fm defined by T_A(x) = Ax, for A∈Mmxn(F)
29
T∈L(V,W) is one-to-one ⟺
ker(T) = {0}
30
LINEAR TRANSFORMATION PROPERTIES
- T(cv) = cT(v)
- T(0) = 0
- T(-v) = -T(v)
- T(a1v1+...+anvn) = a1T(v1)+...+anT(vn)
31
KerT T:V->W
= {v∈V|T(v) = 0}
32
RanT T:V->W
= {w∈W|∃v ∈V : T(v)=w}
33
SYLVESTER EQUATION
AX + XB = C, i.e. the equation T(X) = C for the operator T(X) = AX + XB
34
β-γ change of basis matrix
γ[I]β = [[v1]γ ... [vn]γ]
β = {v1,...,vn}, γ = {w1,...,wn} bases for V
* describes how to represent each vector in basis β as a linear combination of vectors in basis γ
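An illustrative numpy sketch (made-up bases of R^2, not from the deck): the j-th column of γ[I]β is [vj]γ, so the whole matrix can be obtained by solving a linear system against the γ basis.

```python
import numpy as np

# two made-up bases of R^2, stored as columns
beta  = np.array([[1.0, 1.0],
                  [0.0, 1.0]])      # beta = v1, v2
gamma = np.array([[2.0, 0.0],
                  [0.0, 3.0]])      # gamma = w1, w2

# j-th column of gamma[I]beta is [vj]_gamma, i.e. the solution of gamma @ x = vj
change = np.linalg.solve(gamma, beta)

# consistency check: for any u, [u]_gamma = gamma[I]beta @ [u]_beta
u = np.array([3.0, 2.0])
u_beta  = np.linalg.solve(beta, u)
u_gamma = np.linalg.solve(gamma, u)
print(np.allclose(change @ u_beta, u_gamma))   # True
```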
35
inverse of γ[I]β
β[I]γ = [[w1]β ... [wn]β]
36
cor.2.4.11,12
1. if a1,...,an is a basis of Fn then A = [a1 ... an] ∈ Mn(F) is invertible
2. A∈Mn(F) invertible ⟺ rank A = n
37
A, B∈Mn(F) SIMILAR
if there is invertible S∈Mn(F) such that A = SBS^(-1)
38
A,B∈Mn(F) similar ⟺
there is n-dim. V, with bases β and γ and linear operator T ∈L(V) such that A=β[T]β and B= γ[T]γ
39
A,B∈Mn(F) similar ⟹
1. A-λI is similar to B-λI for every λ∈F (if there is λ∈F such that A-λI is similar to B-λI, then A is similar to B)
2. Tr A = Tr B and det A = det B
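A quick numerical check of property 2 (illustration only; random numpy matrices): trace and determinant are unchanged under A = SBS^(-1).

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
S = rng.standard_normal((4, 4))          # generically invertible
A = S @ B @ np.linalg.inv(S)             # A is similar to B

print(np.isclose(np.trace(A), np.trace(B)))             # True
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))   # True
```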
40
EQUIVALENCE RELATION
a relation between pairs of matrices (e.g. similarity) that is reflexive, symmetric and transitive
41
REFLEXIVE
A similar to A
42
SYMMETRIC
A similar to B ⟹ B similar to A
43
TRANSITIVE
A similar to B and B similar to C ⟹ A similar to C
44
DIMENSION THEOREM FOR LINEAR TRANSFORMATIONS
T∈L(V,W), V finite dim.
dim ker T + dim ran T = dim V
45
DIMENSION THEOREM FOR MATRICES A ∈Mmxn(F)
- dim null A + dim col A = n
- if m=n then null A = {0} ⟺ col A = Fn
46
dim COL (A) = dim ROW(A) =
number of leading 1s in row reduced echelon form
47
NULL (A)
ker(A)
48
INNER PRODUCT on V
function ⟨.,.⟩ : V×V -> F satisfying, ∀u,v,w∈V and ∀c∈F:
- ⟨v,v⟩ real, >= 0
- ⟨v,v⟩ = 0 ⟺ v=0
- ⟨u+v,w⟩ = ⟨u,w⟩ + ⟨v,w⟩
- ⟨cu,v⟩ = c⟨u,v⟩
- ⟨u,v⟩ = conj(⟨v,u⟩) (complex conjugate)
49
INNER PRODUCT SPACE
vector space V endowed with an inner product
50
ORTHOGONAL u,v∈V
⟨u,v⟩ = 0; written u⊥v
51
ORTHOGONAL SUBSETS A,B ⊆V
u⊥v for every u∈A and v∈B
52
ORTHOGONAL PROPERTIES
- u⊥v ⟺ v⊥u
- 0⊥u, ∀u∈V
- v⊥u ∀u∈V ⟹ v=0
53
⟨u,v⟩ = ⟨u,w⟩, ∀u∈V ⟹
v=w
54
NORM DERIVED FROM INNER PRODUCT
||v|| = √⟨v,v⟩, referred to as the norm derived from the inner product on V
55
DERIVED NORM PROPERTIES
- ||u|| real, >= 0
- ||u|| = 0 ⟺ u=0
- ||cu|| = |c| ||u||
- ⟨u,v⟩ = 0 ⟹ ||u+v||^2 = ||u||^2 + ||v||^2
- ||u+v||^2 + ||u-v||^2 = 2||u||^2 + 2||v||^2
56
UNIT VECTOR u
||u|| = 1
57
CAUCHY SCHWARZ INEQUALITY
|⟨u,v⟩| <= ||u|| ||v||, with equality ⟺ u,v linearly dependent, i.e. one is a scalar multiple of the other
58
TRIANGLE INEQUALITY FOR DERIVED NORM
||u+v|| <= ||u|| + ||v||, with equality ⟺ one is a real non-negative scalar multiple of the other
59
POLARISATION IDENTITIES (4.5.24)
V an F-inner-product space and u,v∈V
- if F=R, then ⟨u,v⟩ = 1/4 (||u+v||^2 - ||u-v||^2)
- if F=C, then ⟨u,v⟩ = 1/4 (||u+v||^2 - ||u-v||^2 + i||u+iv||^2 - i||u-iv||^2)
60
NORM on V
function ||.|| : V -> [0, ∞) with the following properties, ∀u,v∈V and ∀c∈F:
- ||u|| real, >= 0
- ||u|| = 0 ⟺ u=0
- ||cu|| = |c| ||u||
- ||u+v|| <= ||u|| + ||v||
61
NORMED VECTOR SPACE
real or complex V with norm
62
UNIT BALL of normed space V
{v∈V: ||v||<=1}
63
UNIT VECTOR u
if ||u|| = 1
64
NORMALISATION
u/||u|| (for nonzero u)
65
ORTHONORMAL VECTORS
u1,u2,... (finite or infinite) such that ⟨ui,uj⟩ = δij for all i,j
66
δij
δij = 1 if i=j, and 0 otherwise
67
ORTHONORMAL SYSTEM
orthonormal sequence of vectors u1,...,un
⟹ ||Σaiui||^2 = Σ|ai|^2 for all ai∈F
⟹ u1,...,un linearly independent
68
ORTHONORMAL BASIS
basis for a finite-dimensional inner product space that is orthonormal system
69
GRAM-SCHMIDT THEOREM
V inner product space, v1,...,vn∈V linearly independent
there is an orthonormal system u1,...,un such that
span{v1,...,vk} = span{u1,...,uk}, k=1,...,n
70
GRAM SCHMIDT PROCESS
u1 = v1
uk = vk - Σ (⟨vk,uj⟩/||uj||^2) uj (sum from j=1 to k-1)
ek = uk/||uk||
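A minimal numpy sketch of the process above (illustrative, not from the source; assumes the standard inner product ⟨x,y⟩ = y*x on F^n and linearly independent inputs).

```python
import numpy as np

def gram_schmidt(vs):
    """vs: linearly independent vectors; returns the orthonormal system e1,...,en."""
    us, es = [], []
    for v in vs:
        # uk = vk - sum_j (<vk,uj>/||uj||^2) uj   (standard inner product <x,y> = y* x)
        u = v - sum((uj.conj() @ v) / (uj.conj() @ uj) * uj for uj in us)
        us.append(u)
        es.append(u / np.linalg.norm(u))
    return es

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
E = np.column_stack(gram_schmidt(vs))
print(np.allclose(E.conj().T @ E, np.eye(3)))   # orthonormal -> True
```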
71
LINEAR FUNCTIONAL
a linear transformation φ:V->F where V is an F-vector space
72
RIESZ REPRESENTATION THEOREM
let V be a finite-dim. F-inner product space and φ:V->F a linear functional
1. there is a unique w∈V such that φ(v) = ⟨v,w⟩, ∀v∈V
2. let u1,...,un be an orthonormal basis of V. the vector w in (1) is w = conj(φ(u1))u1 + ... + conj(φ(un))un
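An illustrative numpy check of part 2 (made-up functional on C^3 with the standard inner product ⟨x,y⟩ = y*x; not from the source): build w = Σ conj(φ(ui))ui and verify φ(v) = ⟨v,w⟩.

```python
import numpy as np

n = 3
a = np.array([1 + 2j, -1j, 0.5])             # made-up data defining phi

def phi(v):                                  # a linear functional on C^3
    return a.conj() @ v

# orthonormal basis: the standard basis of C^3
basis = [np.eye(n)[:, i] for i in range(n)]

# Riesz vector  w = sum_i conj(phi(u_i)) u_i
w = sum(np.conj(phi(ui)) * ui for ui in basis)

v = np.array([2.0, 1j, -1.0])
print(np.isclose(phi(v), w.conj() @ v))      # phi(v) = <v,w> = w* v  -> True
```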
73
RIESZ VECTOR FOR LINEAR FUNCTIONAL φ
w
74
ADJOINT T*:W->V of T:V->W
if ⟨Tv,w⟩_W = ⟨v,T*w⟩_V, ∀v∈V, ∀w∈W
75
SELF-ADJOINT T
if T=T*
76
HERMITIAN A
if square matrix A=A*
77
ORTHOGONAL COMPLEMENT
if U is a nonempty subset of inner product space V:
U^⊥ = {v∈V : ⟨v,u⟩=0, ∀u∈U}
* if U=∅, then U^⊥ = V
78
s MINIMUM NORM SOLUTION of Ax=y
if As = y and ||s||_2 <= ||u||_2 whenever Au = y, for A∈Mmxn and y∈col A
79
ORTHOGONAL PROJECTION of v onto u
the linear operator P_U∈L(V) defined by:
(P_U)v = Σ⟨v,ui⟩ui, ∀v∈V (sum from i=1 to r)
for a finite-dim. subspace U of V with orthonormal basis u1,...,ur of U
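A small numpy sketch (made-up data, standard inner product on R^3): P_U v = Σ⟨v,ui⟩ui for an orthonormal basis u1, u2 of U, and the residual v - P_U v is orthogonal to U.

```python
import numpy as np

# orthonormal basis of a 2-dimensional subspace U of R^3 (example data)
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)

def proj_U(v):
    # P_U v = <v,u1> u1 + <v,u2> u2   (standard inner product)
    return (u1 @ v) * u1 + (u2 @ v) * u2

v = np.array([1.0, 2.0, 3.0])
p = proj_U(v)
print(p)                                                             # projection of v onto U
print(np.isclose((v - p) @ u1, 0.0), np.isclose((v - p) @ u2, 0.0))  # residual ⊥ U
```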
80
BEST APPROXIMATION THEOREM
let U be a finite-dim. subspace of inner prod. space V and let P_U be the orthogonal proj. onto U.
||v-(P_U)v|| <= ||v-u||, ∀v∈V, ∀u∈U
with equality ⟺ u = (P_U)v
81
NORMAL EQUATIONS
U finite-dim. subspace of inner product space V, span{u1,...,un} = U
- the projection of v onto U is P_U v = Σcjuj, where [c1 ... cn]^T is a solution of the normal equations
  G(u1,...,un) [c1 ... cn]^T = [⟨v,u1⟩ ... ⟨v,un⟩]^T (*)
- the system (*) is consistent
- if u1,...,un linearly independent then (*) has a unique solution
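A numpy sketch of the normal equations for a non-orthonormal spanning set (made-up data, standard inner product on R^3; not from the source): solve G(u1,u2)c = [⟨v,ui⟩] and form P_U v = Σcjuj.

```python
import numpy as np

# a (non-orthonormal) spanning set of a subspace U of R^3, and a vector v
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, 1.0])
v  = np.array([1.0, 2.0, 3.0])

U = [u1, u2]
G = np.array([[uj @ ui for uj in U] for ui in U])   # Gram matrix, entries <uj,ui>
b = np.array([v @ ui for ui in U])                  # right-hand side <v,ui>

c = np.linalg.solve(G, b)                           # unique since u1,u2 lin. indep.
p = c[0] * u1 + c[1] * u2                           # P_U v

print(np.isclose((v - p) @ u1, 0.0), np.isclose((v - p) @ u2, 0.0))  # residual ⊥ U
```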
82
GRAM MATRIX of u1,..,un
G(u1,...,un) = [⟨uj,ui⟩] (the n×n matrix with (i,j) entry ⟨uj,ui⟩), where u1,...,un are vectors in an inner product space
83
GRAM DETERMINANT of u1,...,un
g(u1,...,un) = detG(u1,..,un)
84
PROPERTIES OF GRAM MATRIX
- Hermitian
- positive semi-definite
85
LEAST SQUARE SOLUTION OF AN INCONSISTENT LINEAR SYSTEM
- if A∈Mmxn(F) and y∈Fm, then x0∈Fn satisfies min(x∈Fn) ||y-Ax||_2 = ||y-Ax0||_2 ⟺ A*Ax0 = A*y
- the system A*Ax = A*y is always consistent; it has a unique solution if rank A = n
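An illustrative numpy example (random overdetermined system): solving the normal equations A*Ax0 = A*y gives the same x0 as numpy's built-in least squares routine when rank A = n.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))        # tall matrix, generically rank 3
y = rng.standard_normal(6)

# least squares solution via the normal equations A*A x0 = A*y
x0 = np.linalg.solve(A.T @ A, A.T @ y)

# same solution from numpy's least squares routine
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(x0, x_ls))           # True (rank A = n, so the solution is unique)
```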
86
A∈Mn(F) POSITIVE SEMI-DEFINITE
A Hermitian and ⟨Ax,x⟩ >= 0 for all x∈Fn
87
A∈Mn(F) POSITIVE DEFINITE
A Hermitian and ⟨Ax,x⟩ > 0 for all nonzero x∈Fn
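A standard numerical check, not stated on the cards: for a Hermitian A, positive (semi-)definiteness is equivalent to all eigenvalues being positive (non-negative). The matrix below is a made-up example of the form B*B, which is always positive semi-definite.

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A = B.T @ B                       # B^T B is always positive semi-definite

eigs = np.linalg.eigvalsh(A)      # real eigenvalues of the Hermitian matrix A
print(np.all(eigs >= -1e-12))     # positive semi-definite -> True
print(np.all(eigs > 1e-12))       # positive definite here, since rank B = 2 -> True
```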
88
A∈Mn(F) NEGATIVE SEMI-DEFINITE
if -A is positive semidefinite
89
A∈Mn(F) NEGATIVE DEFINITE
if -A is positive definite
90
SINGULAR VALUES
δ1,...,δr where δ1 >= ... >= δr > 0 and (δi)^2 = λi, where λi is a nonzero eigenvalue of A*A (equivalently of AA*)
91
SINGULAR VALUE DECOMPOSITION
- define Σr = diag(δ1,...,δr) ∈ Mr(R), the diagonal matrix with the singular values δ1 >= ... >= δr on its diagonal, where r = rank A
- then there are unitary matrices V∈Mm(F) and W∈Mn(F) such that A = VΣW*, where Σ = [[Σr 0][0 0]] ∈ Mmxn(R) (Σr in the top-left corner, zeros elsewhere) is the same size as A
- if m=n, then V,W∈Mn(F) and Σ = Σr ⊕ 0n-r
* columns of V are "left singular vectors of A"
* columns of W are "right singular vectors of A"
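A numpy illustration with a random example matrix: np.linalg.svd returns V, the singular values δ1 >= ... >= δr, and W* directly.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))

V, s, Wh = np.linalg.svd(A)            # full SVD: V (4x4), singular values, W* (3x3)
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)   # embed Sigma_r in an m x n matrix

print(np.allclose(A, V @ Sigma @ Wh))          # A = V Sigma W*  -> True
print(np.allclose(s, sorted(s, reverse=True))) # delta_1 >= ... >= delta_r
```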
92
MULTIPLICITY OF δi
= the multiplicity of δ^2 as an eigenvalue of A*A
* if δ = 0 then multiplicity := min{m,n} - r
* if a singular value has multiplicity 1 then it is called "simple"
* if every singular value of A is simple then they are "distinct"
93
A IDEMPOTENT
A^2=A
94
A ORTHOGONAL PROJECTION
A is idempotent and Hermitian (symmetric in the real case)
ran A = col A
ker(A) = (ran A)^⊥
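A closing numpy sketch (made-up orthonormal columns; not from the deck): if Q has orthonormal columns, then A = QQ* is idempotent and Hermitian, i.e. the orthogonal projection onto col Q.

```python
import numpy as np

# Q has orthonormal columns spanning a 2-dimensional subspace of R^3
Q = np.column_stack([np.array([1.0, 0.0, 0.0]),
                     np.array([0.0, 1.0, 1.0]) / np.sqrt(2)])
A = Q @ Q.T                      # the orthogonal projection onto col Q

print(np.allclose(A @ A, A))     # idempotent -> True
print(np.allclose(A, A.T))       # symmetric (Hermitian in the real case) -> True
```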