NECESSARY Flashcards

(67 cards)

1
Q

U⊆V subspace of V

A

U nonempty and cu+v ∈U whenever u,v ∈U and c ∈F

2
Q

span(v1,…,vr) for v1,…,vr∈V

A

set of all linear combinations of v1,…,vr
:= {c1v1+…+crvr|c1,…,cr∈F} ⊆V

3
Q

spanning set of V

A

v1,…,vr∈V such that every vector in V can be written as a linear combination of v1,…,vr, i.e.
span(v1,…,vr) = V

4
Q

span(U ∪ W) when U,W subspaces of V

A

= U + W
is subspace of V

5
Q

DIRECT SUM

A

the sum U+W, written U⊕W, whenever U,W are subspaces of V with U ∩ W = {0}
i.e. each vector in U+W can be expressed as a unique sum of a vector in U and a vector in W

6
Q

BASIS of vector space V

A

v1,…,vr∈V
if linearly independent and span(v1,…,vr)=V

7
Q

MATRIX BASIS

A

Let A = [a1 … an] ∈ Mn(F) be invertible then a1,…,an is basis for Fn

8
Q

DIMENSION MATRIX

A

Let A = [a1 … an] ∈ Mmxn(F) and β=a1,…,an in Fm.
then dim span β = dim col A = rank A

9
Q

dim(U ∩ W) + dim (U + W) =

A

dim U + dim W

10
Q

β-BASIS REPRESENTATION FUNCTION

A
  • the function [.]_β :V->Fn defined by [u]_β = [c1…cn]^T
    where β = v1,…,vn is basis for finite-dim. V and u∈V is any vector written as (unique) linear comb. u= c1v1+…+cnvn
  • c1,…,cn are “coordinates of u” w.r.t. basis β.
  • [u]_β is “β-coordinate vector of u”
11
Q

LINEAR TRANSFORMATION INDUCED BY A

A

T_A : Fn -> Fm defined by T_A(x) = Ax, where A∈Mmxn(F)

12
Q

T∈L(V,W) is one-to-one ⟺

A

ker(T) = {0}

13
Q

KerT
T:V->W

A

= {v∈V|T(v) = 0}

14
Q

RanT
T:V->W

A

= {w∈W|∃v ∈V : T(v)=w}

15
Q

SYLVESTER EQUATION

A

T(X) = AX+XB = C

16
Q

β-γ change of basis matrix

A

γ[I]β = [[v1]γ … [vn]γ]

β = {v1,…,vn}
γ = {w1,…,wn} bases for V

  • describes how to represent each vector in basis β as linear combination of vectors in basis γ
17
Q

inverse of γ[I]β

A

β[I]γ = [[w1]β … [wn]β]

18
Q

cor.2.4.11,12

A
  1. if a1,…,an is basis of Fn then A=[a1…an] ∈Mn(F) is invertible
  2. A∈Mn(F) invertible ⟺ rankA=n
19
Q

A, B∈Mn(F) SIMILAR

A

if there is invertible S∈Mn(F) such that A = SBS^(-1)

20
Q

A,B∈Mn(F) similar ⟺

A

there is n-dim. V, with bases β and γ and linear operator T ∈L(V) such that A=β[T]β and B= γ[T]γ

21
Q

A,B∈Mn(F) similar ⟹

A
  1. A-λI is similar to B-λI for every λ∈F
    (if there is λ∈F such that (A-λI) is similar to (B-λI) then A similar to B)
  2. TrA=TrB and detA=detB
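These invariants are easy to check numerically. A minimal numpy sketch; the matrices B and S below are arbitrary illustrative choices, with A = SBS^(-1) similar to B by construction:

```python
import numpy as np

# Hypothetical example: B arbitrary, S invertible (det = 1)
B = np.array([[2.0, 1.0], [0.0, 3.0]])
S = np.array([[1.0, 1.0], [1.0, 2.0]])

# A = S B S^(-1) is similar to B by construction
A = S @ B @ np.linalg.inv(S)

# Similar matrices share trace and determinant
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))

# A - lam*I is similar to B - lam*I for any lam (the same S works)
lam = 5.0
I = np.eye(2)
assert np.allclose(A - lam * I, S @ (B - lam * I) @ np.linalg.inv(S))
```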
22
Q

EQUIVALENCE RELATION

A

a relation between pairs of matrices that is:
  1. reflexive
    (A similar to A)
  2. symmetric
    (A similar to B ⟹ B similar to A)
  3. transitive
    (A similar to B and B similar to C ⟹ A similar to C)
23
Q

DIMENSION THEOREM FOR LINEAR TRANSFORMATIONS

A

T∈L(V,W), V finite dim.
dim ker T + dim ran T = dim V

24
Q

DIMENSION THEOREM FOR MATRICES
A ∈Mmxn(F)

A
  • dim null A + dim col A = n
  • if m=n then nullA={0} ⟺colA=Fn
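A minimal numpy sketch of the matrix dimension theorem, using an illustrative 3×4 matrix (not from the text); the rows of Vh beyond the rank give a basis of null A:

```python
import numpy as np

# Hypothetical 3x4 matrix (row 3 = row 1 + row 2, so rank < 3)
A = np.array([[1.0, 2.0, 3.0, 0.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 4.0, 1.0]])
m, n = A.shape

rank = np.linalg.matrix_rank(A)     # dim col A
# A basis of null A: the rows of Vh beyond the rank (from the SVD of A)
_, _, Vh = np.linalg.svd(A)
null_basis = Vh[rank:]
nullity = null_basis.shape[0]       # dim null A

assert np.allclose(A @ null_basis.T, 0.0)  # these vectors really lie in null A
assert nullity + rank == n                  # dim null A + dim col A = n
```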
25
dim COL (A) = dim ROW(A) =
number of leading 1s in the reduced row echelon form of A
26
NULL (A)
ker(A)
27
ORTHOGONAL u,v∈V
⟨u,v⟩ = 0; written u⊥v
27
INNER PRODUCT on V
function ⟨·,·⟩ : V×V -> F satisfying ∀u,v,w∈V and ∀c∈F:
- ⟨v,v⟩ real ≥ 0
- ⟨v,v⟩ = 0 ⟺ v=0
- ⟨u+v,w⟩ = ⟨u,w⟩ + ⟨v,w⟩
- ⟨cu,v⟩ = c⟨u,v⟩
- ⟨u,v⟩ = conj(⟨v,u⟩)
28
ORTHOGONAL SUBSETS A,B ⊆V
u⊥v for every u∈A, v∈B
29
ORTHOGONAL PROPERTIES
- u⊥v ⟺ v⊥u
- 0⊥u, ∀u∈V
- v⊥u, ∀u∈V ⟹ v=0
30
⟨u,v⟩ = ⟨u,w⟩, ∀u∈V ⟹
v=w
31
NORM DERIVED FROM INNER PRODUCT
||v|| = √⟨v,v⟩; referred to as the norm on V derived from the inner product
32
DERIVED NORM PROPERTIES
- ||u|| real ≥ 0
- ||u|| = 0 ⟺ u=0
- ||cu|| = |c| ||u||
- ⟨u,v⟩ = 0 ⟹ ||u+v||^2 = ||u||^2 + ||v||^2 (Pythagoras)
- ||u+v||^2 + ||u−v||^2 = 2||u||^2 + 2||v||^2 (parallelogram identity)
33
CAUCHY SCHWARZ INEQUALITY
|⟨u,v⟩| ≤ ||u|| ||v||, with equality ⟺ u,v linearly dependent, i.e. one is a scalar multiple of the other
34
TRIANGLE INEQUALITY FOR DERIVED NORM
||u+v|| ≤ ||u|| + ||v||, with equality ⟺ one is a real non-negative scalar multiple of the other
35
POLARISATION IDENTITIES (4.5.24)
V an F-inner-product space and u,v∈V
- if F=R, then ⟨u,v⟩ = 1/4 (||u+v||^2 − ||u−v||^2)
- if F=C, then ⟨u,v⟩ = 1/4 (||u+v||^2 − ||u−v||^2 + i||u+iv||^2 − i||u−iv||^2)
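Both identities can be verified numerically. A minimal numpy sketch, assuming the standard inner product on Fn with the first-slot-linear convention ⟨cu,v⟩ = c⟨u,v⟩ from the inner-product card:

```python
import numpy as np

rng = np.random.default_rng(0)
norm = np.linalg.norm

# Real case: <u,v> = u . v
u, v = rng.standard_normal(4), rng.standard_normal(4)
assert np.isclose(u @ v, 0.25 * (norm(u + v)**2 - norm(u - v)**2))

# Complex case: <u,v> = sum_k u_k * conj(v_k), linear in the first slot
u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
ip = np.sum(u * np.conj(v))
rhs = 0.25 * (norm(u + v)**2 - norm(u - v)**2
              + 1j * norm(u + 1j * v)**2 - 1j * norm(u - 1j * v)**2)
assert np.isclose(ip, rhs)
```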
36
NORM on V
function ||.|| : V -> [0, ∞) with the following properties for ∀u,v∈V and ∀c∈F:
- ||u|| real ≥ 0
- ||u|| = 0 ⟺ u=0
- ||cu|| = |c| ||u||
- ||u+v|| ≤ ||u|| + ||v||
37
NORMALISATION
u/||u|| (for u≠0; a unit vector in the direction of u)
38
ORTHONORMAL VECTORS
u1,u2,… (finite or infinite) such that ⟨ui,uj⟩ = δij for all i,j
39
δij
δij = 1 if i=j, δij = 0 if i≠j
40
ORTHONORMAL SYSTEM
orthonormal sequence of vectors u1,…,un
⟹ ||Σ ai ui||^2 = Σ |ai|^2 for all ai∈F
⟹ u1,…,un linearly independent
41
ORTHONORMAL BASIS
basis for a finite-dimensional inner product space that is orthonormal system
42
GRAM-SCHMIDT THEOREM
V inner product space, v1,…,vn∈V linearly independent.
There is an orthonormal system u1,…,un such that span{v1,…,vk} = span{u1,…,uk}, k=1,…,n
43
GRAM SCHMIDT PROCESS
u1 = v1
uk = vk − Σ_{j=1}^{k−1} (⟨vk,uj⟩/||uj||^2) uj, k=2,…,n
ek = uk/||uk||
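The process translates directly into code. A minimal numpy sketch with illustrative example vectors (not from the text); here each uk is normalised as soon as it is produced, which is equivalent to the formula on the card:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise linearly independent vectors v1,...,vn."""
    es = []
    for v in vectors:
        u = v.astype(float).copy()
        for e in es:
            # e is already unit length, so <v,e>/||e||^2 = <v,e>
            u -= (v @ e) * e
        es.append(u / np.linalg.norm(u))
    return es

# Hypothetical example vectors
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
v3 = np.array([0.0, 1.0, 1.0])
e1, e2, e3 = gram_schmidt([v1, v2, v3])

# Orthonormal system: <ei,ej> = delta_ij
E = np.column_stack([e1, e2, e3])
assert np.allclose(E.T @ E, np.eye(3))
```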
44
LINEAR FUNCTIONAL
a linear transformation φ:V->F, where V is an F-vector space
45
RIESZ REPRESENTATION THEOREM
let V be a finite-dim. F-inner product space and φ:V->F a linear functional
1. there is a unique w∈V such that φ(v) = ⟨v,w⟩, ∀v∈V
2. let u1,…,un be an orthonormal basis of V; the vector w in (1) is w = conj(φ(u1))u1 + … + conj(φ(un))un
46
RIESZ VECTOR FOR LINEAR FUNCTIONAL φ
the unique w∈V with φ(v) = ⟨v,w⟩, ∀v∈V
47
HERMITIAN A
a square matrix A with A = A*
48
ORTHOGONAL COMPLEMENT
for a nonempty subset U of an inner product space V,
U^⊥ = {v∈V : ⟨v,u⟩ = 0, ∀u∈U}
- if U=∅, then U^⊥ := V
49
s MINIMUM NORM SOLUTION of Ax=y
s solves As = y and ||s||_2 ≤ ||u||_2 whenever Au = y, for A∈Mmxn(F) and y∈colA
50
ORTHOGONAL PROJECTION of v onto u
the linear operator P_U∈L(V) defined by (P_U)v = Σ_{i=1}^{r} ⟨v,ui⟩ui, ∀v∈V, where U is a finite-dim. subspace of V with orthonormal basis u1,…,ur
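A minimal numpy sketch of the projection formula, using an illustrative orthonormal basis of a 2-dimensional subspace of R^3:

```python
import numpy as np

# Orthonormal basis of a hypothetical subspace U of R^3 (the xy-plane)
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

def project(v):
    # P_U v = sum_i <v, u_i> u_i  (u1,...,ur orthonormal)
    return (v @ u1) * u1 + (v @ u2) * u2

v = np.array([3.0, -2.0, 5.0])
Pv = project(v)
assert np.allclose(Pv, [3.0, -2.0, 0.0])

# The residual v - P_U v is orthogonal to U
assert np.isclose((v - Pv) @ u1, 0.0)
assert np.isclose((v - Pv) @ u2, 0.0)
```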
51
BEST APPROXIMATION THEOREM
let U be a finite-dim. subspace of an inner product space V and let P_U be the orthogonal projection onto U.
||v − (P_U)v|| ≤ ||v − u||, ∀v∈V, ∀u∈U,
with equality ⟺ u = (P_U)v
52
NORMAL EQUATIONS
U finite-dim. subspace of inner product space V, span{u1,…,un} = U
- the projection of v onto U is P_U v = Σ cj uj, where [c1 … cn]^T is a solution of the normal equations
[⟨uj,ui⟩]_{i,j=1}^{n} [c1 … cn]^T = [⟨v,u1⟩ … ⟨v,un⟩]^T (*)
- the system (*) is consistent
- if u1,…,un are linearly independent, then (*) has a unique solution
53
GRAM MATRIX of u1,..,un
G(u1,…,un) = [⟨uj,ui⟩]_{i,j=1}^{n}, the n×n matrix whose (i,j) entry is ⟨uj,ui⟩, for vectors u1,…,un in an inner product space
53
best approximation of v w.r.t. U
the projection of v onto U, i.e. P_U v
54
PROPERTIES OF GRAM MATRIX
- hermitian - positive semi-definite
55
LEAST SQUARE SOLUTION OF AN INCONSISTENT LINEAR SYSTEM
- if A∈Mmxn(F) and y∈Fm, then x0∈Fn satisfies min_{x∈Fn} ||y−Ax||_2 = ||y−Ax0||_2 ⟺ A*Ax0 = A*y
- the system A*Ax = A*y is always consistent; it has a unique solution if rank A = n
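A minimal numpy sketch, using an illustrative inconsistent 3×2 system: the normal equations A*Ax0 = A*y are solved directly and compared against numpy's built-in least-squares routine:

```python
import numpy as np

# Hypothetical inconsistent system: 3 equations, 2 unknowns
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, 0.0])   # y is not in col A

# Solve the normal equations A*Ax0 = A*y (A* = A^T over R);
# the solution is unique since rank A = n = 2
x0 = np.linalg.solve(A.T @ A, A.T @ y)

# Agrees with numpy's least-squares routine
x_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(x0, x_lstsq)
```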
56
A∈Mn(F) POSITIVE SEMI-DEFINITE
A hermitian and ⟨Ax,x⟩ ≥ 0 for all x∈Fn
57
A∈Mn(F) POSITIVE DEFINITE
A hermitian and ⟨Ax,x⟩ > 0 for all nonzero x∈Fn
58
A∈Mn(F) NEGATIVE SEMI-DEFINITE
if -A is positive semidefinite
59
A∈Mn(F) NEGATIVE DEFINITE
if -A is positive definite
60
SYMMETRIC PROPERTIES
1. has real eigenvalues
2. eigenvectors corresponding to distinct eigenvalues are orthogonal
61
SINGULAR VALUES
δ1,…,δr where δ1 ≥ … ≥ δr > 0 and δi^2 = λi, with λi a nonzero eigenvalue of A*A (equivalently of AA*)
62
SINGULAR VALUE DECOMPOSITION
- define Σr = diag(δ1,…,δr) ∈ Mr(R), the diagonal matrix with the singular values δ1 ≥ … ≥ δr on its diagonal, where r = rank A
- then there are unitary matrices V∈Mm(F) and W∈Mn(F) such that A = VΣW*, where Σ = [[Σr 0][0 0]] ∈ Mmxn(R) is the same size as A
- if m=n, then V,W∈Mn(F) and Σ = Σr ⊕ 0_{n−r}
- columns of V are "left singular vectors of A"
- columns of W are "right singular vectors of A"
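A minimal numpy sketch with an illustrative 3×2 matrix; note numpy returns A = U·diag(s)·Vh with singular values in descending order, so Vh plays the role of W*:

```python
import numpy as np

# Hypothetical 3x2 matrix with singular values 3 and 2
A = np.array([[3.0, 0.0], [0.0, 2.0], [0.0, 0.0]])

U, s, Vh = np.linalg.svd(A, full_matrices=True)
assert np.allclose(s, [3.0, 2.0])       # singular values, descending

# Rebuild Sigma with the same shape as A, Sigma_r in the top-left block
Sigma = np.zeros_like(A)
Sigma[:2, :2] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vh)   # A = V Sigma W* in the card's notation

# U and Vh are unitary (real case: orthogonal)
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vh @ Vh.T, np.eye(2))
```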
63
MULTIPLICITY OF δi
= the multiplicity of δ^2 as an eigenvalue of A*A
- if δ = 0, then multiplicity := min{m,n} − r
- if a singular value has multiplicity 1, it is called "simple"
- if every singular value of A is simple, they are "distinct"
64
A IDEMPOTENT
A^2=A
65
A ORTHOGONAL PROJECTION
A is idempotent and symmetric
- ran A = col A
- ker(A) = ran(A)^⊥