matrix final Flashcards

(70 cards)

1
Q

consistent system

A

system of linear equations that has at least 1 solution (either unique or infinitely many)

2
Q

inconsistent system

A

a system with no solution at all; e.g. an augmented row [0 0 0 | 15] (saying 0 = 15) signals inconsistency

3
Q

elementary row operations (ero)

A
  • are reversible
  • 2 matrices are row equivalent if there is a sequence of row operations that transforms one matrix into the other
  • each ero replaces a system with an equivalent system (same solution set)
4
Q

basic variable

A

a variable that corresponds to a pivot column

5
Q

free variable

A

a variable that doesn’t correspond to a pivot column

6
Q

if a linear system is consistent

A
  • consistent and independent: the system has exactly one solution
  • consistent and dependent: the system has infinitely many solutions
  • if you are not asked for a solution, just row reduce and check the echelon form: a row [0 … 0 | b] with b ≠ 0 means inconsistent; otherwise consistent
7
Q

linear combination

A

A linear combination of vectors involves expressing a given vector as a sum of scalar multiples of other vectors
-> v = c1v1 + c2v2 + … + cpvp

8
Q

how to see if two vectors (x1, x2) are a linear combination of another vector (b)

A

put x1 and x2 as the columns of a matrix and augment it with b. Row reduce: if the system is consistent (no row of the form [0 0 | nonzero]), b IS a linear combination of x1 and x2; if it is inconsistent, it is not. (Free variables affect how many ways, not whether.)
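A quick sketch of this check (numpy assumed; the vectors here are made-up examples): b is a linear combination of x1 and x2 exactly when augmenting with b does not raise the rank, i.e. the system is consistent.

```python
import numpy as np

# hypothetical example vectors; here b = 2*x1 + 1*x2
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([0.0, 1.0, 1.0])
b = np.array([2.0, 5.0, 7.0])

A = np.column_stack([x1, x2])
augmented = np.column_stack([A, b])

# consistent iff rank does not jump when b is appended
is_combo = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)
print(is_combo)  # True
```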

9
Q

Ax = b (matrix equation)

A

if A is an mxn matrix, with columns a1, …, an, and if x is a vector in Rn, then the product of A and x, denoted Ax, is the following linear combination:
x1a1 + x2a2 + x3a3 + … + xnan

10
Q

overdetermined system

A

when there are more equations than unknowns

11
Q

solution to Ax = b

A

the equation Ax = b has a solution if and only if b is a linear combination of the columns of A.

12
Q

Span {v1, v2, … vp} (Spans Rn)

A

the set of all possible linear combinations of those vectors

13
Q

subset of Rn

A

a collection of vectors within Rn

14
Q

homogeneous

A

a system of equations is called homogeneous if all the constant terms (right-hand side values) are zero.
- written as Ax = 0

15
Q

trivial solution

A

a homogeneous system always has at least 1 solution, called the trivial solution: ALL UNKNOWNS ARE ZERO
(x = 0)

16
Q

nontrivial solution

A

a homogeneous system may have infinitely many solutions, meaning THERE ARE FREE VARIABLES IN THE SYSTEM

17
Q

non homogeneous

A

a system of linear equations is non-homogeneous if it can be written as Ax = b, where b ≠ 0

18
Q

linearly dependent

A
  • vectors that are scalar multiples of each other
  • matrix with free variables (Ax = 0 has a nontrivial solution)
  • a set that contains the 0 vector
  • a set with more vectors than entries per vector (more columns than rows) is automatically linearly dependent
19
Q

linearly independent

A
  • only the trivial solution to Ax = 0
  • the vectors are not scalar multiples of each other
  • the set does not contain the 0 vector
20
Q

commute

A

if AB and BA are both defined and AB = BA, we say A and B commute

21
Q

identity matrix

A

all entries are 0 except for the main diagonal entries, which equal 1

22
Q

transpose of A (A^T)

A

if A is an mxn matrix, A^T is the nxm matrix whose columns are formed by the corresponding rows of A

23
Q

transpose theorems

A
  • (A^T)^T = A
  • (A + B)^T = A^T + B^T
  • for any scalar r, (rA)^T = rA^T
24
Q

inverse

A

an nxn matrix A is invertible if there is an nxn matrix C such that
AC = In = CA

25
Q

singular matrix

A

a matrix that is NOT invertible
26
Q

nonsingular matrix

A

a matrix that IS invertible
27
Q

process to find the inverse of a matrix

A
  • form the augmented matrix [A|In] and reduce to reduced echelon form
  • if the reduced echelon form has the In matrix on the left side, A is invertible
  • the inverse appears on the right side
  • to check, multiply the original matrix by the inverse to get In
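The steps above can be sketched in code (numpy assumed; the example matrix is made up, and partial pivoting is added for numerical safety):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Row reduce [A | In] to [In | A^-1]; assumes A is invertible."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # partial pivot: swap in the row with the largest entry in this column
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] = M[col] / M[col, col]               # scale the pivot row to a leading 1
        for row in range(n):
            if row != col:
                M[row] = M[row] - M[row, col] * M[col]  # row replacement clears the column
    return M[:, n:]                                  # In is now on the left, A^-1 on the right

A = np.array([[2.0, 1.0], [5.0, 3.0]])               # made-up invertible example (det = 1)
A_inv = inverse_gauss_jordan(A)
print(np.allclose(A @ A_inv, np.eye(2)))             # True: the check step from the card
```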
28
Q

determinant (det[a b; c d] = ad - bc)

A
  • if the determinant = 0, the matrix is not invertible
  • if the determinant ≠ 0, the matrix is invertible; for a 2x2 matrix A, the inverse is 1/det(A) times [d -b; -c a]
29
Q

subspace of Rn

A

a vector space that:
  • contains the 0 vector (must pass through the origin)
  • is closed under vector addition (the sum u + v must be in H)
  • is closed under scalar multiplication (think of graphs, must pass through (0,0))
30
Q

column space (Col(A))

A

the span of the columns of A. For a basis: row reduce A, then take the pivot columns of the ORIGINAL matrix
31
Q

null space (Nul(A))

A

the set of all solutions to Ax = 0. Row reduce A and solve in terms of the free variables; the vectors you get span NulA
32
Q

basis

A

smallest collection of vectors that are linearly independent and span the space
33
Q

rank (rankA)

A

the dimension of ColA (the number of pivot columns of A)
34
Q

rank nullity theorem

A

if a matrix A has n columns, then rankA + dimNulA = n (the dimension of ColA plus the dimension of NulA equals the total number of columns of A)
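A small numerical sanity check (numpy assumed; the matrix is a made-up example):

```python
import numpy as np

# made-up example: the second row is twice the first, so the rank is 1
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
n = A.shape[1]
rank = np.linalg.matrix_rank(A)
dim_nul = n - rank                  # rank-nullity: rankA + dimNulA = n
print(rank, dim_nul)                # 1 2
```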
35
Q

basis theorem

A

let H be a p-dimensional subspace of Rn. Any linearly independent set of exactly p vectors in H is a basis for H; also, any set of p vectors in H that spans H is a basis for H
36
Q

properties of determinants

A
  • row replace A to get B: det(B) = det(A)
  • interchange/swap two rows of A to get B: det(B) = -det(A)
  • scale a row of A by k to get B: det(B) = k·det(A)
  • if A is a triangular matrix, detA is the product of the main diagonal entries
37
Q

cramer's rule

A
  • find detA
  • replace each column of A in turn with the vector b and find each resulting determinant
  • each solution entry is that column's determinant divided by detA
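The procedure above can be sketched directly (numpy assumed; the 2x2 system is a made-up example):

```python
import numpy as np

def cramer(A, b):
    """Solve Ax = b by Cramer's rule; assumes det A != 0."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b                      # replace column i with b
        x[i] = np.linalg.det(Ai) / det_A  # x_i = det(Ai) / det(A)
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])    # made-up system: 2x+y=5, x+3y=10
b = np.array([5.0, 10.0])
print(cramer(A, b))                       # agrees with np.linalg.solve(A, b)
```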
38
Q

adjugate (adjA)

A

the adjugate of A is the nxn matrix used to find the inverse of A
  • take the cofactor (signed minor determinant) of each entry position
  • transpose the matrix of cofactors to get adjA
  • then A^-1 = (1/det(A)) · adjA
39
Q

to find area of a parallelogram

A

shift the parallelogram so one vertex sits at (0,0), then take the absolute value of the determinant of the two edge vectors coming out of the origin
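As a one-line check (numpy assumed; the edge vectors are made up):

```python
import numpy as np

# made-up edge vectors out of the origin; area should be |3*2 - 0*1| = 6
u = np.array([3.0, 0.0])
v = np.array([1.0, 2.0])
area = abs(np.linalg.det(np.column_stack([u, v])))
print(area)
```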
40
Q

eigenvector

A

an eigenvector of an nxn matrix A is a nonzero vector x such that Ax = λx for some scalar λ
41
Q

eigenvalue λ

A

a scalar λ is an eigenvalue of A if there is a nontrivial solution x of Ax = λx; such an x is called an eigenvector corresponding to λ
42
Q

eigenvalue things

A
  • the eigenvalue is the scalar: in Au = 4u, 4 is the eigenvalue for the eigenvector u
  • each eigenvector corresponds to exactly 1 eigenvalue
  • one eigenvalue can have multiple eigenvectors
  • you cannot do row operations to find eigenvalues (row operations change them)
  • if 0 is an eigenvalue, the matrix is not invertible
43
Q

eigenspace

A

the eigenspace of A corresponding to an eigenvalue λ is Nul(A-λIn) (think of it graphically)
44
Q

eigenvalues of a triangular matrix

A

the eigenvalues of a triangular matrix are the entries on its main diagonal
45
Q

characteristic equation (new eigenvalue definition)

A

det(A-λIn) = 0
46
Q

characteristic polynomial

A

det(A-λIn); it is used to find the eigenvalues
47
Q

algebraic multiplicity

A

the number of times λ appears as a root of the characteristic polynomial (how many times an eigenvalue appears, including repeats)
48
Q

similar matrices equation

A

let A and B be nxn matrices. we say A is similar to B (or A and B are similar) if there is an invertible matrix P such that B = P^-1AP
49
Q

What does it mean for matrices to be similar?

A

if nxn matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (including algebraic multiplicities)
50
Q

diagonalization (A = PDP^-1)

A

a square matrix A is diagonalizable if A is similar to a diagonal matrix
51
Q

diagonalization theorem

A

an nxn matrix A is diagonalizable if and only if A has n linearly independent eigenvectors
52
Q

steps to diagonalize if possible

A

1. find all eigenvalues of A using det(A-λIn) = 0
2. find the eigenvectors that form each eigenspace; they must be linearly independent
3. construct P from the basis eigenvectors; if the total number of vectors equals n, A is diagonalizable, and D holds the eigenvalues on its diagonal
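The steps above can be sketched with numpy (assumed available; the triangular matrix is a made-up example, so its eigenvalues are just the diagonal entries 4 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])               # triangular, so the eigenvalues are 4 and 2
eigenvalues, P = np.linalg.eig(A)        # columns of P are eigenvectors
D = np.diag(eigenvalues)

# diagonalizable iff the n eigenvectors are linearly independent
diagonalizable = np.linalg.matrix_rank(P) == A.shape[0]
print(diagonalizable)                            # True
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = PDP^-1
```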
53
Q

normalizing v

A

dividing vector v by its length to turn it into a unit vector
54
Q

distance

A

the distance between two vectors u and v is the length of u-v: dist(u,v) = ||u-v|| = √((u-v)·(u-v))
55
Q

orthogonal

A

vectors are orthogonal to each other if their dot product is 0. orthogonal means perpendicular
56
Q

orthogonal set

A

a set of vectors {u1, u2, ... up} in Rn is called an orthogonal set if each pair of distinct vectors from the set is orthogonal
  • an orthogonal set of nonzero vectors is automatically linearly independent
57
Q

orthogonal basis

A

an orthogonal basis for a vector space is a set of orthogonal vectors that spans the space
  • every vector in the space can be expressed as a linear combination of these mutually orthogonal basis vectors
58
Q

orthonormal set

A

an orthogonal set whose vectors have all been scaled to unit vectors
59
Q

orthonormal basis

A

a basis consisting of orthonormal vectors
60
Q

orthogonal matrix

A

a square (nxn) matrix U with orthonormal columns; it is invertible with U^-1 = U^T
61
Q

orthogonal projection (Proj(L)Y)

A

taking Y and projecting it onto L
62
Q

gram-schmidt process

A

converts a basis into an orthogonal basis by subtracting from each vector its projections onto the earlier vectors; the final answer should be a basis of these orthogonal vectors in their unit vector form
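A minimal sketch of the process (numpy assumed; the two starting vectors are made up):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for u in basis:
            w -= (w @ u) * u            # subtract the projection onto each earlier vector
        basis.append(w / np.linalg.norm(w))  # scale to a unit vector
    return basis

q1, q2 = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
print(np.isclose(q1 @ q2, 0.0))              # True: orthogonal
print(np.isclose(np.linalg.norm(q2), 1.0))   # True: unit length
```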
63
Q

QR factorization

A
  • to find Q: apply the gram-schmidt process to the columns of A (Q has orthonormal columns)
  • to find R: take the transpose of Q and multiply it by A (R = Q^T A, which is upper triangular)
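numpy's built-in factorization can verify the R = Q^T A relationship (numpy assumed; the matrix is a made-up example):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [1.0, 2.0]])      # made-up matrix with independent columns
Q, R = np.linalg.qr(A)          # Q: orthonormal columns, R: upper triangular
print(np.allclose(R, Q.T @ A))  # True: R = Q^T A
print(np.allclose(Q @ R, A))    # True: the factorization reproduces A
```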
64
Q

symmetric matrix

A

a symmetric matrix is an nxn matrix A such that A^T = A
65
Q

orthogonally diagonalizable

A

an nxn matrix A is said to be orthogonally diagonalizable if there exist an orthogonal matrix P and a diagonal matrix D such that A = PDP^-1 = PDP^T
66
Q

how to orthogonally diagonalize a matrix

A

same beginning steps as normal diagonalization, but when you find the eigenvectors, normalize them and check that they are orthogonal
  • an nxn matrix A is orthogonally diagonalizable if and only if A is a symmetric matrix
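A quick sketch (numpy assumed; the symmetric matrix is made up): `eigh`, numpy's routine for symmetric matrices, already returns orthonormal eigenvectors, so P comes out orthogonal.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                   # symmetric, so orthogonally diagonalizable
eigenvalues, P = np.linalg.eigh(A)           # eigh returns orthonormal eigenvectors
D = np.diag(eigenvalues)
print(np.allclose(P.T, np.linalg.inv(P)))    # True: P^-1 = P^T (P is orthogonal)
print(np.allclose(A, P @ D @ P.T))           # True: A = PDP^T
```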
67
Q

spectral decomposition

A

writes a symmetric matrix as a sum of rank-1 pieces built from its orthonormal eigenvectors, A = λ1u1u1^T + … + λnunun^T; it helps justify that a symmetric matrix is orthogonally diagonalizable
68
Q

quadratic form (Q(x) = x^TAx)

A

a quadratic form on Rn is a function Q (input of n entries, output a scalar) defined on Rn by Q(x) = x^TAx, where A is an nxn symmetric matrix
69
Q

change of variable in a quadratic form (x = Py)

A

substituting x = Py turns x^TAx into y^T(P^TAP)y; choosing P to diagonalize A gives y^TDy, a quadratic form with no cross-product terms (only values on the diagonal). P is an invertible matrix and y is the new variable vector
70
Q

classifying quadratic forms

A
  • positive definite: Q(x) > 0 for all x ≠ 0
  • negative definite: Q(x) < 0 for all x ≠ 0
  • indefinite: Q(x) takes both positive and negative values
  • positive semidefinite: Q(x) ≥ 0 for all x
  • negative semidefinite: Q(x) ≤ 0 for all x
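Since Q(x) = y^TDy after diagonalizing, the classification comes down to the signs of the eigenvalues of A. A sketch (numpy assumed; the example matrices are made up):

```python
import numpy as np

def classify(A):
    """Classify Q(x) = x^T A x by the eigenvalue signs of the symmetric matrix A."""
    eig = np.linalg.eigvalsh(A)          # eigenvalues of a symmetric matrix
    if np.all(eig > 0):
        return "positive definite"
    if np.all(eig < 0):
        return "negative definite"
    if np.all(eig >= 0):
        return "positive semidefinite"
    if np.all(eig <= 0):
        return "negative semidefinite"
    return "indefinite"                  # mixed positive and negative eigenvalues

print(classify(np.array([[2.0, 0.0], [0.0, 3.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```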