Linear Algebra Flashcards

(59 cards)

1
Q

5.1: Distance Between Two Vectors

A
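For reference, the standard formula from this section, assuming **u**, **v** are vectors in Rn with the Euclidean norm:

$$ d(\mathbf{u}, \mathbf{v}) = \lVert \mathbf{u} - \mathbf{v} \rVert = \sqrt{(u_1 - v_1)^2 + \cdots + (u_n - v_n)^2} $$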
2
Q
A
3
Q
A
4
Q

5.1: The Cauchy-Schwarz Inequality

A
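For reference, the standard statement for vectors **u**, **v** in Rn (dot product and Euclidean norm assumed):

$$ \lvert \mathbf{u} \cdot \mathbf{v} \rvert \le \lVert \mathbf{u} \rVert \, \lVert \mathbf{v} \rVert $$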
5
Q
A
6
Q

5.1: The Triangle Inequality

A
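For reference, the standard statement for vectors **u**, **v** in Rn:

$$ \lVert \mathbf{u} + \mathbf{v} \rVert \le \lVert \mathbf{u} \rVert + \lVert \mathbf{v} \rVert $$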
7
Q

5.1: The Pythagorean Theorem

A
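For reference, the standard statement: if **u** and **v** are orthogonal (so **u** · **v** = 0), then

$$ \lVert \mathbf{u} + \mathbf{v} \rVert^2 = \lVert \mathbf{u} \rVert^2 + \lVert \mathbf{v} \rVert^2 $$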
8
Q

5.2: Definition of Inner Product

A
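For reference, the standard axioms for a real inner product ⟨u, v⟩ on a vector space V: for all **u**, **v**, **w** in V and every scalar c,

$$ \langle \mathbf{u}, \mathbf{v} \rangle = \langle \mathbf{v}, \mathbf{u} \rangle, \qquad \langle \mathbf{u}, \mathbf{v} + \mathbf{w} \rangle = \langle \mathbf{u}, \mathbf{v} \rangle + \langle \mathbf{u}, \mathbf{w} \rangle, \qquad c \langle \mathbf{u}, \mathbf{v} \rangle = \langle c\mathbf{u}, \mathbf{v} \rangle, \qquad \langle \mathbf{v}, \mathbf{v} \rangle \ge 0 \text{ with } \langle \mathbf{v}, \mathbf{v} \rangle = 0 \text{ only for } \mathbf{v} = \mathbf{0} $$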
9
Q

5.2 Orthogonal Projection and Distance

A
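For reference, the standard projection formula in an inner product space (with **v** ≠ **0**); among all scalar multiples of **v**, this projection is the one closest to **u**:

$$ \operatorname{proj}_{\mathbf{v}} \mathbf{u} = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{v}, \mathbf{v} \rangle} \mathbf{v}, \qquad d(\mathbf{u}, \operatorname{proj}_{\mathbf{v}} \mathbf{u}) \le d(\mathbf{u}, c\mathbf{v}) \text{ for every scalar } c $$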
10
Q

5.3 Orthonormal Basis

A

A set of vectors that are both mutually orthogonal and unit vectors.
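In symbols, assuming the set is S = {v₁, …, vₙ} in an inner product space:

$$ \langle \mathbf{v}_i, \mathbf{v}_j \rangle = 0 \text{ for } i \ne j, \qquad \lVert \mathbf{v}_i \rVert = 1 \text{ for every } i $$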

11
Q

5.3 Gram-Schmidt Orthonormalization Process

A
  1. B={v₁, v₂, …, vₙ}, a set of vectors that form a basis for an inner product space V.
  2. B’={w₁, w₂, …, wₙ}, where w₁=v₁, w₂=v₂−proj_w₁v₂, w₃=v₃−proj_w₁v₃−proj_w₂v₃, and so on (orthogonalization).
  3. Normalize: replace each wᵢ with the unit vector uᵢ=wᵢ/‖wᵢ‖ (see the formulas below).
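The same process in standard notation (projections taken with respect to the inner product of V):

$$ \mathbf{w}_1 = \mathbf{v}_1, \qquad \mathbf{w}_k = \mathbf{v}_k - \sum_{i=1}^{k-1} \frac{\langle \mathbf{v}_k, \mathbf{w}_i \rangle}{\langle \mathbf{w}_i, \mathbf{w}_i \rangle} \mathbf{w}_i, \qquad \mathbf{u}_k = \frac{\mathbf{w}_k}{\lVert \mathbf{w}_k \rVert} $$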
12
Q
A
13
Q

3.1 Minor

A
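For reference, the standard definition for an nxn matrix A = [aij]:

$$ M_{ij} = \text{the determinant of the } (n-1)\times(n-1) \text{ matrix obtained by deleting row } i \text{ and column } j \text{ of } A $$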
14
Q

3.1 Cofactor

A
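For reference, the standard definition in terms of the minor Mij:

$$ C_{ij} = (-1)^{i+j} M_{ij} $$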
15
Q

3.1 Determinant of a Triangular Matrix

A

The product of all the entries on the principal diagonal.
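In symbols, for an nxn triangular matrix A = [aij]:

$$ \det(A) = a_{11} a_{22} \cdots a_{nn} $$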

16
Q

3.2 Elementary Row Operations and Determinants

A
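For reference, the standard effects, assuming B is obtained from the square matrix A by the stated row operation:

$$ \det(B) = -\det(A) \ (\text{two rows interchanged}), \qquad \det(B) = c \det(A) \ (\text{a row multiplied by } c \ne 0), \qquad \det(B) = \det(A) \ (\text{a multiple of one row added to another}) $$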
17
Q

3.3 Determinant of a Matrix Product
Determinant of a Scalar Multiple of a Matrix
Determinant of an Inverse Matrix
Determinant of a Transpose of a Matrix

A
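For reference, the standard identities for nxn matrices A and B (the inverse formula assumes det(A) ≠ 0):

$$ \det(AB) = \det(A)\det(B), \qquad \det(cA) = c^{n} \det(A), \qquad \det(A^{-1}) = \frac{1}{\det(A)}, \qquad \det(A^{T}) = \det(A) $$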
18
Q

3.4 Adjoint of a Matrix

A

Adj(A) = the transpose of the matrix of cofactors of A.
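In symbols, with Cij the cofactors of A:

$$ \operatorname{adj}(A) = [C_{ij}]^{T}, \qquad A \, \operatorname{adj}(A) = \det(A) \, I $$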

19
Q

3.4 Inverse of an nxn Matrix Using its Adjoint

A
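For reference, the standard formula, valid when det(A) ≠ 0:

$$ A^{-1} = \frac{1}{\det(A)} \operatorname{adj}(A) $$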
20
Q

3.4 Cramer’s Rule

A
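For reference, the standard statement for a system A**x** = **b** with det(A) ≠ 0, where Ai denotes A with its i-th column replaced by **b**:

$$ x_i = \frac{\det(A_i)}{\det(A)}, \qquad i = 1, \ldots, n $$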
21
Q

3.4 Area of a Triangle with vertices

(x₁, y₁), (x₂, y₂), and (x₃, y₃)

A
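For reference, the standard determinant formula (the sign is chosen so that the area is positive):

$$ \text{Area} = \pm \frac{1}{2} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} $$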
22
Q

3.4 Two-Point Form of the Equation of a Line (x₁, y₁), (x₂, y₂)

A
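For reference, the standard determinant equation of the line through the two points:

$$ \begin{vmatrix} x & y & 1 \\ x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \end{vmatrix} = 0 $$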
23
Q

3.4 Volume of a Tetrahedron with vertices

(x₁, y₁,z₁), (x₂, y₂, z₂), (x₃, y₃, z₃), and (x₄, y₄, z₄)

A
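For reference, the standard determinant formula (the sign is chosen so that the volume is positive):

$$ V = \pm \frac{1}{6} \begin{vmatrix} x_1 & y_1 & z_1 & 1 \\ x_2 & y_2 & z_2 & 1 \\ x_3 & y_3 & z_3 & 1 \\ x_4 & y_4 & z_4 & 1 \end{vmatrix} $$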
24
Q

3.4 Three-Point Form of the Equation of a Plane

(x₁, y₁,z₁), (x₂, y₂, z₂), and (x₃, y₃, z₃)
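For reference, the standard determinant equation of the plane through the three points:

$$ \begin{vmatrix} x & y & z & 1 \\ x_1 & y_1 & z_1 & 1 \\ x_2 & y_2 & z_2 & 1 \\ x_3 & y_3 & z_3 & 1 \end{vmatrix} = 0 $$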

25
2.3 Inverse of a 2x2 matrix
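For reference, the standard formula:

$$ A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \qquad A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \quad (ad - bc \ne 0) $$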
26
2.3 Inverse of an nxn matrix
27
2.3: Solve a system of equations
28
2.4 Definition of an Elementary Matrix
An nxn matrix that can be obtained from the identity matrix I by a single elementary row operation.
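A minimal example (one of many possibilities, not from the card): interchanging the first two rows of the 3x3 identity matrix gives the elementary matrix

% example produced by a single row interchange on I
$$ E = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} $$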
29
2.5 Stochastic Matrices
P = probability (stochastic) matrix: each entry is between 0 and 1 and the entries in each column add up to 1. If X is the current state matrix, the next state matrix is PX (see the sketch below).
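A sketch of the intended use, assuming X₀ is the initial state matrix and Xk the state after k steps (the subscripted names are notation chosen here, not from the card):

$$ X_{k+1} = P X_k, \qquad \text{so } X_k = P^{k} X_0 $$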
30
2.5 Leontief Input-Output Models
31
2.5 Matrix Form for Linear Regression
32
2.5 Encryption
33
1.1: Row Echelon Form
Any all-zero rows are at the bottom, each nonzero row has a leading coefficient of 1, and each leading 1 lies to the right of the leading 1 in the row above, producing a stair-step pattern.
34
1.2: Reduced Row Echelon Form
A matrix in row-echelon form in which every column that has a leading 1 has zeros in every position above and below its leading 1.
35
1.2: Gaussian Elimination with Back Substitution
1. Write the augmented matrix of the system of linear equations. 2. Use elementary row operations to rewrite the matrix in row-echelon form. 3. Use back-substitution to find the solution.
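A minimal worked example on an assumed 2x2 system (x + y = 3, 2x + y = 4, not from the card):

% assumed example system, not from the original card
$$ \left[\begin{array}{cc|c} 1 & 1 & 3 \\ 2 & 1 & 4 \end{array}\right] \xrightarrow{R_2 - 2R_1} \left[\begin{array}{cc|c} 1 & 1 & 3 \\ 0 & -1 & -2 \end{array}\right] \;\Rightarrow\; -y = -2, \quad y = 2, \quad x = 3 - y = 1 $$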
36
1.2: Gauss-Jordan Elimination
Same as Gaussian Elimination but instead of row-echelon form you rewrite the matrix in reduced row-echelon form.
37
1.2: Homogeneous System of Linear Equations
* Systems of linear equations in which each of the constant terms is zero.
* The trivial solution is the solution in which every variable equals 0.
* Every homogeneous system has at least one solution: the trivial solution.
* If the system has fewer equations than variables, then it has infinitely many solutions (see the example below).
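A minimal example of the last property (an assumed system, not from the card): the single equation x + y + z = 0 has fewer equations (one) than variables (three), so it has infinitely many solutions,

$$ x = -s - t, \qquad y = s, \qquad z = t \qquad (s, t \text{ any scalars}) $$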
38
1.3: Polynomial Curve Fitting
Substitute each of the given points into the polynomial function, then solve the resulting system of linear equations for the coefficients (see the example below).
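A minimal worked example with assumed points (not from the card): fitting p(x) = a₀ + a₁x + a₂x² through (1, 4), (2, 9), and (3, 16) gives the system

% assumed example points, not from the original card
$$ a_0 + a_1 + a_2 = 4, \qquad a_0 + 2a_1 + 4a_2 = 9, \qquad a_0 + 3a_1 + 9a_2 = 16 $$

whose solution is a₀ = 1, a₁ = 2, a₂ = 1, so p(x) = 1 + 2x + x² = (x + 1)².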
39
4.1: Properties of Vector Addition and Scalar Multiplication in Rn
40
4.1: Properties of Additive Identity and Additive Inverse
41
4.2: Properties of Scalar Multiplication
42
4.3: Test for a Subspace
A nonempty subset W of a vector space V is a subspace if and only if it is closed under addition and closed under scalar multiplication. In particular, every subspace must contain the zero vector.
43
4.4: Finding a Linear Combination
44
4.5: Definition of Basis
A set of vectors S in a vector space V that span V and are linearly independent.
45
4.5: Characteristics of Bases 1. Uniqueness of Basis Representation 2. Bases and Linear Dependence 3. Number of Vectors in a Basis
1. If S is a basis for a vector space V, then every vector in V can be written in one and only one way as a linear combination of vectors in S. 2. If S is a basis for a vector space V with n vectors, then every set containing more than n vectors in V is linearly dependent. 3. If a vector space V has one basis with n vectors, then every basis for V has n vectors.
46
4.5: Number of Dimensions in a Vector Space Rn, Pn, Mm,n
Rn: n; Pn: n + 1; Mm,n: mn
47
4.6 Basis for the Row Space of a Matrix
If a matrix *A* is row-equivalent to a matrix *B* in row-echelon form, then the nonzero row vectors of *B* form a basis for the row space of *A*.
48
4.6: Basis for a Column Space of a Matrix
The column space of *A* equals the row space of *AT*. So if *AT* is row-equivalent to a matrix *B* in row-echelon form, then the nonzero row vectors of *B* form a basis for the column space of *A*.
49
4.6: Rank of a Matrix
The dimension of the row (or column) space of a matrix *A* is called the rank of *A* and is denoted by rank(*A*).
50
4.6: Nullspace
If *A* is an *m* x *n* matrix, then the set of all solutions of the homogeneous system of linear equations *A***x** = **0** is a subspace of *Rn* called the nullspace of *A* and is denoted by *N*(*A*).
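In symbols:

$$ N(A) = \{ \mathbf{x} \in R^{n} : A\mathbf{x} = \mathbf{0} \} $$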
51
4.6: Finding the Nullspace
1. Write the coefficient matrix in row-echelon form. 2. Solve for the variables, making use of parametric form. 3. Write the solutions as linear combinations of vectors, one for each parameter; these vectors span the nullspace.
52
4.6: Dimension of the Solution Space
If *A* is an *m* x *n* matrix of rank *r*, then the dimension of the solution space of *A***x** = **0** is *n* − *r*, the nullity of *A*.
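Equivalently, this is the rank-nullity relationship for an m x n matrix:

$$ \operatorname{rank}(A) + \operatorname{nullity}(A) = n $$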
53
4.6: Finding the Solution Set of a Nonhomogeneous System of Linear Equations *A***x**=**b**
1. Write the augmented matrix in row-echelon form. 2. Solve for the variables, making use of parametric form. 3. Write the solutions as linear combinations of vectors and identify the particular solution **x**p (the constant part, obtained by setting every parameter to 0); see the form below.
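For reference, the form of the solution set (assuming the system is consistent), where **x**h runs over all solutions of the corresponding homogeneous system *A***x** = **0**:

$$ \mathbf{x} = \mathbf{x}_p + \mathbf{x}_h $$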
54
4.6: Summary of Equivalent Conditions for Square Matrices
55
4.7: Finding a Coordinate Matrix Relative to a Standard Basis
56
4.7: Finding a Coordinate Matrix Relative to a Nonstandard Basis.
1. Write **x** as a linear combination of the nonstandard basis vectors, e.g. **x**=*c*1**u**1+*c*2**u**2+*c*3**u**3. 2. Rewrite this as a system of linear equations or as a matrix equation (see below). 3. Solve for the coefficients *c*1, *c*2, *c*3; they are the entries of the coordinate matrix of **x** relative to the basis.
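In matrix form (for the assumed three-vector basis written above), step 2 is the equation

$$ \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \mathbf{x}, \qquad [\mathbf{x}]_{B} = \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} $$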
57
4.7: Change of Basis of *Rn*
*P* [**x**]B' = [**x**]B and *P*⁻¹ [**x**]B = [**x**]B', where *P* is the transition matrix from *B'* to *B*, *P*⁻¹ is its inverse, [**x**]B is the coordinate matrix of **x** relative to *B*, and [**x**]B' is the coordinate matrix of **x** relative to *B'*.
58
4.8: The Wronskian
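For reference, the standard definition for functions f₁, …, fₙ having n − 1 derivatives on an interval; if W is not identically zero there, the set of functions is linearly independent:

$$ W(f_1, \ldots, f_n) = \begin{vmatrix} f_1 & f_2 & \cdots & f_n \\ f_1' & f_2' & \cdots & f_n' \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)} & f_2^{(n-1)} & \cdots & f_n^{(n-1)} \end{vmatrix} $$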
59