test 1 vocab Flashcards

(89 cards)

1
Q

equivalent

A

when two systems possess the same solution set

2
Q

unique solution

A

there is one and only one set of values for the xi’s that satisfies all equations simultaneously

3
Q

no solution

A

there is no set of values for the xi’s that satisfies all equations simultaneously - the solution set is empty

4
Q

infinitely many solutions

A

there are infinitely many different sets of values for the xi’s that satisfy all equations simultaneously. It is not difficult to prove that if a system has more than one solution, then it has infinitely many solutions.

5
Q

elementary operations

A
1.) interchange the ith and jth equations
2.) replace the ith equation by a nonzero multiple of itself
3.) replace the jth equation by a combination of itself plus a multiple of the ith equation
6
Q

square system

A

n equations and n unknowns

7
Q

Gaussian elimination

A

1.) eliminate all terms below the first pivot
2.) select a new pivot
3.) eliminate all terms below the second pivot
and continue
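
A minimal NumPy sketch of these elimination steps, assuming every pivot encountered is nonzero; the function name forward_eliminate and its interface are illustrative, not from the source:

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce [A|b] to an upper-triangular system by eliminating below each pivot."""
    A, b = A.astype(float).copy(), b.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 1):                 # current pivot position (k, k)
        if A[k, k] == 0.0:
            raise ValueError("zero pivot; a row interchange would be needed")
        for i in range(k + 1, n):          # eliminate every term below the pivot
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    return A, b
```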

8
Q

triangularized

A

all pivots are 1

9
Q

back substitution

A

the last equation is solved for the value of the last unknown, which is then substituted back into the penultimate equation, and so on
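
A short sketch of back substitution for an upper-triangular system Ux = c, assuming nonzero diagonal entries; the name back_substitute is illustrative. Pairing it with a forward-elimination step solves a square nonsingular system.

```python
import numpy as np

def back_substitute(U, c):
    """Solve Ux = c by starting with the last equation and working upward."""
    n = U.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):         # last unknown first
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x
```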

10
Q

scalar

A

a real or complex number

11
Q

row

A

horizontal line

12
Q

column

A

vertical line

13
Q

submatrix

A

of A is an array obtained by deleting any combination of rows and columns from A

14
Q

shape or size

A

m (rows) x n (columns)

15
Q

Gauss-Jordan Method

A
1.) at each step, the pivot element is forced to be 1
2.) all terms above and below the pivot are eliminated
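
A minimal sketch of the two rules above, assuming each pivot encountered is nonzero; the function name and interface are illustrative:

```python
import numpy as np

def gauss_jordan(A, b):
    """Reduce [A|b] so every pivot is 1 and all entries above and below it are 0."""
    M = np.hstack([A.astype(float), b.astype(float).reshape(-1, 1)])
    for k in range(A.shape[0]):
        M[k] /= M[k, k]                    # rule 1: force the pivot to be 1
        for i in range(M.shape[0]):
            if i != k:                     # rule 2: eliminate above and below
                M[i] -= M[i, k] * M[k]
    return M[:, :-1], M[:, -1]             # the last column now holds the solution
```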

16
Q

tridiagonal

A

the nonzero elements occur only on the subdiagonal, main diagonal, and superdiagonal

17
Q

partial pivoting

A

at each step, search the positions on and below the pivot position for the coefficient of maximum magnitude; if necessary, interchange rows to bring that largest number into the pivot position
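
A sketch of one partial-pivoting step on an augmented matrix M at pivot position (k, k); the name partial_pivot is illustrative:

```python
import numpy as np

def partial_pivot(M, k):
    """Swap rows so the entry of maximum magnitude on or below (k, k) is the pivot."""
    p = k + np.argmax(np.abs(M[k:, k]))    # search on and below the pivot position
    if p != k:
        M[[k, p]] = M[[p, k]]              # interchange rows k and p
    return M
```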

18
Q

row scaling

A

multiplying selected rows by nonzero multipliers

19
Q

column scaling

A

multiplying selected columns by nonzero multipliers

- alters the exact solution

20
Q

complete pivoting

A

search the pivot position and every position below or to the right of it for the entry of maximum magnitude; if necessary, perform row and column interchanges to bring the largest number into the pivot position

21
Q

ill conditioned

A

small perturbations in the system can produce relatively large changes in the exact solution

22
Q

well conditioned

A

if not ill conditioned

23
Q

rectangular

A

if m and n are not the same

24
Q

main diagonal

A

where the pivot positions are located, the diagonal line from the upper left hand to the lower right hand corner

25
row echelon form
1.) if Ei* consists entirely of zeros, then all rows below Ei* are also entirely zero, i.e. all zero rows are at the bottom; 2.) if the first nonzero entry in Ei* lies in the jth position, then all entries below the ith position in columns E*1, E*2, ..., E*j are zero
26
rank
for A reduced to an echelon form E: the number of pivots = the number of nonzero rows in E = the number of basic columns in A
27
basic columns
those columns in A that contain the pivotal positions
28
reduced row echelon form(EA)
1.) E is in row echelon form 2.) the first nonzero entry in each row (each pivot) is 1 3.) all entries above each pivot are 0
29
consistent
a system of m linear equations in n unknowns that possesses at least one solution - rank[A|b] = rank(A) - b is a nonbasic column in [A|b], i.e. b is a combination of the basic columns in A
30
inconsistent
a system of m linear equations in n unknowns that has no solution; it is revealed when elimination produces a row whose coefficient entries are all zero but whose right-hand side entry is nonzero
31
homogeneous system
the right-hand side consists entirely of 0's - consistency is never an issue because the zero solution is always a solution
32
nonhomogeneous system
there is at least one nonzero number on the right hand side
33
trivial solution
the solution consisting of all zeros
34
basic variables
when there are more unknowns than equations, we pick "basic" unknowns and solve for these in terms of the other unknowns - there are r basic variables
35
free variables
variables whose values must remain arbitrary or free - there are n-r free variables
36
general solution
use Gaussian elimination to reduce to row echelon form; identify the basic and free variables; apply back substitution and solve for the basic variables in terms of the free variables: x = p (a particular solution) + xf1 h1 + xf2 h2 + ...
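A NumPy illustration of the resulting form x = p + xf1 h1 + xf2 h2 + ...; it uses a least-squares solve for a particular solution and the SVD for the h's rather than the hand reduction described above, and the specific matrix is made up for the example:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                 # rank 1, so n - r = 2 free variables
b = np.array([6., 12.])                      # consistent right-hand side

p, *_ = np.linalg.lstsq(A, b, rcond=None)    # one particular solution
_, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))                   # numerical rank
H = Vt[r:].T                                 # columns h1, ..., h_{n-r} span N(A)

x = p + H @ np.array([1.5, -2.0])            # arbitrary free-variable values
print(np.allclose(A @ x, b))                 # True: x still satisfies Ax = b
```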
37
equal matrices
when A and B are the same size and corresponding entries are equal
38
column vector
an array consisting of a single column
39
row vector
an array consisting of a single row
40
addition of matrices
if A and B are mxn, the sum is the mxn matrix A+B obtained by adding corresponding entries
41
additive inverse (-A)
the matrix obtained by negating each of the entries
42
transpose (A^T)
of an mxn matrix A, the nxm matrix A^T obtained by interchanging rows and columns: [A^T]ij = aji. properties: (A+B)^T = A^T + B^T, (sA)^T = s A^T
43
conjugate matrix (A^-)
the matrix A^- obtained by conjugating each entry of A: [A^-]ij = a^-ij
44
conjugate transpose (A^-T)(A*)
the matrix with [A*]ij = a^-ji (conjugate the entries and transpose). properties: (A+B)* = A* + B*, (sA)* = s^- A*
45
diagonal matrix
a square matrix whose nonzero entries occur only on the main diagonal
46
symmetric matrix
A =A^T, when aij=aji
47
skew-symmetric matrix
A=-A^T, when aij= -aji
48
hermitian matrix
A=A*, when aij=a^-ji. the complex analog of symmetry
49
skew-hermitian matrix
A=-A*, when aij= -a^-ji. the complex analog of skew symmetry
50
linear function
1.) f(x+y) = f(x) + f(y) 2.) f(sx) = s f(x)
51
conformable
in AB when A has exactly as many columns as B has rows
52
matrix product
for conformable matrices Amxp = [aij] and Bpxn = [bij], AB is the mxn matrix whose (i,j) entry is the inner product of the ith row of A with the jth column of B - matrix multiplication is NOT commutative
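A small NumPy check of the entry-wise definition; the matrices are arbitrary examples:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])                 # 2x3
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])                     # 3x2, so A and B are conformable

# the (i, j) entry of AB is the inner product of row i of A with column j of B
AB = np.array([[A[i, :] @ B[:, j] for j in range(B.shape[1])]
               for i in range(A.shape[0])])
print(np.allclose(AB, A @ B))                # True
print((A @ B).shape, (B @ A).shape)          # (2, 2) vs (3, 3): AB and BA differ
```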
53
cancellation law
sB = sY with s not equal to 0 implies B = Y
54
linear system
Ax=b
55
distributive and associative laws
for conformable matrices: A(B+C) = AB + AC, (D+E)F = DF + EF, A(BC) = (AB)C
56
identity matrix (I)
nxn matrix with 1's on the main diagonal and 0's everywhere else; AI*j = A*j, so AI = A and IA = A
57
reverse order law for transposition
for conformable A and B: (AB)^T = B^T A^T and (AB)* = B* A*
58
Trace
for a square matrix, the sum of its main-diagonal entries; trace(AB) = trace(BA)
59
block matrix multiplication
A and B are partitioned into submatrices, referred to as blocks; if the pairs (Aik, Bkj) are conformable, then A and B are said to be conformably partitioned
60
reducible systems
block-triangular systems
61
inverse of A
given square A and B with AB = I and BA = I, B is the inverse of A, written B = A^-1
62
nonsingular matrix
an invertible square matrix
63
singular matrix
a square matrix with no inverse
64
matrix equations
if A is nonsingular, then (Anxn)(Xnxp) = Bnxp has a unique solution for X, namely X = A^-1 B; for a system of n linear equations in n unknowns, (Anxn)(xnx1) = bnx1 has the unique solution x = A^-1 b
65
existence of an inverse
for Anxn the following are equivalent: - A^-1 exists (A is nonsingular) - rank(A) = n - A can be reduced to I by Gauss-Jordan - Ax = 0 implies x = 0
66
computing the inverse
[A|I] --Gauss-Jordan--> [I|A^-1]
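A sketch of the same computation in NumPy, augmenting A with I and applying the Gauss-Jordan reduction; nonzero pivots are assumed rather than handled by interchanges, and the names and example matrix are illustrative:

```python
import numpy as np

def invert_via_gauss_jordan(A):
    """Reduce [A | I] to [I | A^-1] and return the right half."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for k in range(n):
        M[k] /= M[k, k]                      # make the pivot 1
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]       # clear the rest of the column
    return M[:, n:]

A = np.array([[4., 7.], [2., 6.]])
print(np.allclose(invert_via_gauss_jordan(A) @ A, np.eye(2)))   # True
```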
67
properties of matrix inversion
for nonsingular A and B: (A^-1)^-1 = A; AB is nonsingular; (AB)^-1 = B^-1 A^-1; (A^-1)^T = (A^T)^-1; (A^-1)* = (A*)^-1
68
Sherman-Morrison Formula
if Anxn is nonsingular and c and d are nx1 columns such that 1 + d^T A^-1 c is not 0, then A + cd^T is nonsingular and (A+cd^T)^-1 = A^-1 - (A^-1 c d^T A^-1)/(1 + d^T A^-1 c)
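A quick numerical check of the formula against a directly computed inverse; the data are random, the shift by 4I keeps A comfortably nonsingular, and the nonzero-denominator condition is assumed to hold for this data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)
c = rng.standard_normal((4, 1))
d = rng.standard_normal((4, 1))

Ainv = np.linalg.inv(A)
denom = 1.0 + (d.T @ Ainv @ c).item()                    # must be nonzero
update = Ainv - (Ainv @ c @ d.T @ Ainv) / denom          # Sherman-Morrison
print(np.allclose(update, np.linalg.inv(A + c @ d.T)))   # True
```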
69
sherman morrison woodbury formula
if C and D are nxk such that (I + D^T A^-1 C)^-1 exists, then (A + CD^T)^-1 = A^-1 - A^-1 C (I + D^T A^-1 C)^-1 D^T A^-1
70
Neumann Series
if lim n-->infinity A^n = 0, then I - A is nonsingular and (I-A)^-1 = I + A + A^2 + ... (the sum of A^k over k >= 0); provides an approximation of (I-A)^-1 when A has entries of small magnitude
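A small sketch of the partial sums I + A + A^2 + ... converging to (I - A)^-1 for a matrix with entries of small magnitude; the matrix is made up for the example:

```python
import numpy as np

A = np.array([[0.10, 0.05],
              [0.02, 0.08]])                 # small entries, so A^n -> 0

approx = np.eye(2)
term = np.eye(2)
for _ in range(25):                          # partial sum I + A + ... + A^25
    term = term @ A
    approx += term

print(np.allclose(approx, np.linalg.inv(np.eye(2) - A)))   # True
```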
71
ill conditioned
if a small relative change in A can cause a large relative change in A^-1.
72
condition number
how the degree of ill conditioning is gauged: k = ||A|| ||A^-1||, where ||*|| is a matrix norm
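An illustration of k = ||A|| ||A^-1|| using the 2-norm; np.linalg.cond computes the same quantity directly, and the nearly singular matrix is just an example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])                # nearly singular, hence ill conditioned

kappa = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
print(kappa)                                 # roughly 4e4: perturbations are amplified
print(np.isclose(kappa, np.linalg.cond(A, 2)))   # True
```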
73
sensitivity
of the solution of Ax=b to perturbations(or errors) in A is measured by the extent to which A is an ill conditioned matrix.
74
elementary matrices
matrices of the form I - uv^T, where u and v are nx1 columns such that v^T u is not 1; they are nonsingular and (I - uv^T)^-1 = I - (uv^T)/(v^T u - 1)
75
type 1
interchanging rows
76
type 2
multiplying rows (columns) by a nonzero scalar
77
type 3
adding a multiple of row (column) i to row (column) j
78
products of elementary matrices
A is a nonsingular matrix if and only if A is the product of elementary matrices of type 1,2, and 3
79
equivalent matrices(~)
when B can be derived from A by a combination of elementary row and column operations A~B
80
row equivalent (~row)
when B can be obtained from A by performing a sequence of elementary row operations only; written A ~row B
81
column equivalent (~col)
when B can be obtained from A by performing a sequence of elementary column operations only; written A ~col B
82
transitive
A~B and B~C implies A~C
83
Rank normal form (Nr)
A ~ Nr = [Ir 0; 0 0], the block matrix with the r x r identity in the upper-left block and zeros elsewhere
84
LU factorization of A
A = LU, the decomposition of A into the product of a lower triangular matrix L and an upper triangular matrix U - the matrices L and U are called the LU factors of A
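A minimal sketch of computing the LU factors by recording the Gaussian-elimination multipliers; there is no pivoting, the pivots that arise are assumed nonzero, and the example matrix is arbitrary:

```python
import numpy as np

def lu_factor(A):
    """Return L (unit lower triangular) and U (upper triangular) with A = LU."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # multiplier used to eliminate U[i, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_factor(A)
print(np.allclose(L @ U, A))                 # True
```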
85
elementary lower triangular matrix
Tk = I-cke^Tk, where ck is a column with zeros in the first k positions
86
leading principal submatrices
the submatrices taken from the upper left-hand corner
87
positive definite
A symmetric matrix A possessing an LU factorization in which each pivot is positive
88
band matrix
a matrix in which aij = 0 whenever |i-j| > w for some positive integer w, so its nonzero entries lie in a band about the main diagonal
89
bandwidth
the integer w such that aij = 0 whenever |i-j| > w