True or False Flashcards

(113 cards)

1
A vector is any element of a vector space.
TRUE
2
A vector space must contain at least two vectors.
FALSE
3
If u is a vector and k is a scalar such that ku = 0, then it must be true that k = 0.
FALSE
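The counterexample behind this FALSE is u = 0: then ku = 0 for every scalar k. A minimal NumPy sketch (the value k = 5 is an arbitrary illustration):

```python
import numpy as np

# ku = 0 does not force k = 0: take u to be the zero vector.
k = 5.0                 # any nonzero scalar works
u = np.zeros(3)         # the zero vector of R^3
print(np.allclose(k * u, np.zeros(3)))  # True, yet k != 0
```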
4
The set of positive real numbers is a vector space if vector addition and scalar multiplication are the usual operations of addition and multiplication of real numbers.
FALSE
5
In every vector space the vectors (−1)u and −u are the same.
TRUE
6
In the vector space 𝐹(−∞, ∞) any function whose graph passes through the origin is a zero vector.
FALSE
7
Every subspace of a vector space is itself a vector space.
TRUE
8
Every vector space is a subspace of itself.
TRUE
9
Every subset of a vector space 𝑉 that contains the zero vector in 𝑉 is a subspace of 𝑉.
FALSE
10
The kernel of a matrix transformation 𝑇𝐴 : 𝑅n → 𝑅m is a subspace of 𝑅m.
FALSE
11
The solution set of a consistent linear system 𝐴x = b of m equations in n unknowns is a subspace of 𝑅n.
FALSE
12
The intersection of any two subspaces of a vector space 𝑉 is a subspace of 𝑉.
TRUE
13
The union of any two subspaces of a vector space 𝑉 is a subspace of 𝑉.
FALSE
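The standard counterexample for this FALSE, sketched in NumPy: in 𝑅2 the two coordinate axes are subspaces, but their union is not closed under addition.

```python
import numpy as np

# The x-axis and y-axis are subspaces of R^2, but their union
# fails closure under addition.
u = np.array([1.0, 0.0])    # on the x-axis
v = np.array([0.0, 1.0])    # on the y-axis
s = u + v                   # (1, 1)
in_union = (s[1] == 0) or (s[0] == 0)
print(in_union)             # False: u + v lies on neither axis
```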
14
The set of upper triangular n × n matrices is a subspace of the vector space of all n × n matrices.
TRUE
15
An expression of the form k1v1 + k2v2 + ⋯ + krvr is called a linear combination.
TRUE
16
The span of a single vector in 𝑅2 is a line.
FALSE
17
The span of two vectors in 𝑅3 is a plane.
FALSE
18
The span of a nonempty set 𝑆 of vectors in 𝑉 is the smallest subspace of 𝑉 that contains 𝑆.
TRUE
19
The span of any finite set of vectors in a vector space is closed under addition and scalar multiplication.
TRUE
20
Two subsets of a vector space 𝑉 that span the same subspace of 𝑉 must be equal.
FALSE
21
The polynomials x − 1, (x − 1)², and (x − 1)³ span 𝑃3.
FALSE
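One way to see this FALSE: write each polynomial as a coefficient vector (constant, x, x², x³) in 𝑅4 and check the rank. Since dim(𝑃3) = 4, three vectors cannot span it. A NumPy sketch:

```python
import numpy as np

# Coefficient vectors (constant, x, x^2, x^3 order) of
# x - 1, (x - 1)^2, and (x - 1)^3.
M = np.array([
    [-1.0,  1.0,  0.0, 0.0],   # x - 1
    [ 1.0, -2.0,  1.0, 0.0],   # x^2 - 2x + 1
    [-1.0,  3.0, -3.0, 1.0],   # x^3 - 3x^2 + 3x - 1
])
# Three vectors span at most a 3-dimensional subspace of the
# 4-dimensional space P3; here they are independent, so exactly 3.
print(np.linalg.matrix_rank(M))  # 3
```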
22
A set containing a single vector is linearly independent.
FALSE
23
No linearly independent set contains the zero vector.
TRUE
24
Every linearly dependent set contains the zero vector.
FALSE
25
If the set of vectors {v1, v2, v3} is linearly independent, then {kv1, kv2, kv3} is also linearly independent for every nonzero scalar k.
TRUE
26
If v1, . . . , vn are linearly dependent nonzero vectors, then at least one vector vk is a unique linear combination of v1, . . . , vk−1.
TRUE
27
The set of 2 × 2 matrices that contain exactly two 1's and two 0's is a linearly independent set in 𝑀22.
FALSE
28
The three polynomials (x − 1)(x + 2), x(x + 2), and x(x − 1) are linearly independent.
TRUE
29
The functions 𝑓1 and 𝑓2 are linearly dependent if there is a real number x such that k1𝑓1(x) + k2𝑓2(x) = 0 for some scalars k1 and k2.
FALSE
30
If 𝑉 = span{v1, . . . , vn}, then {v1, . . . , vn} is a basis for 𝑉
FALSE
31
Every linearly independent subset of a vector space 𝑉 is a basis for 𝑉.
FALSE
32
If {v1, v2, . . . , vn} is a basis for a vector space 𝑉, then every vector in 𝑉 can be expressed as a linear combination of v1, v2, . . . , vn
TRUE
33
The coordinate vector of a vector x in 𝑅n relative to the standard basis for 𝑅n is x
TRUE
34
Every basis of 𝑃4 contains at least one polynomial of degree 3 or less
FALSE
35
The zero vector space has dimension zero.
TRUE
36
There is a set of 17 linearly independent vectors in 𝑅17
TRUE
37
There is a set of 11 vectors that span 𝑅17
FALSE
38
Every linearly independent set of five vectors in 𝑅5 is a basis for 𝑅5
TRUE
39
Every set of five vectors that spans 𝑅5 is a basis for 𝑅5
TRUE
40
Every set of vectors that spans 𝑅n contains a basis for 𝑅n
TRUE
41
Every linearly independent set of vectors in 𝑅n is contained in some basis for 𝑅n
TRUE
42
There is a basis for 𝑀22 consisting of invertible matrices
TRUE
43
If 𝐴 has size n × n and 𝐼n, 𝐴, 𝐴², . . . , 𝐴^(n²) are distinct matrices, then {𝐼n, 𝐴, 𝐴², . . . , 𝐴^(n²)} is a linearly dependent set
TRUE
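The reason this is TRUE is a dimension count: 𝑀nn has dimension n², so any n² + 1 matrices in it are linearly dependent. A NumPy check for n = 2 (the matrix 𝐴 below is an arbitrary illustration), flattening each power into a vector of length 4:

```python
import numpy as np

n = 2
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# I, A, A^2, A^3, A^4: n^2 + 1 = 5 matrices, flattened into R^4.
powers = [np.linalg.matrix_power(A, k).ravel() for k in range(n * n + 1)]
M = np.stack(powers)                 # 5 vectors in a 4-dimensional space
print(np.linalg.matrix_rank(M) < 5)  # True: they must be dependent
```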
44
There are at least two distinct three-dimensional subspaces of 𝑃2
FALSE
45
There are only three distinct two-dimensional subspaces of 𝑃2.
FALSE
46
If 𝐵1 and 𝐵2 are bases for a vector space 𝑉, then there exists a transition matrix from 𝐵1 to 𝐵2
TRUE
47
Transition matrices are invertible
TRUE
48
If 𝐵 is a basis for a vector space 𝑅n, then 𝑃𝐵→𝐵 is the identity matrix
TRUE
49
If 𝑃𝐵1→𝐵2 is a diagonal matrix, then each vector in 𝐵2 is a scalar multiple of some vector in 𝐵1
TRUE
50
If each vector in 𝐵2 is a scalar multiple of some vector in 𝐵1, then 𝑃𝐵1→𝐵2 is a diagonal matrix
FALSE
51
If 𝐴 is a square matrix, then 𝐴 = 𝑃𝐵1→𝐵2 for some bases 𝐵1 and 𝐵2 for 𝑅n
FALSE
52
The span of v1, . . . , vn is the column space of the matrix whose column vectors are v1, . . . , vn.
TRUE
53
The column space of a matrix 𝐴 is the set of solutions of 𝐴x = b
FALSE
54
If 𝑅 is the reduced row echelon form of 𝐴, then those column vectors of 𝑅 that contain the leading 1's form a basis for the column space of 𝐴.
FALSE
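Row reduction preserves the row space but not the column space, which is why this is FALSE: the pivot columns of 𝑅 need not lie in the column space of 𝐴 (the correct basis uses the corresponding columns of 𝐴 itself). A NumPy sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 2.0]])
R = np.array([[1.0, 2.0],
              [0.0, 0.0]])   # reduced row echelon form of A
# col(A) is spanned by (1, 1); the pivot column of R is (1, 0).
aug = np.column_stack([A[:, 0], R[:, 0]])
print(np.linalg.matrix_rank(A))    # 1
print(np.linalg.matrix_rank(aug))  # 2: (1, 0) is not in col(A)
```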
55
The set of nonzero row vectors of a matrix 𝐴 is a basis for the row space of 𝐴.
FALSE
56
If 𝐴 and 𝐵 are n × n matrices that have the same row space, then 𝐴 and 𝐵 have the same column space
FALSE
57
If 𝐸 is an m × m elementary matrix and 𝐴 is an m × n matrix, then the null space of 𝐸𝐴 is the same as the null space of 𝐴
TRUE
58
If 𝐸 is an m × m elementary matrix and 𝐴 is an m × n matrix, then the row space of 𝐸𝐴 is the same as the row space of 𝐴
TRUE
59
If 𝐸 is an m × m elementary matrix and 𝐴 is an m × n matrix, then the column space of 𝐸𝐴 is the same as the column space of 𝐴
FALSE
60
The system 𝐴x = b is inconsistent if and only if b is not in the column space of 𝐴.
TRUE
61
There is an invertible matrix 𝐴 and a singular matrix 𝐵 such that the row spaces of 𝐴 and 𝐵 are the same
FALSE
62
Either the row vectors or the column vectors of a square matrix are linearly independent
FALSE
63
A matrix with linearly independent row vectors and linearly independent column vectors is square
TRUE
64
The nullity of a nonzero m × n matrix is at most m
FALSE
65
Adding one additional column to a matrix increases its rank by one
FALSE
66
The nullity of a square matrix with linearly dependent rows is at least one
TRUE
67
If 𝐴 is square and 𝐴x = b is inconsistent for some vector b, then the nullity of 𝐴 is zero
FALSE
68
If a matrix 𝐴 has more rows than columns, then the dimension of the row space is greater than the dimension of the column space
FALSE
69
If rank(𝐴ᵀ) = rank(𝐴), then 𝐴 is square
FALSE
70
There is no 3 × 3 matrix whose row space and null space are both lines in 3-space
TRUE
71
If 𝑉 is a subspace of 𝑅n and 𝑊 is a subspace of 𝑉, then 𝑊⊥ is a subspace of 𝑉⊥
FALSE
72
If 𝐴 is a square matrix and 𝐴x = 𝜆x for some nonzero scalar 𝜆, then x is an eigenvector of 𝐴.
FALSE
73
If πœ† is an eigenvalue of a matrix 𝐴, then the linear system (πœ†πΌ βˆ’ 𝐴)x = 0 has only the trivial solution
FALSE
74
If the characteristic polynomial of a matrix 𝐴 is p(𝜆) = 𝜆² + 1, then 𝐴 is invertible
TRUE
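TRUE because p(0) = 1 ≠ 0, so 0 is not an eigenvalue and 𝐴 is invertible. A matrix with exactly this characteristic polynomial is the 90° rotation of the plane (an illustrative choice), checked in NumPy:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # 90-degree rotation of R^2
# Characteristic polynomial coefficients: lambda^2 + 0*lambda + 1.
print(np.poly(A))
print(np.linalg.det(A))            # nonzero, so A is invertible
```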
75
If πœ† is an eigenvalue of a matrix 𝐴, then the eigenspace of 𝐴 corresponding to πœ† is the set of eigenvectors of 𝐴 corresponding to πœ†
FALSE
76
The eigenvalues of a matrix 𝐴 are the same as the eigenvalues of the reduced row echelon form of 𝐴
FALSE
77
If 0 is an eigenvalue of a matrix 𝐴, then the set of columns of 𝐴 is linearly independent
FALSE
78
An n × n matrix with fewer than n distinct eigenvalues is not diagonalizable
FALSE
79
An n × n matrix with fewer than n linearly independent eigenvectors is not diagonalizable
TRUE
80
If 𝐴 and 𝐵 are similar n × n matrices, then there exists an invertible n × n matrix 𝑃 such that 𝑃𝐴 = 𝐵𝑃
TRUE
81
If 𝐴 is diagonalizable, then there is a unique matrix 𝑃 such that 𝑃⁻¹𝐴𝑃 is diagonal
FALSE
82
If 𝐴 is diagonalizable and invertible, then 𝐴⁻¹ is diagonalizable
TRUE
83
If 𝐴 is diagonalizable, then 𝐴ᵀ is diagonalizable
TRUE
84
If there is a basis for 𝑅n consisting of eigenvectors of an n × n matrix 𝐴, then 𝐴 is diagonalizable
TRUE
85
If every eigenvalue of a matrix 𝐴 has algebraic multiplicity 1, then 𝐴 is diagonalizable
TRUE
86
If 0 is an eigenvalue of a matrix 𝐴, then 𝐴² is singular
TRUE
87
The matrix [1 0; 0 1; 0 0] is orthogonal
FALSE
88
The matrix [1 −2; 2 1] is orthogonal
FALSE
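FALSE because orthogonality requires orthonormal columns, not merely orthogonal ones. Here the columns are orthogonal but have length √5; a NumPy check, including the normalized repair:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [2.0,  1.0]])
print(A.T @ A)                          # 5*I, not I: A is not orthogonal
Q = A / np.sqrt(5.0)                    # normalize the columns
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal
```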
89
An m × n matrix 𝐴 is orthogonal if 𝐴ᵀ𝐴 = 𝐼
FALSE
90
A square matrix whose columns form an orthogonal set is orthogonal
FALSE
91
Every orthogonal matrix is invertible
TRUE
92
If 𝐴 is an orthogonal matrix, then 𝐴² is orthogonal and (det 𝐴)² = 1
TRUE
93
Every eigenvalue of an orthogonal matrix has absolute value 1
TRUE
94
If 𝐴 is a square matrix and ‖𝐴u‖ = 1 for all unit vectors u, then 𝐴 is orthogonal
TRUE
95
If 𝐴 is a square matrix, then 𝐴𝐴ᵀ and 𝐴ᵀ𝐴 are orthogonally diagonalizable
TRUE
96
If v1 and v2 are eigenvectors from distinct eigenspaces of a symmetric matrix with real entries, then ‖v1 + v2‖² = ‖v1‖² + ‖v2‖²
TRUE
97
Every orthogonal matrix is orthogonally diagonalizable
FALSE
98
If 𝐴 is both invertible and orthogonally diagonalizable, then 𝐴⁻¹ is orthogonally diagonalizable
TRUE
99
Every eigenvalue of an orthogonal matrix has absolute value 1.
TRUE
100
If 𝐴 is an n × n orthogonally diagonalizable matrix, then there exists an orthonormal basis for 𝑅n consisting of eigenvectors of 𝐴.
TRUE
101
If 𝐴 is orthogonally diagonalizable, then 𝐴 has real eigenvalues.
TRUE
102
If all eigenvalues of a symmetric matrix 𝐴 are positive, then 𝐴 is positive definite
TRUE
103
x1² − x2² + x3² + 4x1x2x3 is a quadratic form
FALSE
104
(x1 − 3x2)² is a quadratic form
TRUE
105
A positive definite matrix is invertible
TRUE
106
A symmetric matrix is either positive definite, negative definite, or indefinite
FALSE
107
If 𝐴 is positive definite, then −𝐴 is negative definite
TRUE
108
x · x is a quadratic form for all x in 𝑅n
TRUE
109
If 𝐴 is symmetric and invertible, and if xᵀ𝐴x is a positive definite quadratic form, then xᵀ𝐴⁻¹x is also a positive definite quadratic form
TRUE
110
If 𝐴 is symmetric and has only positive eigenvalues, then xᵀ𝐴x is a positive definite quadratic form
TRUE
111
If 𝐴 is a 2 × 2 symmetric matrix with positive entries and det(𝐴) > 0, then 𝐴 is positive definite
TRUE
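TRUE by Sylvester's criterion: for a 2 × 2 symmetric matrix the leading principal minors are the (1,1) entry and det(𝐴), and positive entries make the first one positive. A NumPy check on an illustrative matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                # positive entries, det = 5 > 0
print(np.linalg.det(A))
print(np.all(np.linalg.eigvalsh(A) > 0))  # True: A is positive definite
```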
112
If 𝐴 is symmetric, and if the quadratic form xᵀ𝐴x has no cross-product terms, then 𝐴 must be a diagonal matrix
TRUE
113
If xᵀ𝐴x is a positive definite quadratic form in two variables and c ≠ 0, then the graph of the equation xᵀ𝐴x = c is an ellipse
FALSE
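FALSE because c ≠ 0 allows c < 0, and a positive definite form never takes negative values, so the graph is empty rather than an ellipse. A NumPy sketch with the form x1² + x2², sampled at random points:

```python
import numpy as np

A = np.eye(2)                              # the form x1^2 + x2^2
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 2))         # sample points in R^2
vals = np.einsum('ij,jk,ik->i', X, A, X)   # x^T A x at each point
print(vals.min() >= 0.0)                   # True: the form never equals c = -1
```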