Algorithms Flashcards

Week 2.7, 2.8, 2.9 (78 cards)

1
Q

sum of arithmetic series

A

(n/2)(2a1 + (n - 1)d)

2
Q

sum of geometric series

A

(a1(1 - r^n))/(1 - r), for r != 1

3
Q

number of permutations

A

n!

4
Q

number of k-combinations

A

n!/((n - k)!k!)

5
Q

log inequality

A

log2 n <= n

6
Q

pseudocode I/O

A

input/output operations are given as brief descriptions

7
Q

pseudocode for arrays of fixed size

A

A[0 … n-1]

8
Q

pseudocode !=

A

≠

9
Q

pseudocode <=

A

≤

10
Q

pseudocode assignment

A

i <- 1

11
Q

pseudocode if condition

A

if i < n

else

12
Q

pseudocode for loops

A

for i <- 1 to n do

13
Q

pseudocode for while loops

A

i <- 1
while i < n do

i <- i + 1

14
Q

resource constraints of algorithms

A
  • space
  • time
15
Q

algorithm cost of solution

A

amount of resources a solution consumes

16
Q

empirical analysis of algorithms

A
  • using timing functions such as time() to measure the actual running time
  • actual running time of an algorithm depends on computer used
  • data collected gives an idea how running time grows when input size grows
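
A minimal Python sketch of such a measurement (illustrative; `measure` is a made-up helper using `time.perf_counter`):

```python
import time

def measure(func, n):
    """Return the wall-clock time func takes on an input of size n."""
    data = list(range(n, 0, -1))      # a reversed list as a simple test input
    start = time.perf_counter()
    func(data)
    return time.perf_counter() - start

# collect timings for growing input sizes to see how running time grows
timings = {n: measure(sorted, n) for n in (1000, 2000, 4000)}
```

The absolute numbers depend on the machine, as the card notes; only the growth trend across input sizes is meaningful.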
17
Q

theoretical analysis of algorithms

A
  • analytical approach aimed at estimating the time complexity of an algorithm based on its pseudocode
  • the time complexity (or running time) of an algorithm in the RAM model is the total number of basic operations for each possible problem instance, as a function of the size of the instance
  • the worst-case instance is considered when analysing time complexity
18
Q

while loop with n/2 iterations

A

i <- 1
while i <= n do

i <- i + 2

19
Q

pseudocode nested loops

A

for i <- 1 to n do
for j <- 1 to n do

20
Q

pseudocode dependent nested loops

A

for i <- 1 to n do
for j <- 1 to i do

21
Q

iterations of a dependent nested loop

A

n(n + 1)/2
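
A quick Python check of this formula (a sketch that simply counts the inner-loop iterations of the dependent nested loop):

```python
def dependent_loop_iterations(n):
    """Count inner-loop iterations of: for i <- 1 to n do, for j <- 1 to i do."""
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

# matches n(n + 1)/2, e.g. n = 10 gives 55
```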

22
Q

2 basic principles of complexity analysis

A
  1. consider the dominating term - low-order terms are ignored
  2. ignore multiplicative constants
23
Q

log2 n loop

A

i <- 1
while i < n do

i <- i * 2

OR

i <- n
while i > 1 do

i <- i/2
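
The iteration count of the doubling variant can be confirmed with a small Python sketch (illustrative):

```python
def doubling_loop_iterations(n):
    """Count iterations of: i <- 1; while i < n do i <- i * 2."""
    i, count = 1, 0
    while i < n:
        i *= 2
        count += 1
    return count

# for n = 1024 the loop runs 10 times, i.e. log2(1024)
```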

24
Q

describe big-O

A
  • worst-case efficiency
  • worst-case instances of size n
  • asymptotic upper bound for a given function
25
Q

how to prove that t(n) is O(f(n))

A

to prove that t(n) is O(f(n)) we need to specify two positive constants c and n0 such that t(n) <= cf(n) for all n >= n0

26
Q

describe big omega

A

asymptotic lower bound for a given function

27
Q

how to prove t(n) is omega(f(n))

A

a function t(n) is said to be omega(f(n)) if there exist positive constants c and n0 such that t(n) >= cf(n) for all n >= n0

28
Q

describe big theta

A

used to express that two functions t(n) & f(n) are of the same order, or asymptotically equal

29
Q

how to show that t(n) is theta(f(n))

A

a function t(n) is said to be in theta(f(n)) if there exist positive constants c', c'' such that 0 <= c'f(n) <= t(n) <= c''f(n) for all n >= n0

30
Q

how to use limits to show t(n) is O(f(n))

A

lim(n -> infinity) t(n)/f(n) = 0

31
Q

what can we deduce from lim(n -> infinity) t(n)/f(n) = c

A

where c > 0 is a finite constant, then t(n) is theta(f(n)) - it is also O(f(n)) and omega(f(n))

32
Q

what can we deduce from lim(n -> infinity) t(n)/f(n) = infinity

A

t(n) is omega(f(n))

33
Q

selection sort steps

A
  • search the array for the smallest element & swap it with the first one
  • search for the second smallest element & swap it with the second one
  • repeat these steps until the whole array is sorted

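The steps above can be sketched in Python (an illustrative implementation, not the course's reference code):

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the smallest remaining element."""
    n = len(a)
    for i in range(n - 1):
        # find the index of the smallest element in the unsorted part a[i:]
        smallest = i
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j
        # swap it into position i, extending the sorted part by one
        a[i], a[smallest] = a[smallest], a[i]
    return a

# selection_sort([29, 10, 14, 37, 13]) → [10, 13, 14, 29, 37]
```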
34
Q

key characteristics of a selection sort

A
  • the array at any moment is divided into 2 parts: sorted & unsorted
  • all elements in the sorted part are <= the elements of the unsorted part
  • in every pass the sorted sub-array extends by 1
  • brute-force

35
Q

big theta of selection sort

A

n^2

36
Q

big theta of bubble sort

A

n^2

37
Q

bubble sort step through

A
  • each pass moves the largest remaining element to its correct position
  • repeat for the remaining elements in the list
  • the sorted sub-array grows from the back of the list

38
Q

bubble sort pseudocode

A

for i <- 0 to n - 2 do
for j <- 0 to n - 2 - i do
if A[j] > A[j + 1] then
swap A[j] and A[j + 1]

39
Q

insertion sort pseudocode

A

for i <- 1 to n - 1 do
v <- A[i]
j <- i - 1
while j >= 0 and A[j] > v do
A[j + 1] <- A[j]
j <- j - 1
A[j + 1] <- v

40
Q

big theta of insertion sort

A

n^2

41
Q

describe insertion sort

A
  • builds a sorted sub-array starting from the beginning of the input array
  • iterates through the remaining unsorted elements, inserting each into its correct position within the sorted sub-array
  • the first element is considered sorted, so it starts from the second element

42
Q

what type of algorithm is sequential search

A

brute-force

43
Q

strength of sequential search

A

simple & straightforward

44
Q

weakness of sequential search

A

not efficient

45
Q

worst case of sequential search

A

the element is in the last index of the array, or not present at all

46
Q

big O of sequential search

A

n

47
Q

what type of algorithm is binary search

A

divide & conquer

48
Q

describe the steps of binary search in terms of divide & conquer

A
  1. divide - if the stopping criterion is not reached, then a problem's instance is divided into smaller instances
  2. recur - the smaller instances are solved recursively
  3. conquer - if necessary, the solutions obtained for smaller instances are combined in order to get a solution to the original problem

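As a concrete sketch, an iterative Python rendering of these steps (assuming the input list is sorted; -1 is used here as the "not found" result):

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if not present."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # divide: pick the middle element
        if a[mid] == target:
            return mid            # conquer: found it
        elif a[mid] < target:
            lo = mid + 1          # recur on the right half
        else:
            hi = mid - 1          # recur on the left half
    return -1

# binary_search([1, 3, 5, 7, 9], 7) → 3
```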
49
Q

prerequisite of the array for binary search

A

array must be sorted

50
Q

binary search pseudocode

A

BinarySearch(A, v, lo, hi)
if lo > hi then return -1
mid <- (lo + hi)/2
if A[mid] = v then return mid
else if v < A[mid] then return BinarySearch(A, v, lo, mid - 1)
else return BinarySearch(A, v, mid + 1, hi)

51
Q

big O of binary search

A

log n

52
Q

what are the 2 overheads associated with recursive algorithms

A
  1. time
  2. space

53
Q

explain the time overhead of recursive algorithms

A

a recursive call incurs time to create an activation record and to maintain the stack of activation records. when the call finishes, it takes time to remove the activation record from the stack and resume execution of the suspended program

54
Q

explain the space overhead of recursive algorithms

A

additional memory is needed to store a return address (the place in the algorithm from where the procedure was called) and to store the data used by the procedure

55
Q

define recursive

A

a repetitive process in which an algorithm calls itself

56
Q

2 features of recursive algorithm

A
  1. base case
  2. recursive part

57
Q

box tracing

A

a systematic way to trace a recursive algorithm by drawing a box for each recursive call, recording its arguments, local values and returned result

58
Q

steps to find big theta of recursive algorithms

A
  1. find the main operation
  2. set up a recurrence relation with the initial condition - e.g. let M(n) denote the number of multiplications
  3. solve the recurrence relation to find a general formula for 0 <= i <= n
  4. determine correctness by inductive proof, substituting i = n to find big theta

59
Q

find big theta of recursive binary search

A

T(n) = T(n/2) + 1 for n > 1, with T(1) = 1; backward substitution gives T(n) = log2 n + 1, so big theta is log2 n

60
Q

pseudocode of the 3 recursive algorithms to calculate 2^n

A

A(n): if n = 0 return 1, else return A(n - 1) + A(n - 1)
B(n): if n = 0 return 1, else return 2 x B(n - 1)
C(n): if n = 0 return 1, else t <- C(n div 2); if n is even return t x t, else return 2 x t x t

61
Q

what are the three different big theta's of the algorithms used to calculate 2^n

A

A - 2^n
B - n
C - log n

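Three such algorithms can be sketched in Python (illustrative implementations; the names `pow2_a/b/c` are made up, and the docstrings state the Θ class each one exhibits):

```python
def pow2_a(n):
    """Θ(2^n): two recursive calls per level."""
    if n == 0:
        return 1
    return pow2_a(n - 1) + pow2_a(n - 1)

def pow2_b(n):
    """Θ(n): one recursive call per level."""
    if n == 0:
        return 1
    return 2 * pow2_b(n - 1)

def pow2_c(n):
    """Θ(log n): halve the exponent, then square the result."""
    if n == 0:
        return 1
    half = pow2_c(n // 2)
    return half * half * (2 if n % 2 else 1)

# all three compute 2^n, e.g. 2^10 = 1024
```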
62
Q

big theta of towers of hanoi

A

2^n

63
Q

big theta of recursive binary search

A

log2 n

64
Q

big theta of fibonacci numbers recursive

A

phi^n, where phi = (1 + sqrt(5))/2 ≈ 1.618 is the golden ratio

65
Q

big theta of an iterative fibonacci number sequence

A

n

66
Q

pseudocode for euclid's algorithm

A

Euclid(m, n)
while n != 0 do
r <- m mod n
m <- n
n <- r
return m

67
Q

how to determine if a base case is appropriate

A

the problem gets smaller starting from the second recursive call and the base case is always reached

68
Q

pseudocode for k-combinations

A

C(n, k)
if k = 0 or k = n then return 1
else return C(n - 1, k - 1) + C(n - 1, k)

69
Q

box tracing for k-combinations

A

70
Q

big theta of merge sort

A

n log n

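A compact Python sketch of merge sort (illustrative), showing the divide/recur/conquer structure behind the n log n bound:

```python
def merge_sort(a):
    """Return a sorted copy of list a using merge sort."""
    if len(a) <= 1:                  # base case
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # recur on each half
    right = merge_sort(a[mid:])
    # conquer: merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

# merge_sort([5, 2, 4, 6, 1, 3]) → [1, 2, 3, 4, 5, 6]
```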
71
Q

big theta of quick sort

A

n^2 (worst case; average case is n log n)

72
Q

proof of merge sort big theta

A

T(n) = 2T(n/2) + n; by the master theorem a = 2, b = 2, d = 1, so a = b^d and T(n) is theta(n log n)

73
Q

describe the master theorem

A

let T(n) be the recurrence relation for the number of basic operations done by an algorithm. for a 'standard recurrence' T(n) we assume:
  • in the base case, T(n) is constant
  • for larger n, T(n) = aT(n/b) + f(n), where:
  • a >= 1 is the number of subproblems to be solved
  • b > 1 is the input shrinkage factor
  • f(n) is the number of basic operations for splitting into subproblems

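The case analysis of the theorem can be encoded in a small Python helper (a sketch; it assumes f(n) = Θ(n^d), with the three outcomes matching cards 76-78):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Θ(n^d) according to the master theorem."""
    if a < b ** d:
        return f"Θ(n^{d})"
    elif a == b ** d:
        return f"Θ(n^{d} log n)"
    else:
        return f"Θ(n^{math.log(a, b):.2f})"

# binary search: master_theorem(1, 2, 0) → "Θ(n^0 log n)", i.e. Θ(log n)
# merge sort:   master_theorem(2, 2, 1) → "Θ(n^1 log n)", i.e. Θ(n log n)
```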
74
Q

binary search master theorem

A

a = 1, b = 2, d = 0, so a = b^d
T(n) = T(n/2) + 1 = theta(log n)

75
Q

merge sort master theorem

A

a = 2, b = 2, d = 1, so a = b^d
T(n) = 2T(n/2) + n = theta(n log n)

76
Q

if a < b^d then the master theorem is

A

theta(n^d)

77
Q

if a = b^d then the master theorem is

A

theta(n^d x log n)

78
Q

if a > b^d then the master theorem is

A

theta(n^(logb(a)))