Divide and Conquer Flashcards
(54 cards)
Q: If a recursive algorithm divides its input size by 2 at each step, follows one path, and does constant work, what’s the runtime?
A: O(log n) - Example: Binary Search
Master Theorem:
T(n) = aT(n/b) + O(n^d)
T(n) = 1T(n/2) + O(n^0)  # each level does O(1) work
b^d (2^0 = 1) = a (1), therefore O(n^d log n) using MT
Simplifies to O(log n), since d = 0
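The O(log n) single-path recursion above can be sketched with a standard binary search (a minimal illustration; the function name is mine):

```python
def binary_search(arr, target):
    """Return an index of target in sorted arr, or -1 if absent.

    Each step halves the search range and does O(1) work,
    matching T(n) = T(n/2) + O(1) = O(log n).
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```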
Q: If a recursive algorithm looks at BOTH halves of the input and does constant work per level, what’s the runtime?
A: O(n) - Example: Binary Tree Traversal
Master Theorem:
T(n) = aT(n/b) + O(n^d)
T(n) = 2T(n/2) + O(n^0)
- subproblems: 2
- scaling: 2
- work at each level: O(1)
b^d (1) < a (2), therefore O(n^{log_b a}) using MT
Plug in: O(n^{log_2 2})
Simplifies to O(n^1) = O(n)
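The binary tree traversal example above can be sketched as follows (a minimal illustration; the class and function names are mine):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def count_nodes(root):
    """Visit BOTH subtrees with O(1) work per call.

    For a balanced tree this is T(n) = 2T(n/2) + O(1) = O(n):
    every node is visited exactly once.
    """
    if root is None:
        return 0
    return 1 + count_nodes(root.left) + count_nodes(root.right)
```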
Q: If a recursive algorithm divides input by 4, looks at all quarters, and does O(n^2) work per level, what’s the runtime?
A: O(n^2)
- T(n) = 4T(n/4) + O(n^2); a = 4 < b^d = 4^2 = 16, so O(n^d) = O(n^2)
Q: If a recursive algorithm divides input by 2, processes both halves, and does O(n*log(n)) work per level, what’s the runtime?
A: O(n*log^2(n))
- T(n) = 2T(n/2) + O(n log n); a = b^1 but the work carries an extra log factor, so each of the log n levels does O(n log n) work
Q: If a recursive algorithm divides input by 3, processes all thirds, and does O(sqrt(n)) work per level, what’s the runtime?
A: O(n^{log_3(3)}) = O(n)
- T(n) = 3T(n/3) + O(n^{1/2}); a = 3 > b^d = 3^{1/2}, so O(n^{log_b a})
Q: If a recursive algorithm divides input by 2, processes both halves, and does O(n^2) work per level, what’s the runtime?
A: O(n^2)
- T(n) = 2T(n/2) + O(n^2)
- a=2, b=2, d=2, since a<b^d, we get O(n^d)
Q: If a recursive algorithm divides input into 3 equal parts, processes all parts, and does O(1) work per level, what’s the runtime?
A: O(n^{log_3(3)}) = O(n)
Q: If a recursive algorithm divides input by 2, looks at BOTH halves, and does O(n) work per level, what’s the runtime?
A: O(n log n) - Example: Mergesort
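The Mergesort example above can be sketched as follows (a minimal illustration; the function name is mine):

```python
def merge_sort(arr):
    """Sort arr: T(n) = 2T(n/2) + O(n) = O(n log n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # T(n/2)
    right = merge_sort(arr[mid:])   # T(n/2)
    # O(n) merge of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```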
Q: When you see T(n) = 1T(n/2) + O(n^0), what runtime should you expect?
A: O(log n)
Why? Dividing by 2 each time with constant work per level
Master theorem:
b^d = 2^0 = 1 = a
if b^d = a, T(n) = O(n^d log n)
(simplifies to O(log n) since d is zero)
Q: When you see T(n) = 2T(n/2) + O(n), what runtime should you expect?
A: O(n log n)
Why? Linear work at each level, with log n levels
Master theorem:
b^d = 2 = a, so
T(n) = O(n log n)
Q: When you see T(n) = 2T(n/2) + O(1), what runtime should you expect?
A: O(n)
Why? Even though we divide by 2, visiting both halves means every element is visited once
Master’s Theorem:
b^d = 2^0 = 1
2 (a) > 1 (b^d)
if a > b^d: O(n^{log_b a})
T(n) = O(n^{log_2 2})
= O(n^1)
= O(n)
Q: If a recursive algorithm divides input into 3 parts and does constant work, what’s the runtime?
A: O(n)
T(n) = 3T(n/3) + O(1)
a > b^d
O(n^(log_b a)) = O(n^(log_3 3)) = O(n)
Given O(3^(log_2 n)), rewrite it as a power of n
(Hint: just need to swap 2 things)
O(n^(log_2 3))
Idea: switch 3 and n, using the identity a^{log_b n} = n^{log_b a}
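The exponent-swap identity above can be checked numerically (a quick sanity check; the value of n is arbitrary):

```python
import math

# Verify the identity a^(log_b n) == n^(log_b a) for a = 3, b = 2
n = 1024.0
lhs = 3 ** math.log2(n)   # 3^(log_2 n)
rhs = n ** math.log2(3)   # n^(log_2 3)
assert math.isclose(lhs, rhs)
```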
[Recurrence relation]
For substitution-based recurrences of form T(n) = aT(n-b) + c, what does it typically result in?
O(a^{n/b}), which is O(a^n) when b = 1
What is runtime of:
T(n) = 2T(n-1) + c
O(2ⁿ)
What is runtime of:
T(n) = 3T(n-1) + c
O(3^n)
What is runtime of:
T(n) = 2T(n-2) + c
O(2^{n/2})
We subtract 2 each time (not 1)
We need to reach the base case T(0) or T(1)
So if we subtract 2k, we need: n-2k = 0
Solving for k: k = n/2
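The 2^{n/2} growth can be observed directly by counting the calls such a recurrence makes (an illustrative sketch; the function name is mine):

```python
def calls(n):
    """Count the calls made by a recurrence of form T(n) = 2T(n-2) + c.

    The call tree has depth n/2 and branching factor 2,
    so the count grows like O(2^(n/2)).
    """
    if n <= 1:  # base case T(0) or T(1)
        return 1
    return 1 + calls(n - 2) + calls(n - 2)
```

For even n this counts exactly 2^{n/2 + 1} - 1 calls, e.g. 63 calls for n = 10.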
[Recurrence Relation]
What is runtime of below with no merge work:
T(n) = T(n/2)
O(1)
T(n) = T(n/2)
= O(n^{log_2 1})
= O(n^0)
= O(1)
[FastSelect] Explain why selecting a pivot v in list A between the 25th and 75th percentile ensures the sublists A_L and A_R each have size at most 3/4 of A
- A number at the 75th percentile has 75% of elements less than or equal to it and 25% greater, so A_L has at most 75% of the elements.
- A number at the 25th percentile has 25% of elements less than or equal to it and 75% greater, so A_R has at most 75% of the elements.
- Therefore, no matter where v falls in this range, neither A_L nor A_R can be larger than 75% of the original array.
[FastSelect]
Proof of good pivot
pivot p is good if?
|A < p| <= (3/4)n
|A > p| <= (3/4)n
[FastSelect]
Runtime analysis
Breaking A in n/5 groups: ?
Sorting each 5-element group: ?
Call FastSelect(S, n/10) to get pivot: ?
Partitioning A and recursing on one side (k <= |A<p|, or k > |A<p| + |A=p|, or return p): ?
Breaking A in n/5 groups: O(n)
Sorting each 5-element group: O(n) (O(1) * n groups)
Call FastSelect(S, n/10) to get pivot: T(n/5)
Partitioning A and recursing on one side: O(n) to partition, plus T(3n/4) if good pivot
T(n) = T(3n/4) + T(n/5) + O(n)
= O(n)
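The FastSelect steps above can be sketched as a median-of-medians selection (a minimal sketch; function names and the list-comprehension partition are my own choices):

```python
def fast_select(A, k):
    """Return the k-th smallest element of A (k is 1-indexed).

    Median-of-medians pivoting guarantees a good pivot, giving
    T(n) = T(3n/4) + T(n/5) + O(n) = O(n).
    """
    if len(A) <= 5:
        return sorted(A)[k - 1]
    # Break A into n/5 groups of 5 and take each group's median: O(n)
    medians = [sorted(A[i:i + 5])[len(A[i:i + 5]) // 2]
               for i in range(0, len(A), 5)]
    # Recurse on the n/5 medians to pick the pivot: T(n/5)
    pivot = fast_select(medians, (len(medians) + 1) // 2)
    # Partition A around the pivot: O(n)
    less = [x for x in A if x < pivot]
    equal = [x for x in A if x == pivot]
    greater = [x for x in A if x > pivot]
    # Recurse on the side containing the k-th element: T(3n/4) worst case
    if k <= len(less):
        return fast_select(less, k)
    elif k <= len(less) + len(equal):
        return pivot
    else:
        return fast_select(greater, k - len(less) - len(equal))
```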
[Masters Theorem]
What does a^{log_b n} (equivalently n^{log_b a}) represent in a D&C problem of size n?
The width of the last level of the recursion tree, i.e. the number of size-1 base cases: a^{log_b n}
Master’s Theorem formula:
if a > b^d, what is T(n)?
O(n^{log_b a})
Master’s Theorem formula:
if a = b^d, what is T(n)?
O(n^d log(n))
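The three Master Theorem cases on these cards can be collected into one small helper (a sketch; the function name and output format are my own):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by the Master Theorem.

    Returns a string describing the asymptotic runtime.
    """
    if a > b ** d:
        # Leaf-dominated: O(n^(log_b a))
        return f"O(n^{math.log(a, b):.4g})"
    elif a == b ** d:
        # Balanced: every level does equal work
        return f"O(n^{d} log n)"
    else:
        # Root-dominated: top level's work wins
        return f"O(n^{d})"
```

For example, Mergesort's T(n) = 2T(n/2) + O(n) falls in the balanced case.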