Week 4 (Divide and Conquer and Combine) Flashcards

(10 cards)

1
Q

What is divide and conquer?

A

Divide and conquer breaks a problem into smaller subproblems, solves them recursively, and then combines the sub-solutions into a solution for the original problem.

2
Q

What is the divide and conquer algorithm for merge sort?

A

Given a list of n numbers, we want to sort them in increasing order:

  • Split the list into two halves and sort each half recursively
  • Merge: while both sorted halves are non-empty, append the smaller front value to the output
  • Advance the pointer in the half that supplied the smaller value
  • When one half is empty, append the remainder of the other half

T(n) = T(n/2) + T(n/2) + T(combining)
T(n) = 2.T(n/2) + O(n) = O(n log n)
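
As an illustration (not part of the original card), a minimal merge sort sketch in Python; the function name is my own:

def merge_sort(xs):
    # Base case: lists of size 0 or 1 are already sorted
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # T(n/2)
    right = merge_sort(xs[mid:])   # T(n/2)
    # Combine step: merge the two sorted halves in O(n)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one of these two is empty
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]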

3
Q

How does breaking down merge sort into more subproblems change the time complexity?

A

There is no improvement in the time complexity: the saving from smaller subproblems is cancelled out by the extra combination steps, so the total remains O(n log n)

4
Q

What is unrolling the recurrence?

A
  • Open up the recurrence step by step

We write the recurrence for some constant c: T(n) <= 2.T(n/2) + cn with a base case T(2) <= c.

(Unroll until the subproblem size reaches the base case)

1) Analyse the first few levels:
- Level 0 takes cn time and delegates two recursive calls of size n/2
- Level 1 takes c(n/2) + c(n/2) = cn time and delegates 4 calls of size n/4
2) Identify a pattern:
- The number of subproblems doubles as the size halves
- Level j has 2^j subproblems of size n/2^j
- Level j takes 2^j . c.(n/2^j) = cn time
3) Sum over all levels of recursion:
- n/2^t = 2 gives t = (log2 n) - 1 => O(log2 n) levels

=> Total running time O(n log2 n)
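
A quick numeric sanity check of this bound (my own illustration, assuming c = 1 and T(2) = 1):

import math

def T(n, c=1):
    # Recurrence T(n) = 2*T(n/2) + c*n with base case T(2) = c
    if n <= 2:
        return c
    return 2 * T(n // 2, c) + c * n

# For powers of two, T(n) stays within c * n * log2(n)
for n in [2 ** k for k in range(2, 11)]:
    print(n, T(n), n * math.log2(n))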

5
Q

What is verifying the recurrence by substitution?

A
  • Guess a bound and check by induction that it is satisfied; the guess might give a valid bound that isn’t tight

Suppose T(n) <= cn.log2 n (logs are base 2 throughout)
1) Base case n = 2: T(2) <= c <= 2c = cn.log n
2) Inductive hypothesis: for each 2 <= m < n we have T(m) <= cm.log m
3) Inductive step:
T(n) <= 2T(n/2) + cn
<= 2c(n/2).log(n/2) + cn
= cn(log n - 1) + cn
= cn.log n

6
Q

What is the method of unrolling the recurrence when we divide merge sort into 3 subproblems of size n/2?

A

For T(n) = 3.T(n/2) + cn, we have log2 n levels, and level j needs cn.(3/2)^j time

T(n) <= ∑ cn.(3/2)^j
= cn.∑ (3/2)^j
= cn . (((3/2)^log n - 1) / (3/2 - 1))
<= cn . (((3/2)^log n) / (3/2 - 1))
= 2cn.(3/2)^log n
= 2cn.n^(log(3/2))
= 2cn.n^(log 3 - 1)
= 2c.n^(log 3)
= O(n^(log2 3))
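
A quick numeric illustration (my own, assuming c = 1 and T(2) = 1) that this recurrence grows like n^(log2 3) ≈ n^1.585:

import math

def T(n, c=1):
    # Recurrence T(n) = 3*T(n/2) + c*n with base case T(2) = c
    if n <= 2:
        return c
    return 3 * T(n // 2, c) + c * n

# The ratio T(n) / n^(log2 3) stays bounded (by 2c, per the derivation above)
for n in [2 ** k for k in range(2, 11)]:
    print(n, T(n), round(T(n) / n ** math.log2(3), 2))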

7
Q

What is the general solution when unrolling the recurrence T(n) = q.T(n/2) + cn for q > 2, where q is the number of branches?

A

T(n) = O(n^(log2 q))

8
Q

How can we use divide and conquer to count inversions?

A

Split the list of numbers into two halves and use a merge-sort-style algorithm: recursively sort and count the inversions within each half, then count the inversions between the halves during the merge step (whenever an element from the right half is output, it forms an inversion with every element still remaining in the left half).

T(n) <= 2.T(n/2) + O(n) = O(n log n)
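
A sketch of sort-and-count in Python (my own illustration; the original card gives no code):

def sort_and_count(xs):
    # Returns (sorted list, number of inversions in xs)
    if len(xs) <= 1:
        return xs, 0
    mid = len(xs) // 2
    left, inv_left = sort_and_count(xs[:mid])
    right, inv_right = sort_and_count(xs[mid:])
    merged, i, j, cross = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            # right[j] is smaller than everything remaining in left
            cross += len(left) - i
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv_left + inv_right + cross

print(sort_and_count([2, 4, 1, 3, 5]))  # ([1, 2, 3, 4, 5], 3)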

9
Q

How can we use divide and conquer for faster integer squaring?

A

Let T(n) be the time to square an n-bit number x

Split x = 2^(n/2).x0 + x1 (x0 is the high half, x1 the low half)
x^2 = 2^n.x0^2 + 2^(n/2).2.x0.x1 + x1^2

To obtain the cross term using only squarings, we use:
(x0 + x1)^2 - x0^2 - x1^2 = 2.x0.x1

So three recursive squarings of roughly n/2-bit numbers suffice.
Total running time: T(n) <= 3.T(n/2) + O(n) = O(n^(log2 3))
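
A sketch in Python (my own illustration; the cutoff and function name are assumptions, not from the card):

def square(x):
    n = x.bit_length()
    if n <= 32:                     # small enough: square directly
        return x * x
    half = n // 2
    x0, x1 = x >> half, x & ((1 << half) - 1)    # x = 2^half * x0 + x1
    hi = square(x0)                              # x0^2
    lo = square(x1)                              # x1^2
    cross = square(x0 + x1) - hi - lo            # = 2 * x0 * x1
    return (hi << (2 * half)) + (cross << half) + lo

print(square(12345678901234567890) == 12345678901234567890 ** 2)  # True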

10
Q

How can we use divide and conquer for faster integer multiplication?

A

Let T(n) be the time to multiply two n-bit numbers x and y

Split x = 2^(n/2).x1 + x0 and y = 2^(n/2).y1 + y0
x.y = x0y0 + 2^(n/2).(x1y0 + x0y1) + 2^n.x1y1

This has 4 recursive multiplications, which gives T(n) = 4.T(n/2) + O(n) = O(n^2), no improvement on the naïve algorithm. We can reduce the number of multiplications to 3 by using:

x0y1 + x1y0 = (x1 + x0)(y1 + y0) - x1y1 - x0y0

Total running time: T(n) <= 3.T(n/2) + O(n) = O(n^(log2 3))
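
A sketch of this scheme (Karatsuba multiplication) in Python; my own illustration, with an assumed cutoff for the base case:

def multiply(x, y):
    n = max(x.bit_length(), y.bit_length())
    if n <= 32:                     # small enough: multiply directly
        return x * y
    half = n // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)    # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & ((1 << half) - 1)    # y = 2^half * y1 + y0
    hi = multiply(x1, y1)                        # x1*y1
    lo = multiply(x0, y0)                        # x0*y0
    mid = multiply(x1 + x0, y1 + y0) - hi - lo   # = x1*y0 + x0*y1
    return (hi << (2 * half)) + (mid << half) + lo

print(multiply(123456789123456789, 987654321987654321)
      == 123456789123456789 * 987654321987654321)  # True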
