# Lecture 18-19 - Divide-And-Conquer Flashcards

Define the divide-and-conquer paradigm.

Recursive in structure:

- Divide the problem into sub-problems that are similar to the original but smaller in size.
- Conquer the sub-problems by solving them recursively. If they are small enough, just solve them in a straightforward manner.
- Combine the solutions of the sub-problems into a solution to the original problem.

Describe in words how merge sort is accomplished using divide and conquer.

Sorting problem: sort a sequence of n elements into non-decreasing order.

- Divide: divide the n-element sequence to be sorted into two subsequences of n/2 elements each.
- Conquer: sort the two subsequences recursively using merge sort.
- Combine: merge the two sorted subsequences to produce the sorted answer.
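The three steps above can be sketched in Python (an illustrative implementation, not code from the lecture):

```python
def merge_sort(a):
    # Conquer (base case): a sequence of length <= 1 is already sorted.
    if len(a) <= 1:
        return a
    # Divide: split into two subsequences of about n/2 elements each.
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Combine: merge the two sorted subsequences.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one of these is empty; the other is
    out.extend(right[j:])  # the sorted tail that remains
    return out
```

For example, `merge_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`.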

Prove the correctness of Merge.

    Merge(A, p, q, r)
     1. n1 ← q – p + 1
     2. n2 ← r – q
     3. for i ← 1 to n1
     4.     L[i] ← A[p + i – 1]
     5. for j ← 1 to n2
     6.     R[j] ← A[q + j]
     7. L[n1 + 1] ← ∞
     8. R[n2 + 1] ← ∞
     9. i ← 1
    10. j ← 1
    11. for k ← p to r
    12.     if L[i] ≤ R[j]
    13.         then A[k] ← L[i]
    14.              i ← i + 1
    15.         else A[k] ← R[j]
    16.              j ← j + 1
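As a runnable cross-check of the pseudocode, here is a direct Python translation (illustrative; 0-based inclusive indices instead of the pseudocode's 1-based ones, with `math.inf` playing the role of the ∞ sentinels):

```python
import math

def merge(A, p, q, r):
    # Merge the sorted runs A[p..q] and A[q+1..r] in place.
    L = A[p:q + 1] + [math.inf]      # copy left run, append sentinel
    R = A[q + 1:r + 1] + [math.inf]  # copy right run, append sentinel
    i = j = 0
    for k in range(p, r + 1):        # always take the smaller head element;
        if L[i] <= R[j]:             # a sentinel is never copied, so neither
            A[k] = L[i]              # index runs off the end of its list
            i += 1
        else:
            A[k] = R[j]
            j += 1
```

Usage: `A = [2, 4, 5, 1, 2, 3]; merge(A, 0, 2, 5)` leaves `A == [1, 2, 2, 3, 4, 5]`.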

Loop invariant (main for loop):

- At the start of each iteration of the for loop, subarray A[p..k – 1] contains the k – p smallest elements of L and R, in sorted order.
- L[i] and R[j] are the smallest elements of L and R that have not been copied back into A.

Initialization: before the first iteration:

- A[p..k – 1] is empty.
- i = j = 1.
- L[1] and R[1] are the smallest elements of L and R not copied to A.

Maintenance:

Case 1: L[i] ≤ R[j]

- By the LI, A contains the k – p smallest elements of L and R in sorted order.
- By the LI, L[i] and R[j] are the smallest elements of L and R not yet copied into A.
- Line 13 results in A containing the k – p + 1 smallest elements (again in sorted order). Incrementing i and k re-establishes the LI for the next iteration.

Case 2: Similar arguments with L[i] > R[j]

Termination:

• On termination, k = r + 1.

• By LI, A contains r – p + 1 smallest elements of L and R in sorted order.

• L and R together contain r – p + 3 elements, including the two sentinels. So every element except the two sentinels has been copied back into A, in sorted order.

What’s the running time of Merge sort? Measure it in terms of Divide, Conquer and Combine.

- Divide: computing the middle takes Θ(1).
- Conquer: solving 2 subproblems takes 2T(n/2).
- Combine: merging n elements takes Θ(n).
- Total:

T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) if n > 1

⇒ T(n) = Θ(n lg n)

Assuming n is a power of 2, prove that merge sort satisfies T(n) = n log2 n with a recursion tree.

Draw a recursion tree for T(n) = 2T(n/2) + n. The tree has log2 n levels of internal nodes; at level i there are 2^i subproblems of size n/2^i, so each level contributes n work in total, giving T(n) = n log2 n.

Assuming n is a power of 2, prove that merge sort satisfies T(n) = n log2 n using induction on n.

- Base case: when n = 1, T(1) = 0.
- Inductive hypothesis: assume T(n) = n log2 n.
- Goal: show that T(2n) = 2n log2 (2n).

T(2n) = 2T(n) + 2n
      = 2n log2 n + 2n
      = 2n (log2 (2n) – 1) + 2n
      = 2n log2 (2n). ∎
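A quick numeric sanity check of the closed form (illustrative; uses the same T(1) = 0 convention as the proof):

```python
def T(n):
    # T(1) = 0, T(n) = 2 T(n/2) + n, for n a power of 2
    return 0 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * k   # n log2 n, since log2 n = k when n = 2^k
```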

Prove that T(n) ≤ n ⌈log2 n⌉

for T(n) = 0 if n = 1,
T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n otherwise.

Use strong induction.

- Base case: n = 1.
- Define n1 = ⌈n/2⌉ and n2 = ⌊n/2⌋.
- Induction step: assume the claim holds for 1, 2, …, n – 1.

T(n) ≤ T(n1) + T(n2) + n
     ≤ n1 ⌈log2 n1⌉ + n2 ⌈log2 n2⌉ + n
     ≤ n1 ⌈log2 n1⌉ + n2 ⌈log2 n1⌉ + n     (since n2 ≤ n1)
     = n ⌈log2 n1⌉ + n
     ≤ n (⌈log2 n⌉ – 1) + n                 (see @)
     = n ⌈log2 n⌉.

@ Key fact: n1 = ⌈n/2⌉ ≤ ⌈2^⌈log2 n⌉ / 2⌉ = 2^⌈log2 n⌉ / 2, so
log2 n1 ≤ ⌈log2 n⌉ – 1, and hence ⌈log2 n1⌉ ≤ ⌈log2 n⌉ – 1.

Define the multiplication of x and y in divide and conquer.

To multiply two n-bit integers x and y:

- Divide x and y into low- and high-order bits.
- Multiply four ½n-bit integers, recursively.
- Add and shift to obtain the result.

(2^m a + b)(2^m c + d) = 2^{2m} ac + 2^m (bc + ad) + bd

where m = n/2, a = ⌊x / 2^m⌋, b = x mod 2^m, c = ⌊y / 2^m⌋, d = y mod 2^m.
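The identity turns directly into a (naive, four-product) recursive multiplier. A minimal sketch, assuming n is a power of 2 and x, y are nonnegative n-bit integers (function name illustrative):

```python
def dc_multiply(x, y, n):
    # Divide-and-conquer multiply of n-bit integers via four
    # half-size products; n is assumed to be a power of 2.
    if n == 1:
        return x * y
    m = n // 2
    a, b = x >> m, x & ((1 << m) - 1)   # a = x div 2^m, b = x mod 2^m
    c, d = y >> m, y & ((1 << m) - 1)   # c = y div 2^m, d = y mod 2^m
    ac = dc_multiply(a, c, m)
    bd = dc_multiply(b, d, m)
    bc = dc_multiply(b, c, m)
    ad = dc_multiply(a, d, m)
    # add and shift: 2^{2m} ac + 2^m (bc + ad) + bd
    return (ac << (2 * m)) + ((bc + ad) << m) + bd
```

For example, `dc_multiply(13, 11, 4)` returns `143`.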

Prove the following proposition:

The divide-and-conquer multiplication algorithm requires Θ(n^2) bit operations to multiply two n-bit integers.

Apply case 1 of the master theorem to the recurrence:

T(n) = 4T(n/2)   (recursive calls)
     + Θ(n)      (add, shift)
     = Θ(n^2)

Define the Karatsuba trick. How does this help divide-and-conquer multiplication?

bc + ad = ac + bd – (a – b) (c – d)

With this identity, only three multiplications of n/2-bit integers are needed: ac, bd, and (a – b)(c – d).
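Applying the trick to the recursive multiplier gives a Karatsuba sketch (illustrative; assumes n is a power of 2 and x, y are nonnegative n-bit integers; note |a – b| and |c – d| still fit in n/2 bits, so the third product can also recurse):

```python
def karatsuba(x, y, n):
    # Karatsuba multiply: three half-size products instead of four.
    if n == 1:
        return x * y
    m = n // 2
    a, b = x >> m, x & ((1 << m) - 1)   # high / low halves of x
    c, d = y >> m, y & ((1 << m) - 1)   # high / low halves of y
    ac = karatsuba(a, c, m)
    bd = karatsuba(b, d, m)
    # Third product: (a - b)(c - d), recursing on absolute values
    cross = karatsuba(abs(a - b), abs(c - d), m)
    if (a < b) != (c < d):              # (a - b)(c - d) is negative
        cross = -cross
    middle = ac + bd - cross            # the trick: bc + ad
    return (ac << (2 * m)) + (middle << m) + bd
```

For example, `karatsuba(13, 11, 4)` returns `143`.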

Prove that the Karatsuba algorithm requires O(n^1.585) bit operations to multiply two n-bit integers.

T(n) = 3 T(n / 2) + Θ(n) ⇒ T(n) = Θ(n^lg3) = O(n^1.585)

How do you determine the running time of a divide-and-conquer algorithm? Explain each term.

The Master Theorem

T(n) = a * T(n/b) + f(n)

where

- a ≥ 1 is the number of subproblems.
- b > 1 is the factor by which the subproblem size decreases.
- f(n) = work to divide/merge subproblems.

Recursion tree.

- k = log_b n levels.
- a^i = number of subproblems at level i.
- n / b^i = size of each subproblem at level i.
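An illustrative sanity check of the tree view, with hypothetical values a = b = 2 (merge sort), where every level costs n and there are log2 n + 1 levels:

```python
# Recursion-tree totals for T(n) = a T(n/b) + n with a = b = 2:
# level i of the tree has a^i subproblems, each of size n / b^i.
a, b, n = 2, 2, 1024
k = n.bit_length() - 1                 # log2 n = 10 levels below the root
level_costs = [a**i * (n // b**i) for i in range(k + 1)]
total = sum(level_costs)
# With a = b = 2, every level sums to n, so total == n * (log2 n + 1).
```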

What’s the T(n) of:

- Merge Sort
- Binary Search
- Karatsuba

- Merge sort: T(n) = 2T(n/2) + n
- Binary search: T(n) = T(n/2) + 1
- Karatsuba: T(n) = 3T(n/2) + n

Prove that if T(n) satisfies T(n) = 3T(n/2) + n, with T(1) = 1, then T(n) = Θ(n^lg 3).

Let r = 3/2. The recursion tree has log2 n + 1 levels, and level i has 3^i subproblems of size n/2^i, costing (3/2)^i · n in total, so

T(n) = n (1 + r + r^2 + … + r^{log2 n})
     = n (r^{log2 n + 1} – 1) / (r – 1)
     = 3n (3/2)^{log2 n} – 2n
     = 3n^{log2 3} – 2n          (since (3/2)^{log2 n} = 3^{log2 n} / n = n^{log2 3} / n)
     = Θ(n^{lg 3}).
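The closed form can be spot-checked numerically (illustrative): for n = 2^k we have n^{lg 3} = 3^k, so everything stays in integers.

```python
def T(n):
    # T(1) = 1, T(n) = 3 T(n/2) + n, for n a power of 2
    return 1 if n == 1 else 3 * T(n // 2) + n

for k in range(0, 12):
    n = 2 ** k
    # 3 n^{lg 3} - 2 n, using n^{lg 3} = 3^k when n = 2^k
    assert T(n) == 3 * 3 ** k - 2 * n
```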

Write 3^(log2n) in a different way.

n ^ log2(3)
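This follows from 3^{log2 n} = (2^{log2 3})^{log2 n} = 2^{(log2 3)(log2 n)} = (2^{log2 n})^{log2 3} = n^{log2 3}. A floating-point spot check (illustrative):

```python
import math

# Check the identity 3^(log2 n) == n^(log2 3) for a few values of n.
for n in [2, 8, 100, 1024]:
    assert math.isclose(3 ** math.log2(n), n ** math.log2(3))
```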