(W1) Divide and Conquer Flashcards
(28 cards)
3x Big O Notations
Big O (upper bound)
Big Omega Ω (lower bound)
Big Theta Θ (exact)
How to know if an algorithm is correct?
- Program always terminates
- Produces the correct result when it terminates
What are Loop Invariants?
Conditions that hold true before and after each iteration of the loop - and possibly throughout the execution of the loop
What is the cost of comparison for numbers?
For a number like 2004, we can assume it's O(1)
What is the cost of comparison for strings?
Worst case is O(L), where L is the length of the shorter string
What is the lower complexity bound for comparison based sorting algorithms?
Ω(N log N)
Selection Sort Algorithm
for i in N elements
- j = index of the smallest element in [i…n]
- swap i with j
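A minimal Python sketch of the idea (the function name and in-place behaviour are illustrative assumptions):

def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        # find the index of the smallest element in arr[i..n-1]
        j = i
        for k in range(i + 1, n):
            if arr[k] < arr[j]:
                j = k
        # swap it into position i
        arr[i], arr[j] = arr[j], arr[i]
    return arr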
Selection Sort Time/Space/Aux Complexity
Time complexity is O(N^2)
Space complexity is O(N)
Aux Space Complexity is O(1) - in place
Is Selection Sort stable?
No!
Take the following example {3, 3, 1}
selection sort will swap the 1st and 3rd values - the result is sorted, but the two 3s end up in the opposite relative order, so equal elements lose their original order
What is the Quick Sort Big Idea?
If the list has length ≤ 1: do nothing (base case)
pivot = Choose a Pivot
partition around the pivot
quick sort on the left, quicksort on the right
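A minimal (not in-place) Python sketch of the idea, assuming the last element is taken as the pivot:

def quick_sort(lst):
    if len(lst) <= 1:               # base case: 0 or 1 elements
        return lst
    pivot = lst[-1]                 # assumption: last element as the pivot
    left = [x for x in lst[:-1] if x <= pivot]
    right = [x for x in lst[:-1] if x > pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)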
What is the Complexity of QuickSort?
It depends :)
If the pivot is the median, then it’ll be O(n log n)
O(n) to partition, O(log n) depth
However, if the pivot happens to be the smallest or largest element, then it will end up being O(n^2)
O(n) to partition, O(n) depth
What is the Naive QuickSort partitioning approach? (and how to make it stable)
Create a left, right list
- For each element which is less than or equal to the pivot: put it in left
- For each element which is greater than the pivot: put it in right
Return [left] + pivot + [right]
This won’t be stable for elements which equal the pivot
To fix this, create a centre list, and put all items equal to the pivot in this (including the pivot)
Return [left] + [pivots] + [right]
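A sketch of the stable three-list partition in Python (the function name is an assumption); note it needs O(n) auxiliary space:

def stable_partition(lst, pivot):
    left, centre, right = [], [], []
    for x in lst:
        if x < pivot:
            left.append(x)
        elif x == pivot:
            centre.append(x)        # pivot-equal elements keep their input order
        else:
            right.append(x)
    return left + centre + right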
What is the Hoare Partitioning Algorithm (incl if it's in place)
Create left_bad and right_bad pointers, equal to 0 and N respectively
While left_bad <= right_bad
- move left_bad right until we find a bad element (item > pivot)
- move right_bad left until we find a bad element (item <= pivot)
- swap these elements
this is in place (so better space complexity)
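A sketch of the two-pointer idea in Python, partitioning arr[lo..hi] around a given pivot value (names and the <= / > convention follow the card; a full quicksort needs extra care when the pivot is the largest element in the range):

def hoare_partition(arr, lo, hi, pivot):
    left, right = lo, hi
    while left <= right:
        while left <= right and arr[left] <= pivot:    # skip "good" left elements
            left += 1
        while left <= right and arr[right] > pivot:    # skip "good" right elements
            right -= 1
        if left < right:                               # both point at "bad" elements
            arr[left], arr[right] = arr[right], arr[left]
            left += 1
            right -= 1
    return right    # last index of the <= pivot block (in place, O(1) aux space)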
What is the Dutch National Flag Approach to partitioning (and what problems does it solve)?
Group elements into 3 groups
< pivot
= pivot
> pivot
Importantly - it creates a distinction for elements equal to the pivot
What is the Dutch National Flag Algorithm?
3 Pointers:
1. Low - the position where the next <p should be placed
2. Mid - the current element being considered
3. High - the position where the next >p should be placed
Starting positions Low/Mid -> start of array, High -> end of array
Traverse the array from mid to high
- If the element is <p, swap it with the element at low and increment low and mid
- if the element =p, move mid forward
- if the element is >p, swap it with high and decrease high
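A sketch in Python (in place, single pass):

def dutch_flag_partition(arr, pivot):
    low, mid, high = 0, 0, len(arr) - 1
    while mid <= high:
        if arr[mid] < pivot:
            arr[low], arr[mid] = arr[mid], arr[low]
            low += 1
            mid += 1
        elif arr[mid] == pivot:
            mid += 1
        else:                                   # arr[mid] > pivot
            arr[mid], arr[high] = arr[high], arr[mid]
            high -= 1
    return low, high    # arr[low..high] is the == pivot block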
Dutch National Flag loop invariants
- Array [1, low-1] contains the elements < p
- Array [low, mid-1] contains the elements = p
- Array [mid, high] contains unknown (not yet examined) items
- Array [high+1, n] contains the elements > p
What is our approach to choosing a pivot?
Solve the k-th smallest element problem.
Let's say we had this ability - we could find the median by finding the n/2-th smallest element
What if we could do this in O(n)? That would ensure that QuickSort never ends up being O(n^2)
What is the QuickSelect Algorithm?
Finds the k-th smallest value
If high > low:
- select a pivot
- partition the array around the pivot
- if k < mid -> quickselect on left
- if k > mid -> quickselect on right
- otherwise return array[k]
Otherwise (high <= low): return array[k]
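A sketch in Python, using a random pivot and a simple Lomuto-style partition purely for illustration (both are assumptions, not the only options):

import random

def quickselect(arr, k, lo=0, hi=None):
    # returns the k-th smallest element (k is 0-indexed), average O(n)
    if hi is None:
        hi = len(arr) - 1
    if hi <= lo:
        return arr[lo]
    p = random.randint(lo, hi)              # assumption: random pivot
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot, mid = arr[hi], lo
    for i in range(lo, hi):                 # Lomuto-style partition
        if arr[i] < pivot:
            arr[i], arr[mid] = arr[mid], arr[i]
            mid += 1
    arr[mid], arr[hi] = arr[hi], arr[mid]   # pivot now sits at its sorted index
    if k < mid:
        return quickselect(arr, k, lo, mid - 1)
    if k > mid:
        return quickselect(arr, k, mid + 1, hi)
    return arr[mid]

e.g. quickselect([7, 2, 9, 4], 1) returns 4, the 2nd smallest.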
What are the two options to select a pivot for QuickSelect?
Pick a random pivot - O(n) expected
Median of Medians Approach
- Divide the original list of n elements into ⌈n/5⌉ groups of size at most 5
- Find the median of each of these groups
- Find the true median of the ⌈n/5⌉ medians
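A sketch of the grouping step in Python; in the real O(n) algorithm the "true median of the medians" is found by calling the selection routine recursively - the sort of the small median list here is just a stand-in for that step:

def median_of_medians_pivot(lst):
    # split into ceil(n/5) groups of size at most 5
    groups = [sorted(lst[i:i + 5]) for i in range(0, len(lst), 5)]
    medians = [g[len(g) // 2] for g in groups]      # median of each tiny group
    # stand-in: the full algorithm recurses into the selection routine here
    return sorted(medians)[len(medians) // 2]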
What is the Counting Sort algorithm (naive)
Determine the max possible value
Make a count array
Iterate over the input, and count the number of times each input appears
Iterate through the count array and use this to produce a sorted output
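A minimal Python sketch, assuming the inputs are non-negative integers:

def counting_sort_naive(nums):
    count = [0] * (max(nums) + 1)       # assumption: nums are non-negative ints
    for x in nums:
        count[x] += 1
    output = []
    for value, c in enumerate(count):   # replay each value c times, in order
        output.extend([value] * c)
    return output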
What is the problem with the naive counting sort algorithm?
Not stable, and there is no way to carry associated data along with the keys (it can literally only be used for bare numbers)
What is the Counting Sort Algorithm (stable)
(same as naive)
Determine the max possible value
Make a count array
Iterate over the input, and count the number of times each input appears
(different)
create a position array (same length as count)
- set first position to 0
- position[i] = position[i-1] + count[i-1]
iterate over the input, and for each (key, value) pair
- set output[position[key]] = (key, value)
- increment position[key]
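A sketch in Python over (key, value) pairs, assuming integer keys in [0, max_key]:

def counting_sort_stable(pairs, max_key):
    count = [0] * (max_key + 1)
    for key, _ in pairs:
        count[key] += 1
    # position[k] = output index where the first pair with key k goes
    position = [0] * (max_key + 1)
    for k in range(1, max_key + 1):
        position[k] = position[k - 1] + count[k - 1]
    output = [None] * len(pairs)
    for key, value in pairs:            # scanning in input order keeps it stable
        output[position[key]] = (key, value)
        position[key] += 1
    return output

e.g. counting_sort_stable([(2, 'a'), (1, 'b'), (2, 'c')], 2) gives [(1, 'b'), (2, 'a'), (2, 'c')] - the two key-2 pairs keep their original order.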
What is the space/time complexity of Counting Sort (and what is it good/bad for)
Time: O(N+U), where U is the size of the key domain - O(N) if U ≈ N
Space: O(N+U) - O(N) if U ≈ N
Therefore, this algorithm is good when the domain is small
(linear time)
but very bad when there are few values and the domain is massive (e.g. sorting [0, 123123123123])
What is the approach with Radix Sort?
Sort the items one digit/character position at a time (instead of comparing whole items), working your way up from the least significant position with a stable sort at each pass
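A least-significant-digit sketch in Python for non-negative integers (per-digit buckets stand in for the stable counting sort pass, purely for brevity):

def radix_sort(nums, base=10):
    place = 1
    while nums and place <= max(nums):
        buckets = [[] for _ in range(base)]
        for x in nums:                          # stable: input order kept per bucket
            buckets[(x // place) % base].append(x)
        nums = [x for bucket in buckets for x in bucket]
        place *= base
    return nums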