Sorting Flashcards
What is a heap?
- an array visualized as a binary tree
- ordered binary tree
- an implementation of a priority queue
- it must be a complete binary tree, which means every level is filled before a new level is started, and the last level is filled from left to right
ex:
          5
      1       4
    3   2   0   -1
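The same example written as an array (a quick JavaScript sketch of the mapping, reading the tree level by level, left to right; note this array has the complete-tree shape but is not yet a max heap, since 1 is smaller than its children 3 and 2):

const heap = [5, 1, 4, 3, 2, 0, -1]
// indices:    0  1  2  3  4  5   6
// node i's children live at indices 2*i + 1 and 2*i + 2 (see the child-index cards below)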
What is a max heap?
Every parent element must be greater than or equal to both of its child elements
What does the build-max-heap method do in heaps?
builds a max heap from an unsorted array
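A minimal JavaScript sketch of that idea (buildMaxHeap and its maxHeapify helper are my own names; maxHeapify itself is sketched under the next card):

function buildMaxHeap(arr) {
  // sift down every non-leaf node, starting from the last parent and working up to the root
  for (let i = Math.floor(arr.length / 2) - 1; i >= 0; i--) {
    maxHeapify(arr, arr.length, i)
  }
  return arr
}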
What does the maxheapify method do in heaps?
- assumes the subtrees rooted at the node's left and right children are already valid max heaps
- corrects a single violation of the heap property at the root of a subtree
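A minimal JavaScript sketch, assuming the node at index i may violate the max-heap property but both of its subtrees are already max heaps (heapSize is passed in so heapsort can shrink the heap later):

function maxHeapify(arr, heapSize, i) {
  const left = 2 * i + 1, right = 2 * i + 2
  let largest = i
  if (left < heapSize && arr[left] > arr[largest]) largest = left
  if (right < heapSize && arr[right] > arr[largest]) largest = right
  if (largest !== i) {
    [arr[i], arr[largest]] = [arr[largest], arr[i]] // swap the violation downward
    maxHeapify(arr, heapSize, largest) // keep sifting down until the subtree is a max heap
  }
}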
What is the time complexity of maxheapify?
O(lg n)
What is the time complexity of heapsort?
O(n lg n)
What are the steps for heapsort?
- build a max heap, so that the element with the greatest value is at the top (the root).
- swap the root with the last element of the heap, remove that element from the heap (it is now in its final sorted position), and run max-heapify on the new root to restore the heap property.
- repeat step two until the heap has only one element remaining.
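Putting the steps together, a sketch in JavaScript that reuses the buildMaxHeap and maxHeapify helpers sketched above ("removing" the max just means shrinking heapSize, so the sort is in place):

function heapSort(arr) {
  buildMaxHeap(arr)
  for (let heapSize = arr.length; heapSize > 1; heapSize--) {
    // move the current max to the end, then fix the heap property at the new root
    [arr[0], arr[heapSize - 1]] = [arr[heapSize - 1], arr[0]]
    maxHeapify(arr, heapSize - 1, 0)
  }
  return arr
}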
How do we find the left child of any parent element?
multiply the parent element's index by 2 and add 1.
let left = 2 * indexOfParent + 1
How do we find the right child of any parent element?
multiply the parent element's index by 2 and add 2.
let right = 2 * indexOfParent + 2
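As tiny JavaScript helpers (zero-indexed array; the parent formula is just the inverse, added for completeness):

const leftChild = i => 2 * i + 1
const rightChild = i => 2 * i + 2
const parent = i => Math.floor((i - 1) / 2)
// e.g. for [5, 1, 4, 3, 2, 0, -1]: leftChild(0) === 1 and rightChild(0) === 2,
// so the children of 5 (index 0) are 1 and 4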
What are the steps for quick sort?
- choose a pivot
- find itemFromLeft that is larger than pivot
- find itemFromRight that is smaller than pivot
- swap itemFromLeft with itemFromRight
- repeat the process
- stop when index of itemFromLeft > index of itemFromRight
- swap itemFromLeft with the pivot
How do you choose a pivot for quick sort?
- choose the middle element, or
- use the median-of-three approach:
look at the values in the first, middle, and last positions of the array, sort those three values in place, and take whichever value ends up in the middle position as the pivot
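A small JavaScript sketch of the median-of-three choice (the helper name is mine; it sorts the three sampled values in place and returns the one that lands in the middle):

function medianOfThreePivot(arr, lo, hi) {
  const mid = Math.floor((lo + hi) / 2)
  // order arr[lo] <= arr[mid] <= arr[hi] with pairwise swaps
  if (arr[lo] > arr[mid]) [arr[lo], arr[mid]] = [arr[mid], arr[lo]]
  if (arr[mid] > arr[hi]) [arr[mid], arr[hi]] = [arr[hi], arr[mid]]
  if (arr[lo] > arr[mid]) [arr[lo], arr[mid]] = [arr[mid], arr[lo]]
  return arr[mid] // the median of the three sampled values
}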
What is the time complexity of quick sort?
Worst case: O(n^2)
average case: O(n log n)
this depends on how effectively you choose the pivot
What advantage does quick sort have over merge sort?
it is an in-place sorting algorithm, which means it does not need any auxiliary data structures (merge sort normally needs an O(n) auxiliary array for merging)
Is quick sort a divide and conquer algorithm?
yes
Implement quicksort
function QuickSort(arr, left = 0, right = arr.length - 1) {
  let len = arr.length, index
  if (len > 1) {
    index = partition(arr, left, right)
    if (left < index - 1) { QuickSort(arr, left, index - 1) }
    if (index < right) { QuickSort(arr, index, right) }
  }
  return arr
}

function partition(arr, left, right) {
  let middle = Math.floor((right + left) / 2),
      pivot = arr[middle],
      i = left,  // Start pointer at the first item in the range
      j = right  // Start pointer at the last item in the range
  while (i <= j) {
    // Move the left pointer to the right until its value is >= the pivot value
    while (arr[i] < pivot) { i++ }
    // Move the right pointer to the left until its value is <= the pivot value
    while (arr[j] > pivot) { j-- }
    // If the pointers have not crossed, swap the values and advance both
    if (i <= j) {
      [arr[i], arr[j]] = [arr[j], arr[i]] // ES6 destructuring swap
      i++
      j--
    }
  }
  return i
}
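A quick usage check of the JavaScript version above (arbitrary example input):

QuickSort([5, 3, 8, 1]) // returns [1, 3, 5, 8]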
// Java program for implementation of QuickSort
class QuickSort
{
    /* This function takes the last element as pivot, places the pivot
       element at its correct position in the sorted array, and places all
       smaller elements to the left of the pivot and all greater to its right */
    int partition(int arr[], int low, int high)
    {
        int pivot = arr[high];
        int i = (low - 1); // index of smaller element
        for (int j = low; j < high; j++)
        {
            // If the current element is smaller than the pivot
            if (arr[j] < pivot)
            {
                i++;
                // swap arr[i] and arr[j]
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }
        // move the pivot into place: swap arr[i+1] and arr[high]
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;
        return i + 1;
    }

    /* The main function that implements QuickSort()
       arr[] --> Array to be sorted, low --> Starting index, high --> Ending index */
    void sort(int arr[], int low, int high)
    {
        if (low < high)
        {
            // pi is the partitioning index; arr[pi] is now in its right place
            int pi = partition(arr, low, high);
            // Recursively sort elements before and after the partition
            sort(arr, low, pi - 1);
            sort(arr, pi + 1, high);
        }
    }

    /* A utility function to print an array of size n */
    static void printArray(int arr[])
    {
        for (int i = 0; i < arr.length; ++i)
            System.out.print(arr[i] + " ");
        System.out.println();
    }
}
What is the runtime of Insertion sort?
Average: O(n^2)
worst: O(n^2)
What is the runtime of merge sort?
Worst: O(n lg n)
average: O(n lg n)
How much space does merge sort take?
O(n) auxiliary space for the merge step (plus an O(lg n) recursion stack)
How much space does insertion sort take?
O(1)
How much space does quick sort take?
O(lg n)
Since quicksort's recursion depth is on the order of log(n) in the average case (in the worst case the depth is O(n)), and each recursive call allocates a new stack frame of constant size, the space complexity is O(log(n)).
When would you use bucket sorting?
-When you know the upper and lower bounds of the data.
Bucketing is a very effective idea whenever we are confident that the distribution
of data will be roughly uniform. It is the idea that underlies hash tables, kd-trees,
and a variety of other practical data structures. The downside of such techniques
is that the performance can be terrible when the data distribution is not what we
expected. Although data structures such as balanced binary trees offer guaranteed
worst-case behavior for any input distribution, no such promise exists for heuristic
data structures on unexpected input distributions.
Bucket sort with insertion sort in the buckets is only reasonable when we can expect that there will be only a few items in each bucket. When there are only a few items, insertion sort is fine.
In real life, this doesn't happen too often. Usually it's when we expect the data to be uniformly distributed because we're sorting into hash order or something like that.
Bucket sort is most commonly used when it’s the entire sort – i.e., the buckets don’t need to be sorted at all and you can just append each item into the bucket list.
The performance of bucket sort degrades with clustering: if many values occur close together, they will all fall into a single bucket and be sorted slowly.
Example of when bucket sorting is a benefit
- say we are using bubble sort on a list of 10 elements
- we use bucket sort to break the list into 2 buckets of 5
- then we run bubble sort on the 2 buckets
- O(n^2) on each bucket
- new runtime = 5^2 + 5^2 = 50
- the 5 comes from each bucket's length; the sorted buckets are then combined
- so we make 50 potential comparisons instead of 100
Describe bucket sort
The process of bucket sort can be understood as a scatter-gather approach. The elements are first scattered into buckets then the elements of buckets are sorted. Finally, the elements are gathered in order.
Bucket sort breaks down a big list into smaller lists to be sorted by another algorithm and then those lists get combined into one large sorted list.
example:
{1, 2, 4, 3, 8, 7, 5, 6}
{1,2,4,3}
{8, 7, 5, 6}
use insertion sort on the buckets to sort them
combine the two buckets into one list
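A minimal JavaScript sketch of that scatter-gather idea, assuming numeric input and a fixed bucket count (the bucket count and the range-based bucket index are my own choices; the buckets here are sorted with the built-in sort to keep the sketch short, where the card above uses insertion sort):

function bucketSort(arr, bucketCount = 2) {
  if (arr.length === 0) return arr
  const min = Math.min(...arr), max = Math.max(...arr)
  const buckets = Array.from({ length: bucketCount }, () => [])
  // scatter: place each value in a bucket based on where it falls between min and max
  for (const value of arr) {
    const index = Math.min(bucketCount - 1,
      Math.floor(((value - min) / (max - min + 1)) * bucketCount))
    buckets[index].push(value)
  }
  // sort each bucket, then gather: concatenate the buckets back into one sorted list
  for (const bucket of buckets) bucket.sort((a, b) => a - b)
  return [].concat(...buckets)
}

bucketSort([1, 2, 4, 3, 8, 7, 5, 6]) // [1, 2, 3, 4, 5, 6, 7, 8]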
What is the runtime of bucket sort?
worst case: O(n^2)
average case: O(n + k)
k is the number of buckets created
What is the runtime of bubble sort?
worst: O(n^2)
average: O(n^2)