Sorting Flashcards

1
Q

Selection Sort

A
  • One of the simplest sorting algorithms
  • Repeatedly swaps the smallest remaining element into the leftmost unsorted position

Steps

  1. Create outer loop over entire array (index i)
  2. Create inner loop starting from j = i + 1
  3. Track the index of the smallest element in the unsorted portion
  4. Swap smallest element into index i
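
A minimal in-place Python sketch of these steps (the function name and the assumption of a mutable list of comparable items are mine):

  def selection_sort(arr):
      # Outer loop: i marks the next position to fill
      for i in range(len(arr)):
          min_idx = i
          # Inner loop: find the smallest element in the unsorted portion
          for j in range(i + 1, len(arr)):
              if arr[j] < arr[min_idx]:
                  min_idx = j
          # Swap the smallest element into index i
          arr[i], arr[min_idx] = arr[min_idx], arr[i]
      return arr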

Complexity

Time

Best ⇔ Average ⇔ Worst: O(n²)

Space

O(1)

2
Q

Bubble Sort

A
  • Very simple sorting algorithm that moves (bubbles) the largest element to the end with each pass

Steps

  1. Create an outer loop of size n
  2. Create an inner loop starting from 0
  3. Compare element at index i with element at index i + 1
  4. Swap if necessary
  5. Continue swapping until the largest element is at the far right
  6. Repeat n times on a progressively smaller range (ignoring sorted values at the end)
  7. Can exit early when no swaps are necessary on a pass
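
A short Python sketch of these steps, including the early-exit check (assumes a mutable list of comparable items):

  def bubble_sort(arr):
      n = len(arr)
      for k in range(n):
          swapped = False
          # Each pass bubbles the largest remaining element to the right;
          # the last k elements are already in place and can be ignored
          for i in range(n - k - 1):
              if arr[i] > arr[i + 1]:
                  arr[i], arr[i + 1] = arr[i + 1], arr[i]
                  swapped = True
          # Early exit: no swaps on a pass means the array is sorted
          if not swapped:
              break
      return arr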

Complexity

Time

Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)

Space

O(1)

3
Q

Insertion Sort (Pairwise Swaps)

A
  • Variation of insertion sort using pairwise swapping

Steps

  1. Divide list into sorted and unsorted portions by placing a marker at index 1
  2. Start looping through unsorted items, checking if each item is in the right place
  3. If not, start an inner loop
  4. Repeatedly pairwise swap the item up the sorted portion until the order is correct, or you reach the start of the array
  5. Return to the main loop and proceed to the next unsorted item
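
A Python sketch of the pairwise-swap variant (assumes a mutable list of comparable items):

  def insertion_sort_swaps(arr):
      # Marker at index 1: arr[:1] is the sorted portion, the rest is unsorted
      for i in range(1, len(arr)):
          j = i
          # Pairwise swap the new item up the sorted portion until the
          # order is correct or it reaches the start of the array
          while j > 0 and arr[j - 1] > arr[j]:
              arr[j - 1], arr[j] = arr[j], arr[j - 1]
              j -= 1
      return arr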

Complexity

Time

Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)

Space

O(1)

4
Q

Insertion Sort (Shifting)

A
  • Variation of insertion sort using shifting

Steps

  1. Divide list into sorted and unsorted portions by placing a marker at index 1
  2. Start looping through unsorted items, checking if each item is in the right place
  3. If not, vacate its spot (store the value in a variable) and initiate an inner loop
  4. Shift the element at i - 1 into the hole at i
  5. Check whether the new hole (now at i - 1) is the correct insertion spot
  6. Continue shifting elements right until the order is correct, or you reach the start of the array
  7. Copy the stored value into the final hole
  8. Return to the main loop and proceed to the next unsorted item
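
A Python sketch of the shifting variant, under the same assumptions as the pairwise version:

  def insertion_sort_shift(arr):
      for i in range(1, len(arr)):
          value = arr[i]          # vacate the spot: store the value in a variable
          hole = i
          # Shift larger sorted elements one slot to the right until the
          # correct insertion spot (or the start of the array) is reached
          while hole > 0 and arr[hole - 1] > value:
              arr[hole] = arr[hole - 1]
              hole -= 1
          arr[hole] = value       # copy the stored value into its new location
      return arr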

Complexity

Time

Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)

Space

O(1)

5
Q

Shellsort

A
  • Generalization of insertion sort
  • Uses gaps to move elements multiple spaces at a time

Steps

  • Determine the size of the gap h using Knuth’s sequence
  • Apply the rule h = 3h + 1 while h < N
  • Divide the array into “unsorted” and “sorted” sections (if you will) by starting the loop from h (instead of 1)
  • For each item i, compare with i - h; loop and swap as necessary
  • Exit the inner loop and continue processing elements from i
  • After each sweep, divide h by 3, using integer division
  • Keep starting the loop from the new h until the h == 1 pass (a normal insertion sort) has run
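
A Python sketch of these steps using Knuth’s gap sequence (function name and the in-place list assumption are mine):

  def shellsort(arr):
      n = len(arr)
      # Build the largest gap in Knuth's sequence (1, 4, 13, 40, ...) below n
      h = 1
      while 3 * h + 1 < n:
          h = 3 * h + 1
      while h >= 1:
          # Gapped insertion sort: compare and swap elements h apart
          for i in range(h, n):
              j = i
              while j >= h and arr[j - h] > arr[j]:
                  arr[j - h], arr[j] = arr[j], arr[j - h]
                  j -= h
          h //= 3   # shrink the gap; the final pass (h == 1) is a normal insertion sort
      return arr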

Complexity

Time

Best: O(n log n) ♦ Worst: O(n^(3/2))

Space

O(1)

*Analysis is an open topic
*Analysis depends on the gap sequence used; this assumes Knuth’s sequence

6
Q

Mergesort

A
  • Comparison-based sorting algorithm
  • Recursively implemented
  • Utilizes divide and conquer paradigm

Steps

  1. If size of subarray is < 2, return (subarray is already “sorted”)
  2. Divide array into two halves
  3. Conquer by performing mergesort on each half
  4. Merge sorted halves
    1. Allocate a new array aux
    2. Repeatedly copy the smaller front element of the two halves into aux
    3. After one half is exhausted, the other will have leftover elements
    4. Copy the remaining elements into aux
    5. Set array = aux
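
A Python sketch of these steps; for brevity it returns a new sorted list (halves built via slicing) rather than copying aux back into the original array:

  def merge_sort(arr):
      # Base case: a subarray of size < 2 is already "sorted"
      if len(arr) < 2:
          return arr
      mid = len(arr) // 2
      left = merge_sort(arr[:mid])    # conquer each half recursively
      right = merge_sort(arr[mid:])
      # Merge the sorted halves into a new aux array
      aux, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:     # <= keeps the sort stable
              aux.append(left[i]); i += 1
          else:
              aux.append(right[j]); j += 1
      # One half is exhausted; copy the remainder of the other
      aux.extend(left[i:])
      aux.extend(right[j:])
      return aux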

Complexity

Time

Best ⇔ Average ⇔ Worst: O(n log n)

Space

O(n)

*Stack space is log(n) (depth of the recursive tree)
*The merge operation requires O(n) auxiliary space (while a call is waiting in the call stack, its merge space has not yet been allocated)

7
Q

Quicksort

A
  • Comparison-based sorting algorithm
  • Uses divide & conquer paradigm
  • In-place, unlike mergesort

Steps

  1. If size of subarray is < 2, return
  2. Partition array around a pivot
  3. Split array into two halves around pivot
  4. Run quicksort on each half (start to pivot - 1, and pivot + 1 to end), ignoring the pivot
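
A Python sketch of these steps with the partition step inlined (Lomuto scheme, last element as pivot, matching the Partition card below):

  def quicksort(arr, start=0, end=None):
      if end is None:
          end = len(arr) - 1
      if end - start < 1:            # base case: fewer than 2 elements
          return arr
      # Partition step: move elements <= pivot to the left of index p
      pivot = arr[end]
      p = start
      for i in range(start, end):
          if arr[i] <= pivot:
              arr[i], arr[p] = arr[p], arr[i]
              p += 1
      arr[end], arr[p] = arr[p], arr[end]
      # Recurse on each half, ignoring the pivot at index p
      quicksort(arr, start, p - 1)
      quicksort(arr, p + 1, end)
      return arr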

Complexity

Time

Best: O(n log n) ♦ Average: O(n log n) ♦ Worst: O(n²)

Space

Best ⇔ Average: O(log n) ♦ Worst: O(n)

*The O(n²) worst case occurs when the array is already sorted (or reverse sorted), and the partition operation continually divides the array at one end
*Requires constant (i.e., no) aux space for the data itself (in-place operation)
*Stack depth is log(n) in the best and average case, but in the worst case the array in each recursive call is only 1 smaller than the previous, so the height of the tree is n (instead of log(n))

8
Q

Partition

A
  • Partitioning is a subprocedure of quicksort
  • Partition achieves two things
    • It reorganizes elements around the pivot
    • It returns the pivot index

Steps

  1. Select a pivot “arbitrarily” (here, select the end element)
  2. Place pivotIndex at start
  3. Iterate through arr from start to end - 1
  4. If the element at i is <= pivot, swap it with the element at pivotIndex
  5. Increment pivotIndex
  6. After the loop is finished, swap the element at end with the element at pivotIndex
  7. Return pivotIndex
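
A Python sketch of this subprocedure (Lomuto scheme; the argument names start/end are illustrative):

  def partition(arr, start, end):
      pivot = arr[end]                 # 1. pivot chosen as the last element
      pivot_index = start              # 2. pivotIndex starts at start
      for i in range(start, end):      # 3. iterate from start to end - 1
          if arr[i] <= pivot:          # 4. swap smaller-or-equal elements forward
              arr[i], arr[pivot_index] = arr[pivot_index], arr[i]
              pivot_index += 1         # 5. increment pivotIndex
      # 6. finally, swap the pivot (at end) into pivotIndex
      arr[end], arr[pivot_index] = arr[pivot_index], arr[end]
      return pivot_index               # 7. return pivotIndex

  # Example: data = [3, 8, 2, 5, 1, 4, 7, 6]; partition(data, 0, len(data) - 1)
  # returns 5, with every element <= 6 left of index 5 and every element > 6 right of it
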
9
Q

Heapsort

A
  • Comparison-based sorting algorithm that uses a heap

Steps

  1. Organize data into an array
  2. Perform build-heap to transform the array into a max heap
  3. Iteratively delete the max from the heap
  4. Instead of shrinking the underlying array, store each removed max in the location vacated by the swapped last leaf
  5. Keep going until the size of the heap is 1 (the remainder is already sorted)
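
A Python sketch of these steps, with a small sift-down helper standing in for the delete-max operation (names are illustrative):

  def heapsort(arr):
      n = len(arr)

      def sift_down(start, end):
          # Restore the max-heap property for the subtree rooted at `start`,
          # considering only indices < end
          root = start
          while 2 * root + 1 < end:
              child = 2 * root + 1
              if child + 1 < end and arr[child] < arr[child + 1]:
                  child += 1
              if arr[root] >= arr[child]:
                  return
              arr[root], arr[child] = arr[child], arr[root]
              root = child

      # Build-heap: sift down every internal node, bottom-up
      for start in range(n // 2 - 1, -1, -1):
          sift_down(start, n)

      # Repeatedly move the max (root) into the slot vacated by the last leaf
      for end in range(n - 1, 0, -1):
          arr[0], arr[end] = arr[end], arr[0]
          sift_down(0, end)
      return arr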

Complexity

Time

Best ⇔ Average ⇔ Worst: O(n log n)

Space

O(1)

10
Q

Bucket Sort

A
  • Sorting algorithm that distributes elements into buckets of linked lists
  • Similar to a hash table with separate chaining

Steps

  1. Initialize an array of empty buckets
    • Number of buckets is arbitrary
  2. Create some sort of mapping from elements to buckets
    • Each bucket can represent a key (0-9, etc.), a letter, or a range of values
  3. Loop through the input array
  4. Add each element from the input to its bucket based on the mapping
  5. Loop through the bucket array, sorting the linked list in each bucket
    • Use insertion sort, or (less commonly) recursively use bucket sort
  6. Concatenate the linked lists in each bucket while preserving their order
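
A Python sketch of these steps; the mapping assumes float keys in [0, 1), the bucket count of 10 is arbitrary, and Python lists stand in for the linked-list buckets:

  def bucket_sort(arr, num_buckets=10):
      # Assumes values in [0, 1); each bucket covers an equal sub-range
      buckets = [[] for _ in range(num_buckets)]
      for x in arr:                          # map each element to its bucket
          buckets[min(int(x * num_buckets), num_buckets - 1)].append(x)
      result = []
      for bucket in buckets:                 # sort each bucket with insertion sort
          for i in range(1, len(bucket)):
              j = i
              while j > 0 and bucket[j - 1] > bucket[j]:
                  bucket[j - 1], bucket[j] = bucket[j], bucket[j - 1]
                  j -= 1
          result.extend(bucket)              # concatenate buckets, preserving order
      return result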

Complexity

Time

Best: O(n + k) ♦ Average: O(n + k) ♦ Worst: O(n²)

Space

O(n + k)

*k is the number of buckets
*The O(n²) worst-case running time comes from the insertion sort (e.g., when most elements land in the same bucket)

11
Q

Counting Sort

A
  • Sorting algorithm used to sort elements (keys) within a specific range
  • Elements in the range are usually integers (0-9 in decimal, or 0 and 1 in binary) or letters (a-z), etc.
  • Used to sort individual key values, not multi-digit/multi-character words (1991, ‘book’)
  • Requires an input array, a count array, and a temp (output) array

Steps

  1. Create the count array, a histogram of key frequencies
    • Loop through the input, counting the number of times each key appears
  2. Compute the starting index of each key
    • Loop through the count array, setting each count equal to the sum of the frequencies of all previous keys
  3. Copy elements from the input array to the temp array using those indices
    • Increment the starting index in the count array every time a copy is made
  4. Return the output
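
A Python sketch of these steps, assuming integer keys in the range 0 .. k-1:

  def counting_sort(arr, k):
      count = [0] * k
      for key in arr:                 # 1. histogram of key frequencies
          count[key] += 1
      total = 0
      for key in range(k):            # 2. prefix sums: starting index of each key
          count[key], total = total, total + count[key]
      output = [None] * len(arr)      # 3. copy elements to the temp/output array
      for key in arr:
          output[count[key]] = key
          count[key] += 1             # bump the starting index after each copy
      return output                   # 4. return the output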

Complexity

Time

Best ⇔ Average ⇔ Worst: O(n + k)

Space

O(n + k)

*k is the number of keys

12
Q

Radix Sort

A
  • A sorting algorithm that sorts integers/strings starting from the least significant digit/letter
  • Elements in the array should consist of letters/digits defined within some range (0-9, 0-1, a-z)
  • Implemented with counting sort or (less commonly) bucket sort

Steps

  1. Figure out the length of the longest word (element)
  2. Create a loop of size word-length, from the least significant position (position d) up to position 0
  3. Iteratively perform counting sort (or bucket sort) on the input array at the current position
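
A Python sketch of these steps for non-negative integers, using an inlined stable counting sort on one decimal digit (k = 10 keys) per pass:

  def radix_sort(arr):
      if not arr:
          return arr
      width = len(str(max(arr)))            # length of the longest "word"
      for d in range(width):                # least significant digit first
          divisor = 10 ** d
          count = [0] * 10
          for x in arr:                     # histogram of the current digit
              count[(x // divisor) % 10] += 1
          total = 0
          for key in range(10):             # prefix sums: starting index per digit
              count[key], total = total, total + count[key]
          output = [None] * len(arr)
          for x in arr:                     # stable copy into the output array
              digit = (x // divisor) % 10
              output[count[digit]] = x
              count[digit] += 1
          arr = output
      return arr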

Complexity

Time

Best: O(n) ♦ Average: O(w(n + k)) ♦ Worst: O(w(n + k))

Space

O(n + k)

*Complexity assumes counting sort
*Linear result requires some analysis, but involves minimizing running time by choosing “best k”
*w is the length of the word; k is the number of keys (size of “alphabet”)
