# Sorts Flashcards

1
Q

Quicksort Time Complexity

A

Best, avg: O(n log n) (orange)

Worst: O(n^2) (red)

2
Q

Quicksort Space Complexity

A

O(log n) (green)

Quicksort recurses to a depth of about log n; each recursive call adds a stack frame of constant size.

3
Q

Quicksort - How does it work?

A

How Does It Work?

Choose a pivot. Reorder the array so that all the values less than the pivot value are before the pivot and all the values greater than the pivot value are after the pivot. Now the pivot is in its final position. Apply the same pattern recursively.

Choice of pivot matters greatly. If the last element is always chosen as the pivot, performance degrades to the O(n^2) worst case for already-sorted arrays or arrays of identical elements.

Computer architecture favors quicksort because it is a cache-efficient sort. It linearly scans the input and linearly partitions the input, which means you make the most of each cache load before reloading.
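The partition-and-recurse pattern above can be sketched in Python (a minimal sketch assuming Lomuto partitioning with the last element as pivot; the function name and in-place interface are illustrative):

```python
def quicksort(a, lo=0, hi=None):
    """Sort list a in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[hi]                    # choose a pivot (here: the last element)
    i = lo                           # boundary of the "less than pivot" region
    for j in range(lo, hi):
        if a[j] < pivot:             # move smaller values before the boundary
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # pivot lands in its final position
    quicksort(a, lo, i - 1)          # recurse on each side of the pivot
    quicksort(a, i + 1, hi)
```

Note how the pivot is never touched again after the swap: it is already in its final sorted position, exactly as the description above says.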

4
Q

Quicksort Features

A

Features

In practice, 2-3x faster than mergesort and heapsort, despite having the same best-case and average-case Big O

In place

Not stable

5
Q

Why is quicksort 2-3x faster than mergesort and heap sort despite having the same Big O time complexity?

A

Computer architecture favors quicksort because it is a cache-efficient sort. Quicksort processes segments of the array in order: when it loads the first element, that location is loaded into the cache, so the next element it needs is likely already there. Other algorithms, like heapsort, jump around the array a lot, making them less cache-efficient.

6
Q

Mergesort Natural Variant

A

Can get a best case of O(n) using the "natural" variant, which exploits naturally occurring sorted sequences. Rather than starting with sublists of size 1 or 2, split the data into naturally occurring runs

3,4,2,1,7

becomes

3,4 | 2 | 1,7
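The run-splitting step can be sketched as follows (a minimal sketch; the function name `natural_runs` is illustrative):

```python
def natural_runs(a):
    """Split a list into its naturally occurring non-decreasing runs."""
    if not a:
        return []
    runs, run = [], [a[0]]
    for x in a[1:]:
        if x >= run[-1]:
            run.append(x)     # extend the current run
        else:
            runs.append(run)  # run ends; start a new one
            run = [x]
    runs.append(run)
    return runs
```

Natural mergesort then merges these runs instead of single elements, so an already-sorted input is one run and needs no merging at all.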

7
Q

Mergesort Time Complexity

A

Time Complexity

Best, avg, and worst case: n log n

8
Q

Mergesort - How Does It Work?

A

Divides the entire dataset into groups of at most two elements.

Compares the numbers in each pair, moving the smaller number to the left of the pair.

Once all pairs are sorted, it merges the two leftmost pairs by repeatedly comparing their leftmost elements, creating a sorted group of four with the smallest numbers on the left and the largest on the right.

This process is repeated until there is only one set.
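The split-then-merge process above can be sketched as (a minimal top-down sketch; the function name is illustrative):

```python
def mergesort(a):
    """Return a new sorted list: classic divide and conquer."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    # Merge: repeatedly take the smaller front element of the two halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in input order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

The `<=` in the merge step is what makes mergesort stable (see the Features card).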

9
Q

Mergesort Space Complexity

A

Space Complexity

O(n) worst case

10
Q

Mergesort Features

A

Features

Stable (!)

Divide and conquer

11
Q

What’s the best algorithm for sorting? Questions to ask…

A

What can you tell me about the data?

Is it already sorted, mostly sorted, or unsorted?

How large are the data sets likely to be?

Are there duplicate key values?

How expensive are key comparisons?

What are the requirements?

Are we optimizing for best-case, worst-case, or average-case performance?

Does the sort need to be stable?

Is the largest data set to be sorted smaller than the available memory?

12
Q

What is a stable sort?

A

Preserves the input ordering of elements with equal keys
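A quick illustration with hypothetical records (Python's built-in sort is stable, so it serves as a demonstration here):

```python
# Records of (name, grade); bob and dave tie on grade 1, alice and carol on 2.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

# A stable sort by grade keeps tied records in their original input order.
by_grade = sorted(records, key=lambda r: r[1])
```

After sorting, bob still precedes dave and alice still precedes carol, which an unstable sort would not guarantee.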

13
Q

Selection Sort Time Complexity

A

Time Complexity

Worst, avg, best case: n^2 (red)

14
Q

Selection Sort Space Complexity

A

Space complexity

O(1)

15
Q

Selection Sort - How Does It Work?

A

How does it work?

Loop through the array. Find the smallest element, then swap it with the element in position 0. Repeat the same process to find the next smallest element and swap it with the item in position 1.

At each iteration, you are selecting the next smallest element and moving it into the sorted portion of the array.
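The select-and-swap loop can be sketched as (a minimal in-place sketch; names are illustrative):

```python
def selection_sort(a):
    """In-place selection sort: grow a sorted prefix one element at a time."""
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):   # scan the unsorted portion
            if a[j] < a[smallest]:
                smallest = j             # remember the smallest seen so far
        a[i], a[smallest] = a[smallest], a[i]  # move it into the sorted portion
```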

16
Q

Heap Sort - Time Complexity

A

Time Complexity

Best, avg, worst: n log n (orange)

17
Q

Heap Sort Space Complexity

A

Space Complexity

Worst: O(1) (green)

18
Q

Heap Sort Features

A

In place

Not stable

Same time complexity as mergesort but much better space complexity: O(1) vs O(n).

Same space complexity as selection sort but much better time complexity: O(n log n) vs O(n^2)

19
Q

Heap Sort - How Does It Work?

A

Very similar to selection sort: it divides the data into a sorted and an unsorted region. But instead of looping through the array to find the next smallest element (O(n)), it uses a heap data structure (O(log n) per extraction).
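The idea can be sketched with Python's standard `heapq` module (note this sketch pops into a new list for clarity, so it uses O(n) extra space; a true in-place heapsort builds a max-heap inside the array itself):

```python
import heapq

def heapsort(a):
    """Sketch of heapsort: heapify, then repeatedly extract the minimum."""
    heapq.heapify(a)                 # O(n) heap construction
    # Each pop is O(log n); n pops give the O(n log n) total.
    return [heapq.heappop(a) for _ in range(len(a))]
```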

20
Q

Counting Sort Time Complexity

A

Time Complexity

Best, avg, and worst case O(n+k) (green), where k is the maximum value of the key

21
Q

Counting Sort Space Complexity

A

Space Complexity

O(k) (yellow) where k is the maximum value of the key

22
Q

Counting Sort - How Does It Work?

A

How It Works

Not comparison based

Allocate an array, numCounts, of size k + 1, where the indices represent the possible key values and the entries count how many times each value appears.

Make one pass through the array to fill in numCounts.

Create the sorted array by writing out each value the counted number of times.
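The two passes above can be sketched as (a minimal sketch for non-negative integer keys; `num_counts` mirrors the numCounts array named above):

```python
def counting_sort(a, k):
    """Sort non-negative integers in the range 0..k. Not comparison based."""
    num_counts = [0] * (k + 1)       # index = key value, entry = occurrence count
    for x in a:                      # one pass to tally the counts
        num_counts[x] += 1
    out = []
    for value, count in enumerate(num_counts):
        out.extend([value] * count)  # emit each value as many times as it appeared
    return out
```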

23
Q

Sorts with O(n^2) (red) Time Complexity

A

Quicksort: worst case

Selection sort: best, avg, and worst case

Insertion sort: avg and worst case

Bubble sort: avg and worst case

24
Q

Sorts with O(log n) (light green) Space Complexity

A

Quicksort

25
Q

Sorts with O(n) (yellow) Space Complexity

A

Mergesort

26
Q

Stable Sorts

A

Mergesort

Counting sort

Insertion sort

27
Q

Sorts with O(1) (Dark Green) Space Complexity

In place sorts

A

Heapsort

Selection sort

Insertion sort

Bubble sort

28
Q

Sorts with O(n log n) (orange) time complexity

A

Quicksort: best, avg

Mergesort: best, avg, worst

Heapsort: best, avg, worst

29
Q

Radix Sort Time Complexity

A

Time Complexity

O(nk), where k is the key length (number of digits)

30
Q

Radix Sort Space Complexity

A

Space Complexity

O(n+k), where k is the radix (the number of possible values of a single digit)

31
Q

Radix Sort - How Does It Work?

A

How Does It Work

Not comparison based

Groups keys by significant position and value

Least significant digit (LSD) radix sort: sorts from the least significant digit to the most significant. Shorter keys come before longer keys, and keys of the same length end up in lexicographic order.

Most significant digit (MSD) radix sort: sorts from the most significant digit, producing lexicographic order, e.g. b, ba, c…
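An LSD pass-per-digit sketch for non-negative integers (a minimal sketch; the name `lsd_radix_sort` and the bucket approach are illustrative; each pass must be stable for the result to be correct):

```python
def lsd_radix_sort(a, base=10):
    """LSD radix sort for non-negative integers: one stable bucket pass per digit."""
    if not a:
        return a
    digit = 1
    while max(a) // digit > 0:               # more digits remain to process
        buckets = [[] for _ in range(base)]
        for x in a:                          # stable: keeps order within a bucket
            buckets[x // digit % base].append(x)
        a = [x for b in buckets for x in b]  # concatenate buckets in digit order
        digit *= base
    return a
```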

32
Q

Insertion Sort Time Complexity

A

Time Complexity

Best case: O(n) (yellow)

Average case: O(n^2)

Worst case: O(n^2)

33
Q

Insertion Sort Space Complexity

A

Space Complexity

O(1)

34
Q

Insertion Sort Features

A

Features

Stable

In place

Efficient for small data sets

Much less efficient than quicksort, heapsort, and merge sort for large datasets

35
Q

Insertion Sort - How Does It Work?

A

Loop through the array beginning at item 1. (Item 0 starts as the sorted portion of the array.) At each iteration, insert the current element into its sorted position within the sorted portion at the front of the array.

(If asked to sort a deck of cards, most people would do it this way.)
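The shift-and-insert loop can be sketched as (a minimal in-place sketch; names are illustrative):

```python
def insertion_sort(a):
    """In-place insertion sort: item 0 starts as the sorted portion."""
    for i in range(1, len(a)):
        current = a[i]
        j = i - 1
        while j >= 0 and a[j] > current:  # shift larger sorted elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = current                # insert into its sorted position
```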

36
Q

Bubble Sort Time Complexity

A

best case O(n) (yellow)

avg case, worst case O(n^2) (red)

37
Q

Bubble Sort - How Does It Work?

A

A comparison-based sorting algorithm.

It iterates left to right, comparing each adjacent pair and swapping the two elements if the smaller one is on the right.

It repeats this process until a full pass makes no swaps.
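The repeat-until-no-swaps loop can be sketched as (a minimal in-place sketch; names are illustrative):

```python
def bubble_sort(a):
    """In-place bubble sort: stop once a full pass makes no swaps."""
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(a) - 1):        # compare each adjacent pair
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
```

The early exit when no swaps occur is what gives the O(n) best case on already-sorted input.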

38
Q

Bubble Sort Space Complexity

A

O(1) (dark green)

39
Q

Java Collections.sort()

A

This implementation is a stable, adaptive, iterative mergesort that requires far fewer than n lg(n) comparisons when the input array is partially sorted, while offering the performance of a traditional mergesort when the input array is randomly ordered. If the input array is nearly sorted, the implementation requires approximately n comparisons. Temporary storage requirements vary from a small constant for nearly sorted input arrays to n/2 object references for randomly ordered input arrays.

40
Q

What is an adaptive sort?

A

Takes advantage of presortedness in the input sequence

41
Q

Java Arrays.sort()

A

Dual-pivot quicksort implementation (for arrays of primitives; on object arrays, Arrays.sort uses the same stable mergesort variant as Collections.sort)

Time complexity

Best, Avg: n log n

Worst: n^2

Space complexity: log n

42
Q

When to use mergesort over quicksort

A
• MergeSort is stable by design: equal elements keep their original order.
• MergeSort is well suited to parallel (multithreaded) implementation.
• MergeSort uses about 30% fewer comparisons than QuickSort. This is an often overlooked advantage, because a comparison can be quite expensive (e.g. when comparing several fields of database rows).