Complexity of sorting Flashcards
(18 cards)
What is the time complexity of selection and insertion sort?
O(n²) in the worst case (quadratic time).
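A minimal Python sketch of the standard textbook insertion sort (illustrative only, not taken from the course material): the nested loops are where the quadratic bound comes from.

    def insertion_sort(a):
        # Outer loop runs n - 1 times; the inner while loop can run up to i times,
        # so a reverse-sorted input forces about n²/2 comparisons: O(n²) worst case.
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]  # shift larger elements one slot to the right
                j -= 1
            a[j + 1] = key
        return a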
What is the best known time complexity for comparison-based sorting?
Θ(n log n)
What does MergeSort’s time complexity show?
That its Θ(n log n) running time is asymptotically optimal among comparison-based sorting algorithms.
What theoretical tool do we use to analyze the lower bound of sorting?
A decision tree model.
How does a decision tree represent a sorting algorithm?
Each internal node is a comparison between two elements, and each leaf is one possible output permutation of the input.
Why must a decision tree have at least n! leaves?
Because there are n! possible orderings of n elements.
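For example, with n = 3 there are 3! = 6 possible orderings, so any correct decision tree needs at least 6 leaves and therefore height at least ⌈log₂ 6⌉ = 3 comparisons in the worst case.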
What is the minimum height of a decision tree that sorts n elements?
⌈log₂(n!)⌉ (the height must be large enough to give at least n! leaves).
What is the worst-case number of comparisons for any comparison sort?
Ω(n log n), i.e., at least log₂(n!) comparisons in the worst case.
What is the lower bound for comparison-based sorting derived from log₂(n!)?
Ω(n log n)
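A compact version of the standard derivation, in the deck's notation: a binary decision tree of height h has at most 2ʰ leaves, and sorting needs at least n! of them, so

    2ʰ ≥ n!  ⇒  h ≥ log₂(n!) ≥ log₂((n/2)^(n/2)) = (n/2)·log₂(n/2) = Ω(n log n),

using the fact that the largest n/2 factors of n! are each at least n/2.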
Why can’t any comparison sort be faster than n log n in general?
Because doing fewer comparisons would leave a decision tree with too few leaves to distinguish all n! input orderings, violating the decision-tree lower bound.
Why is MergeSort considered optimal in complexity?
Because it matches the Ω(n log n) lower bound in worst-case time.
What does MergeSort rely on for correctness?
Correctness of the merge operation and a loop invariant during merging.
What is a loop invariant in MergeSort’s merge step?
At the start of each iteration of the merge loop, the output array holds the smallest elements taken so far from the two input subarrays, in sorted order.
What is the significance of the merge operation in MergeSort?
It guarantees that two sorted subarrays are merged into one sorted array.
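A minimal Python sketch of a standard two-pointer merge (illustrative, not the course's exact pseudocode), with the loop invariant noted in comments:

    def merge(left, right):
        # Invariant: at the start of each loop iteration, `out` holds the
        # smallest len(out) elements of left + right, in sorted order.
        out = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        # One input is exhausted; the remaining elements are already sorted
        # and all at least as large as everything in `out`.
        out.extend(left[i:])
        out.extend(right[j:])
        return out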
Is Shellsort faster than insertion sort?
Yes, in practice: comparing and moving elements across large gaps removes disorder quickly, although its worst-case complexity is still worse than Θ(n log n).
What is Shellsort’s worst-case complexity with better gap sequences?
O(n^(3/2)), for example with Hibbard's gap sequence (gaps of the form 2ᵏ − 1).
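A short Shellsort sketch in Python (illustrative), using Hibbard's gaps 2ᵏ − 1, the sequence to which the O(n^(3/2)) bound applies:

    def shellsort(a):
        n = len(a)
        # Hibbard's gap sequence: 1, 3, 7, 15, ... (largest gap below n).
        gaps = []
        k = 1
        while (1 << k) - 1 < n:
            gaps.append((1 << k) - 1)
            k += 1
        for gap in reversed(gaps):
            # Gapped insertion sort: after this pass, elements gap apart are in order,
            # so later passes with smaller gaps find far less disorder.
            for i in range(gap, n):
                key = a[i]
                j = i
                while j >= gap and a[j - gap] > key:
                    a[j] = a[j - gap]
                    j -= gap
                a[j] = key
        return a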
What happens if a sorting algorithm skips possible permutations in the decision tree?
Then some input ordering has no leaf of its own, so the algorithm must produce incorrect output on at least one input.
What does the decision tree lower bound prove about MergeSort?
That no other comparison-based sort can asymptotically beat its performance.