Asymptotic Complexity Flashcards
(18 cards)
Why don’t we use a stopwatch to measure code performance?
Because it’s too imprecise due to human error and system noise.
Why is system time an unreliable metric for program complexity?
Because it’s affected by many external factors like OS activity and hardware.
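To see the noise directly, here is a minimal sketch (assuming CPython and the standard-library time module; the work function is purely illustrative) that times the identical computation several times and typically prints different numbers on each trial:

```python
import time

def work(n=100_000):
    # Same deterministic computation every time.
    return sum(i * i for i in range(n))

# Time the identical computation repeatedly; the results differ
# because of OS scheduling, caching, and other system noise.
for trial in range(5):
    start = time.perf_counter()
    work()
    elapsed = time.perf_counter() - start
    print(f"trial {trial}: {elapsed:.6f} s")
```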
What is the best way to analyze algorithm performance?
Count the number of operations as a function of input size.
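A minimal sketch of operation counting: rather than timing, we count how often the inner step runs as a function of the input size n (count_pairwise_comparisons is an illustrative name, not a standard API):

```python
def count_pairwise_comparisons(items):
    # Count how many comparisons a naive all-pairs check performs.
    ops = 0
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1          # one comparison per inner iteration
    return ops                # equals n*(n-1)/2, i.e. roughly n^2/2

for n in (10, 100, 1000):
    print(n, count_pairwise_comparisons(list(range(n))))
```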
What does T(n) = 3n² + 2n + 3 mean in complexity?
The algorithm performs roughly 3n² + 2n + 3 operations for input size n.
Which term dominates in 3n² + 2n + 3 as n grows?
The 3n² term; it dominates for large n and determines the asymptotic growth, so T(n) grows like n².
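To make the dominance concrete, a small sketch that evaluates T(n) = 3n² + 2n + 3 for growing n and reports what fraction of the total the 3n² term contributes:

```python
def T(n):
    return 3 * n**2 + 2 * n + 3

for n in (10, 100, 1000, 10_000):
    quadratic = 3 * n**2
    share = quadratic / T(n)   # fraction of T(n) due to the 3n^2 term
    print(f"n={n:>6}: 3n^2 accounts for {share:.4%} of T(n)")
```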
What is Big-O notation used for?
Describing an upper bound on an algorithm’s growth rate; it is most commonly quoted for the worst case.
What is the formal definition of Big-O?
f(n) = O(g(n)) if ∃ c > 0, n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
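A sketch that checks the definition empirically for f(n) = 3n² + 2n + 3 and g(n) = n², using the hand-picked witnesses c = 4 and n₀ = 3 (these constants are assumptions chosen for this example, not computed by any library):

```python
def f(n):
    return 3 * n**2 + 2 * n + 3

def g(n):
    return n**2

# Witnesses for f(n) = O(g(n)): c = 4, n0 = 3 (hand-picked constants).
c, n0 = 4, 3
assert all(f(n) <= c * g(n) for n in range(n0, 10_000)), "bound violated"
print("f(n) <= 4*g(n) holds for all checked n >= 3")
```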
What does Big-O ignore?
Constant factors and lower-order terms.
Is 3n² in O(n²)?
Yes — constants are ignored in Big-O.
Is n log n in O(n²)?
Yes — n log n grows slower than n².
Is 2ⁿ in O(n²)?
No — exponential growth outpaces polynomial growth.
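One sketch covering the three membership questions above: if f(n)/n² stays bounded as n grows, f(n) is in O(n²); if the ratio keeps growing, it is not. The dictionary keys are illustrative labels only:

```python
import math

candidates = {
    "3n^2":    lambda n: 3 * n**2,
    "n log n": lambda n: n * math.log(n),
    "2^n":     lambda n: 2.0**n,
}

# Ratio f(n) / n^2: bounded => in O(n^2), unbounded => not in O(n^2).
for name, f in candidates.items():
    ratios = [f(n) / n**2 for n in (10, 20, 40, 80)]
    print(name, [f"{r:.3g}" for r in ratios])
```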
What is the runtime of selection sort?
O(n²), because its nested loops repeatedly scan the unsorted suffix for the minimum element.
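A straightforward selection sort sketch; the nested loops perform about n²/2 comparisons no matter how the input is ordered, which is where the O(n²) comes from:

```python
def selection_sort(a):
    a = list(a)                      # work on a copy
    n = len(a)
    for i in range(n):
        # Find the index of the minimum in the unsorted suffix a[i:].
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
    return a

print(selection_sort([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]
```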
What is Big-Ω notation?
A lower bound on an algorithm’s growth rate: f(n) = Ω(g(n)) if f(n) ≥ c·g(n) for some c > 0 and all n ≥ n₀ (often quoted for the best case, though the two ideas are distinct).
What is Big-Θ notation?
A tight bound — both upper and lower bounds match asymptotically.
What does T(n) = Θ(n²) mean?
The algorithm grows exactly like n² for large n.
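A sketch of what a Θ(n²) claim means in practice: the same f(n) is sandwiched between constant multiples of n². The constants c₁ = 3, c₂ = 4 and the threshold n₀ = 3 are hand-picked assumptions for this particular f:

```python
def f(n):
    return 3 * n**2 + 2 * n + 3

# Theta(n^2): sandwich f between c1*n^2 and c2*n^2 for all n >= n0.
c1, c2, n0 = 3, 4, 3
ok = all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("3*n^2 <= f(n) <= 4*n^2 for all checked n >= 3:", ok)
```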
What does it mean if an algorithm is O(1)?
It runs in constant time — performance does not depend on input size.
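A small sketch contrasting a constant-time operation with a linear-time one (in CPython, list indexing is O(1), while summing a list visits every element):

```python
def first_element(items):
    # O(1): one index operation, independent of len(items).
    return items[0]

def total(items):
    # O(n): touches every element once.
    return sum(items)

data = list(range(1_000_000))
print(first_element(data), total(data))
```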
What is a useful heuristic when comparing complexity classes?
Exponential > Polynomial > Logarithmic > Constant
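A small sketch that prints representative functions from each class in the heuristic (plus a linear example for reference) so the ordering is visible numerically; n is kept modest so the exponential values stay readable:

```python
import math

classes = [
    ("constant",    lambda n: 1),
    ("logarithmic", lambda n: math.log2(n)),
    ("linear",      lambda n: n),
    ("quadratic",   lambda n: n**2),
    ("exponential", lambda n: 2**n),
]

for n in (8, 16, 32):
    row = ", ".join(f"{name}={f(n):.0f}" for name, f in classes)
    print(f"n={n}: {row}")
```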
Why is Big-O more useful than raw timings?
Because it abstracts away machine-dependent behavior and focuses on scalability.