Algorithms Flashcards
(101 cards)
Euclid's equality
gcd(m,n) = gcd(n, m mod n)
euclid code
Euclid(m, n)
  while n != 0 do
    r <- m mod n
    m <- n
    n <- r
  return m
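The pseudocode above translates directly to Python (a sketch; the function name `euclid` is my choice):

```python
def euclid(m, n):
    """Iterative Euclidean algorithm: gcd(m, n) = gcd(n, m mod n)."""
    while n != 0:
        m, n = n, m % n   # r <- m mod n; m <- n; n <- r
    return m
```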
what is the Sieve of Eratosthenes
how many numbers do we need to check up to
prove it
The Sieve of Eratosthenes gives a way to list all primes up
to some integer n
We only need to check all numbers up to √n
* Assume there is a composite number x ≤ n whose smallest divisor a satisfies a > √n
* Then b = x/a < √n would also be a number that divides x
* But since we have gone through all numbers smaller than √n,
we must have already eliminated all multiples of (factors of)
b, including x ⇒ contradiction
Sieve of Eratosthenes algorithm
Eratosthenes(n)
// input: positive integer n > 1
// output: boolean array A of length n in which A[i] is true if i is prime
  for p <- 2 to n do
    A[p] <- true
  for p <- 2 to √n do
    if A[p] is true then
      j <- p * p
      while j <= n do
        A[j] <- false
        j <- j + p
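A runnable Python version of the sieve (a sketch following the card's pseudocode; indexing A from 0 to n and marking 0 and 1 as non-prime, which the pseudocode leaves implicit):

```python
def eratosthenes(n):
    """Return a boolean list A of length n+1 where A[i] is True iff i is prime."""
    A = [True] * (n + 1)
    A[0] = A[1] = False        # 0 and 1 are not prime
    p = 2
    while p * p <= n:          # only need to check numbers up to sqrt(n)
        if A[p]:
            j = p * p          # smaller multiples were already crossed out
            while j <= n:
                A[j] = False
                j += p
        p += 1
    return A
```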
binary search algorithm
and time complexity
binary_search(a, x) // input: sorted array a of n elements, target x
  first <- 0
  last <- n - 1
  while first <= last do
    mid <- ⌊(first + last) / 2⌋
    if x = a[mid] then
      return true
    if x < a[mid] then
      last <- mid - 1
    else
      first <- mid + 1
  return false
O(log n)
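The card's binary search in Python (a sketch; note the integer division for `mid`):

```python
def binary_search(a, x):
    """Return True iff x occurs in the sorted list a. O(log n) comparisons."""
    first, last = 0, len(a) - 1
    while first <= last:
        mid = (first + last) // 2   # floor of the midpoint
        if x == a[mid]:
            return True
        if x < a[mid]:
            last = mid - 1          # search the left half
        else:
            first = mid + 1         # search the right half
    return False
```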
rule to show O(f) (function grows no faster than f(n))
rule for Ω(f) (function grows at least as fast as f(n))
rule for Θ(f) (function grows like f(n))
rule for o(f) (f(n) grows faster)
rule for ω(f) ( g(n) grows faster)
g(n) ≤ c · f(n) (big O)
g(n) ≥ c · f(n) (big Omega)
g ∈ O(f) ∧ g ∈ Ω(f) (big Theta)
lim_{n→∞} g(n)/f(n) = 0 (little o)
lim_{n→∞} f(n)/g(n) = 0 (little omega)
hierarchy of growth
1 ≺ log(n) ≺ log^2(n) ≺ log^3(n) ≺ ∛n ≺ √n ≺ n ≺ n · log(n) ≺ n · √n ≺ n^2 ≺ n^3 ≺ 2^n ≺ 3^n ≺ n!
What to do with sum of functions growth rate
What to do with product of functions growth rate
pick the Highest growth rate
multiply the individual growth rates 4 · n · n ∈ O(n^2)
prove loga(n) = logb(n)/logb(a)
Let loga(n) = x, meaning a^x = n
Take logb of both sides: logb(a^x) = logb(n)
x · logb(a) = logb(n)
x = logb(n)/logb(a)
Therefore loga(n) = logb(n)/logb(a)
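The identity can be checked numerically (the values a = 2, b = 10, n = 1024 are my illustration):

```python
import math

# Check log_a(n) = log_b(n) / log_b(a) with a = 2, b = 10, n = 1024
lhs = math.log(1024, 2)                      # log base 2 directly
rhs = math.log10(1024) / math.log10(2)       # via change of base
assert abs(lhs - rhs) < 1e-9                 # both equal 10
```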
L’Hopital’s Rule
lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n),
where f′(n) and g′(n) are the derivatives of f(n) and g(n)
Selection sort
selection_sort(A)
  for i <- 0 to n-2 do
    min <- i
    for j <- i+1 to n-1 do
      if A[j] < A[min] then
        min <- j
    swap A[i] and A[min]
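A Python sketch of the selection sort card (sorting in place and returning the list for convenience):

```python
def selection_sort(A):
    """In-place selection sort: repeatedly move the minimum to the front."""
    n = len(A)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if A[j] < A[min_idx]:
                min_idx = j
        A[i], A[min_idx] = A[min_idx], A[i]   # swap A[i] and A[min]
    return A
```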
bubble sort
complexity
Bubble_sort(A)
  swap <- true
  while swap do
    swap <- false
    for i <- 0 to n-2 do
      if A[i] > A[i+1] then
        swap A[i] and A[i+1]
        swap <- true
O(n^2)
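The same bubble sort in Python (a sketch; the `swapped` flag gives the early exit when a pass makes no swaps):

```python
def bubble_sort(A):
    """Bubble sort with early exit when a full pass makes no swaps."""
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(A) - 1):
            if A[i] > A[i + 1]:
                A[i], A[i + 1] = A[i + 1], A[i]
                swapped = True
    return A
```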
Merge Sort Idea
Divide the array of numbers into two halves until each subarray has only one element
Merge sorted subarrays into larger sorted subarrays until only one sorted array is left
Merge Sort Algorithm
Algorithm Merge_Sort(A) // input: array A of n integers
  if n > 1 then
    copy first half of array A to new array B
    copy second half of array A to new array C
    Merge_Sort(B)
    Merge_Sort(C)
    Merge(B, C, A)
Merge algorithm
Merge(B, C, A) // A for storing results
// input: sorted arrays B and C of length p and q
// output: sorted combined array A of length p + q
  i <- 0; j <- 0; k <- 0
  while i < p and j < q do
    if B[i] <= C[j] then
      A[k] <- B[i]
      i <- i + 1
    else
      A[k] <- C[j]
      j <- j + 1
    k <- k + 1
  copy elements from the remaining sequence to the end of array A
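Both cards combined as runnable Python (a sketch; `B` and `C` are slice copies, and the final line handles the leftover tail of whichever input is not exhausted):

```python
def merge(B, C, A):
    """Merge sorted lists B and C into A, where len(A) == len(B) + len(C)."""
    i = j = k = 0
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:
            A[k] = B[i]
            i += 1
        else:
            A[k] = C[j]
            j += 1
        k += 1
    # copy elements from the remaining sequence to the end of A
    A[k:] = B[i:] if i < len(B) else C[j:]

def merge_sort(A):
    """Recursively split A in half, sort the halves, and merge them back into A."""
    if len(A) > 1:
        B = A[:len(A) // 2]   # copy of first half
        C = A[len(A) // 2:]   # copy of second half
        merge_sort(B)
        merge_sort(C)
        merge(B, C, A)
    return A
```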
how efficient is merge sort
O(n * log(n))
Quick sort idea
- Pick any element in the array as the pivot element
- Partition elements into a left part (elements smaller than the pivot) and a right part (elements larger than the pivot)
- Repeat the same procedure for both parts
Quick Sort Algorithm
Quick_Sort(A, l, r) // input: array A of n integers, bounds l and r
  if l < r then
    p <- Partition(A, l, r)
    Quick_Sort(A, l, p-1)
    Quick_Sort(A, p+1, r)
Partition idea
Input: array A and index p of the pivot element
Output: array A with rearranged elements
- Scan the array from the left and from the right using pointers i and j
- Whenever A[i] > A[p] and A[j] < A[p], swap A[i] and A[j]
- When the pointers meet somewhere in the middle, stop scanning
- Swap the pivot element into the middle
Partition Code
Partition(A, l, r) // input: array A, partition bounds l and r; pivot is A[r]
  i <- l - 1
  j <- r
  p <- r
  while i < j do
    do i <- i + 1 while (i < j and A[i] < A[p])
    do j <- j - 1 while (j > i and A[j] > A[p])
    if i >= j then
      swap A[i] and A[p]
    else
      swap A[i] and A[j]
  return i
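A Python sketch of the two quick sort cards together (the inner `while` loops emulate the pseudocode's do-while: the pointer always moves at least one step; returning `i` as the pivot's final index is my reading of the card):

```python
def partition(A, l, r):
    """Partition A[l..r] around the pivot A[r]; return the pivot's final index."""
    p = r
    i, j = l - 1, r
    while i < j:
        i += 1                              # do-while: advance at least once
        while i < j and A[i] < A[p]:
            i += 1
        j -= 1
        while j > i and A[j] > A[p]:
            j -= 1
        if i >= j:
            A[i], A[p] = A[p], A[i]         # swap pivot into the middle
        else:
            A[i], A[j] = A[j], A[i]
    return i

def quick_sort(A, l, r):
    """Sort A[l..r] in place by partitioning and recursing on both parts."""
    if l < r:
        p = partition(A, l, r)
        quick_sort(A, l, p - 1)
        quick_sort(A, p + 1, r)
    return A
```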
Master theorem formula
how to compute d
what are three cases
how to determine c
T(n) = a · T(n/b) + f(n)
d = logb(a)
Case 1: Θ(n^d)
  f(n) ∈ O(n^c) for some constant c < d
Case 2: Θ(n^d · log^(k+1)(n))
  f(n) ∈ Θ(n^d · log^k(n)) for some k ≥ 0
Case 3: Θ(f(n))
  f(n) ∈ Ω(n^c) for some constant c > d, and given
  that a · f(n/b) ≤ k · f(n) for some constant k < 1 and large
  enough n
c is the power of n in f(n); if f(n) contains no power of n, then c = 0
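A worked example (merge sort's recurrence, a standard illustration not taken from the cards above):

```latex
T(n) = 2\,T(n/2) + n \quad\Rightarrow\quad a = 2,\ b = 2,\ f(n) = n \\
d = \log_b(a) = \log_2(2) = 1, \qquad c = 1 \\
c = d \text{ with } k = 0 \ \Rightarrow\ \text{Case 2: } T(n) \in \Theta(n^d \log^{k+1} n) = \Theta(n \log n)
```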
what does rank k mean
An element of a totally ordered set has rank k if exactly k − 1
elements are smaller than this element
Quick sort best case
prove
requires n log(n) + O(n) comparisons
recurrence relation: quick sort recursively calls itself on two subproblems of half the size.
Tqs(n) = Tqs((n-1)/2) + Tqs((n-1)/2) + n
Tqs(1) = 0
Quick Sort worst case
requires O(n^2)
Recurrence relation
Tqs(n) = Tqs(n-1) + n-1 for n>=2
Tqs(1) = 0
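Unrolling the worst-case recurrence (a standard expansion, added for illustration):

```latex
T_{qs}(n) = T_{qs}(n-1) + (n-1)
          = (n-1) + (n-2) + \dots + 1
          = \frac{n(n-1)}{2} \in O(n^2)
```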