Final review cards
(89 cards)
Computational complexity?
- How does the running time of an operation depend on N, the number of digits?
- Addition: some things are done in a for loop N times, other things once
- Little operations that don't depend on N are constants
- c1 + c3 + c2·N: only focus on the N term, not the constants
- Multiplication: some things are done N times, some things N squared times, some things once
- Time proportional to N is O(N), "order N": addition is O(N), multiplication is O(N^2)
How many bits N do we need to represent m?
between log2(m) and log2(m) + 1, i.e. floor(log2 m) + 1 bits
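A minimal Java sketch of the bit count above (the class and method names are illustrative): each halving strips one bit, so the loop runs floor(log2 m) + 1 times.

```java
// Sketch: number of bits needed to represent a positive integer m.
// The answer is floor(log2 m) + 1, which lies between log2(m) and log2(m) + 1.
public class BitCount {
    public static int bitsNeeded(int m) {
        int bits = 0;
        while (m > 0) {   // each division by 2 strips one binary digit
            m = m / 2;
            bits++;
        }
        return bits;
    }
}
```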
List types info
- Arrays have constant-time access
- ArrayList: an ordered set of elements. For reference types, the objects are not inside the ArrayList; they are referenced by it
- Linked list: nodes holding objects, with references to the first (head) and last (tail) nodes; supports addFirst, removeFirst, addLast, removeLast
- For a doubly linked list, removing the last item is easier: no need to loop through everything to find the node before the tail
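A minimal sketch of the singly linked list described above (class and field names are illustrative, not from the course). Keeping head and tail references makes addFirst, removeFirst, and addLast O(1); only removeLast would still need a full walk.

```java
// Singly linked list sketch: head/tail references give O(1) adds at both ends.
public class SList<T> {
    private static class Node<T> {
        T elem;
        Node<T> next;
        Node(T e) { elem = e; }
    }

    private Node<T> head, tail;
    private int size;

    public void addFirst(T e) {        // O(1): relink the head
        Node<T> n = new Node<>(e);
        n.next = head;
        head = n;
        if (tail == null) tail = n;
        size++;
    }

    public T removeFirst() {           // O(1): advance the head
        T e = head.elem;
        head = head.next;
        if (head == null) tail = null;
        size--;
        return e;
    }

    public void addLast(T e) {         // O(1) because we keep a tail reference
        Node<T> n = new Node<>(e);
        if (tail == null) { head = tail = n; }
        else { tail.next = n; tail = n; }
        size++;
    }

    public int size() { return size; }
}
```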
Time complexity in list types
operation       ArrayList   SLinkedList   DLinkedList
- addFirst      O(N)        O(1)          O(1)
- removeFirst   O(N)        O(1)          O(1)
- addLast       O(1)        O(1)          O(1)
- removeLast    O(1)        O(N)          O(1)
- remove(i)     O(N)        O(N)          O(N)
Abstract Data Type (ADT)
- defines a data type by its values and operations from the user's perspective, ignoring implementation; more abstract than a data structure
Stack
- push, pop, isEmpty, peek; last in, first out (LIFO)
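The stack operations above can be sketched with a linked "top" chain, so every operation is O(1). This is an illustrative implementation, not the one from the course.

```java
import java.util.NoSuchElementException;

// Stack sketch: last in, first out. push/pop/peek all touch only the top node.
public class LinkedStack<T> {
    private static class Node<T> {
        T elem;
        Node<T> next;
        Node(T e, Node<T> n) { elem = e; next = n; }
    }

    private Node<T> top;

    public void push(T e) { top = new Node<>(e, top); }

    public boolean isEmpty() { return top == null; }

    public T peek() {                      // look at the top without removing it
        if (isEmpty()) throw new NoSuchElementException();
        return top.elem;
    }

    public T pop() {                       // remove and return the most recent push
        T e = peek();
        top = top.next;
        return e;
    }
}
```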
Queue
- enqueue, dequeue; first in, first out (FIFO)
- slow when implemented with an ArrayList (each dequeue requires shifting every element), and slow with a plain expanding array; use a circular array instead. If it needs to be expanded, copy the elements starting from the head to the front of the new array
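A sketch of the circular-array queue described above (names are illustrative). The head index wraps around with modular arithmetic, and when the array fills up, the grow step copies the elements starting from the head to the front of a bigger array.

```java
// Circular-array queue sketch: enqueue/dequeue are O(1) amortized,
// avoiding the O(N) shifting an ArrayList-backed queue would need.
public class CircularQueue {
    private int[] data;
    private int head = 0, size = 0;

    public CircularQueue(int capacity) { data = new int[capacity]; }

    public void enqueue(int e) {
        if (size == data.length) grow();              // full: expand first
        data[(head + size) % data.length] = e;        // slot just past the tail
        size++;
    }

    public int dequeue() {
        int e = data[head];
        head = (head + 1) % data.length;              // wrap around the array end
        size--;
        return e;
    }

    private void grow() {
        int[] bigger = new int[2 * data.length];
        for (int i = 0; i < size; i++)
            bigger[i] = data[(head + i) % data.length];  // unwrap: head moves to front
        data = bigger;
        head = 0;
    }

    public int size() { return size; }
}
```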
bubble sort
- time O(N^2)
- repeatedly loop through the list; on each pass, swap any two adjacent elements that are in the wrong order
- have to use a temp variable for the swap
- at the end of the first pass the largest element is at the end of the list, but the smallest element can still be anywhere
- add a boolean flag recording whether any swaps were made, so the sort can stop early
- add a variable because each walk is one element shorter than the last
- best case 1 pass (about N comparisons), worst case about N·N/2 comparisons
- passes through one more time after the list is sorted to make sure no more swaps are needed
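The bullets above can be sketched as follows (an illustrative implementation): the `swapped` flag gives the early exit, and `end` shrinks because each pass parks one more element at the back.

```java
// Bubble sort sketch: O(N^2) worst case, one extra confirming pass in the best case.
public class BubbleSort {
    public static void sort(int[] a) {
        int end = a.length - 1;            // each pass places one more element at the end
        boolean swapped = true;
        while (swapped) {
            swapped = false;
            for (int i = 0; i < end; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];        // temp variable for the swap
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            end--;                         // the largest of this pass is now in place
        }
    }
}
```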
Selection sort
- split the list in two: the part at the front is sorted and grows by one each pass. Go through the rest, find the minimum, and tack it onto the sorted part
- swaps the minimum with the element at the beginning of the unsorted part
- how many passes through the inner loop? N(N-1)/2
- O(N^2)
- takes the same amount of time regardless of best or worst case
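A sketch of the selection sort described above (illustrative code): the inner loop always scans the whole unsorted suffix, which is why best and worst cases take the same time.

```java
// Selection sort sketch: grow a sorted prefix by swapping in the minimum of the rest.
public class SelectionSort {
    public static void sort(int[] a) {
        for (int k = 0; k < a.length - 1; k++) {
            int min = k;
            for (int i = k + 1; i < a.length; i++)   // find the minimum of the suffix
                if (a[i] < a[min]) min = i;
            int tmp = a[k];                           // swap it to the front of the suffix
            a[k] = a[min];
            a[min] = tmp;
        }
    }
}
```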
Insertion sort
Insert the element at index k into its correct position with respect to the already-sorted elements at indices 0 .. k-1
- shift items down, inserting each item one by one into its correct place
- best case time ~N, worst case ~N^2
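An illustrative sketch of the insertion sort above: the while loop does the shifting, and on an already-sorted input it never runs, giving the ~N best case.

```java
// Insertion sort sketch: shift larger elements right, then drop a[k] into the gap.
public class InsertionSort {
    public static void sort(int[] a) {
        for (int k = 1; k < a.length; k++) {
            int cur = a[k];
            int i = k - 1;
            while (i >= 0 && a[i] > cur) {   // shift the sorted prefix right
                a[i + 1] = a[i];
                i--;
            }
            a[i + 1] = cur;                  // insert into the correct place
        }
    }
}
```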
1 + 2 + 3 + … + N
sequence that sums to n(n+1)/2
Induction steps
To show P(n) is true for all n >= x:
- prove the base case P(x)
- show that for any k >= x, if P(k) is true, then P(k+1) is true
Tower of Hanoi
- uses recursion: move the top n-1 discs to the spare peg, move the largest disc to the target, then move the n-1 discs back on top
- tower(n) calls tower(n-1) twice, so T(n) = 2T(n-1) + 1 = 2^n - 1 moves
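The recursion can be sketched as follows (an illustrative version that returns the move list): two recursive calls on n-1 discs around one direct move, hence 2^n - 1 moves total.

```java
import java.util.ArrayList;
import java.util.List;

// Tower of Hanoi sketch: T(n) = 2T(n-1) + 1, which unrolls to 2^n - 1 moves.
public class Hanoi {
    public static List<String> solve(int n, char from, char to, char aux) {
        List<String> moves = new ArrayList<>();
        if (n == 0) return moves;
        moves.addAll(solve(n - 1, from, aux, to));   // n-1 discs onto the spare peg
        moves.add(from + "->" + to);                  // largest disc straight across
        moves.addAll(solve(n - 1, aux, to, from));   // n-1 discs back on top of it
        return moves;
    }
}
```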
Call stack
stack that records which methods are currently being executed
- the call stack is used for recursion (factorial(2) is pushed, factorial(1) is pushed, then 1 is popped, then 2 is popped)
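The factorial example above can be sketched as a plain recursive method: each call's stack frame waits until the call below it returns, then the frames pop in reverse order.

```java
// Recursive factorial sketch: frames are pushed down to the base case,
// then popped back up, multiplying on the way out.
public class Factorial {
    public static long factorial(int n) {
        if (n <= 1) return 1;             // base case: nothing more to push
        return n * factorial(n - 1);      // this frame waits while n-1 runs
    }
}
```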
Fast power method
- divide n by 2 and square: x^n = x^(n/2) · x^(n/2) if n is even, x^(n/2) · x^(n/2) · x if odd. In other words, a 0 bit of n costs 1 multiplication and a 1 bit costs 2
- still slows down on big numbers, since we are multiplying bigger and bigger values (multiplication is O(N) in the number of digits, so longer numbers take more time)
- the fast recursive version takes the same amount of time as the fast iterative one
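A sketch of the repeated-squaring idea above (illustrative code, using `long` so overflow is ignored for small inputs): O(log n) recursive calls instead of n-1 multiplications.

```java
// Fast power sketch: halve the exponent and square, O(log n) multiplications.
public class FastPower {
    public static long power(long x, int n) {
        if (n == 0) return 1;
        long half = power(x, n / 2);
        if (n % 2 == 0)
            return half * half;          // even exponent: 1 multiplication
        else
            return half * half * x;      // odd exponent: 2 multiplications
    }
}
```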
Binary search
search by going to the middle of the (sorted) list: if the target is smaller, search the part before; if larger, search the part after, taking the midpoint of that subset as the new middle
- can be called recursively, changing low or high in the recursive call
- O(log2 n): the time taken by binary search, fast power x^n, and converting a number to binary
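The recursive version described above can be sketched as follows (illustrative code; the array must already be sorted). Each call halves [low, high], giving the O(log2 n) bound.

```java
// Binary search sketch: returns the index of target, or -1 if absent.
public class BinarySearch {
    public static int search(int[] a, int target) {
        return search(a, target, 0, a.length - 1);
    }

    private static int search(int[] a, int target, int low, int high) {
        if (low > high) return -1;                       // empty range: not found
        int mid = (low + high) / 2;
        if (a[mid] == target) return mid;
        else if (target < a[mid])
            return search(a, target, low, mid - 1);      // smaller: search before mid
        else
            return search(a, target, mid + 1, high);     // larger: search after mid
    }
}
```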
Fast computing time
- need something faster than N squared: on big inputs that can take minutes, hours, years
- need a sort that is between O(n) (it's not going to be faster than order n) and O(n^2), i.e. faster than n^2
Mergesort
- break the list into 2 equal parts
- sort each half recursively: keep breaking in half until each list has 1 element, then merge
- merge the two halves (look at the head of each and add the smaller one; when one list is empty, add the rest of the other)
- time proportional to n log n (close to n, much less than n squared)
- way better for big inputs than the quadratic sort methods
- slower than quicksort in practice because it uses an extra list
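The steps above can be sketched as follows (illustrative code that returns a new array, which is exactly the "extra list" the card mentions):

```java
import java.util.Arrays;

// Mergesort sketch: split, sort each half recursively, merge by taking
// the smaller head each time. O(n log n), but needs extra arrays.
public class MergeSort {
    public static int[] sort(int[] a) {
        if (a.length <= 1) return a;                       // a 1-element list is sorted
        int mid = a.length / 2;
        int[] left = sort(Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    private static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)        // add the smaller head
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];      // one side empty:
        while (j < right.length) out[k++] = right[j++];    // copy the rest of the other
        return out;
    }
}
```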
quicksort
- remove a pivot: an item from the list
- make a list of elements less than the pivot and a list of elements greater than or equal to the pivot
- recursively sort both
- the two lists are not the same size; with a bad pivot, everything lands in one list
- concatenate list 1, the pivot, and list 2
- don't actually need new lists: can be done in place using only the original array, with no extra memory
- move items greater than the pivot after it, items less than the pivot before it
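The in-place version can be sketched as follows (illustrative code; the pivot choice here is simply the last element, not a refined one like median-of-3): partition puts smaller items before the pivot and larger ones after, then each side is sorted recursively.

```java
// In-place quicksort sketch: partition around a pivot, recurse on both sides.
public class QuickSort {
    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int low, int high) {
        if (low >= high) return;              // 0 or 1 elements: already sorted
        int p = partition(a, low, high);
        sort(a, low, p - 1);                  // elements less than the pivot
        sort(a, p + 1, high);                 // elements greater than or equal
    }

    private static int partition(int[] a, int low, int high) {
        int pivot = a[high];                  // last element as the pivot
        int i = low;                          // next slot for a "smaller" element
        for (int j = low; j < high; j++)
            if (a[j] < pivot) swap(a, i++, j);
        swap(a, i, high);                     // pivot into its final position
        return i;
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
    }
}
```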
recurrence relation
- a sequence of numbers where t(n) is defined in terms of earlier terms
- in computing: the time it takes to solve a problem of size n
- ex: t(n) = c + t(n-1) -> c + c + t(n-2) -> … -> cn + t(0)
- i.e., the time is proportional to N
- worst-case time for quicksort is n squared (bad pivot); pick the middle of 3 values as the pivot to avoid this
Big O
t(n) is a function that describes the time an algorithm takes on input size n; we need to describe how t(n) grows as n becomes large (asymptotically)
- unlike a limit, compare it to a simple function like n squared
- graphically: for all large enough n, t(n) is asymptotically bounded above by g(n)
- t(n) is O(g(n)) if there are two constants n0 and c such that for all n > n0, t(n) <= c·g(n)
Big omega Ω
- lower bounds: takes at least ___ time on an input of size n
- t(n) is bounded below asymptotically by g(n) if, beyond a certain n0, c·g(n) stays below t(n)
- 2 constants (like big O): t(n) is Ω(g(n)) if there are constants n0 and c > 0 such that for all n > n0, t(n) >= c·g(n)
- same rules as big O
constant factor rule
if f(n) is O(g(n)) and a is a positive constant,
- then a·f(n) is O(g(n))
sum rule
- if f1(n) is O(g(n)) and f2(n) is O(g(n)), then f1(n) + f2(n) is O(g(n))
- if f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then f1(n) + f2(n) is O(g1(n) + g2(n))