Algorithmics Flashcards

1
Q

Stack Operations

A

push
peek
pop
isEmpty
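
A minimal Python sketch of these operations, backed by a list (the Stack class and names here are illustrative, not from the notes):

```python
# Stack sketch backed by a Python list; the top of the stack is the
# end of the list, so all four operations are O(1) amortised.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, x):
        self._items.append(x)

    def peek(self):
        return self._items[-1]   # look at the top without removing it

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items
```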

2
Q

Queue Operations

A

enqueue
peek
dequeue
isEmpty
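
A matching Python sketch using collections.deque, so dequeuing from the front is O(1) (the Queue class is illustrative, not from the notes):

```python
from collections import deque

# FIFO queue sketch: enqueue at the back, dequeue from the front.
class Queue:
    def __init__(self):
        self._items = deque()

    def enqueue(self, x):
        self._items.append(x)

    def peek(self):
        return self._items[0]          # front element

    def dequeue(self):
        return self._items.popleft()   # O(1), unlike list.pop(0)

    def is_empty(self):
        return not self._items
```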

3
Q

Priority Queue Additional Operations

A

insert (giving a priority)
findMin
deleteMin/removeMin - removes and returns the lowest element
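
These operations map directly onto Python's heapq module, a binary min-heap over a list (the priorities and labels below are made up for illustration):

```python
import heapq

# Min-priority-queue sketch via heapq: the smallest (priority, item)
# pair is always at index 0.
pq = []
heapq.heappush(pq, (2, "write report"))   # insert with priority 2
heapq.heappush(pq, (1, "plan"))           # insert with priority 1
lowest = pq[0]                            # findMin: peek, no removal
removed = heapq.heappop(pq)               # deleteMin: remove and return
```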

4
Q

List Operations

A

add - shifts elements to make space
remove
set - replaces element at position
get

5
Q

Set Operations

A

add
remove
contains
size
isEmpty

6
Q

Map Operations

A

put
get
remove
size

7
Q

Skip List

Data Structure

A

A hierarchy of linked lists that allows binary-search-style skipping, giving Θ(log(n)) expected search time. Better for concurrent access than binary trees

8
Q

Binary Tree

A

A tree (a connected, acyclic, undirected graph) where each node has at most 2 children

9
Q

Binary Search Tree

A

A binary tree where, for every node, every element in its left subtree is smaller and every element in its right subtree is larger

10
Q

Successor

Binary Search Tree

A

The next largest element. Found by:
1. If the node has a right child, move right once then left as far as possible
2. Otherwise move up while the node is a right child, then up once more
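
The two cases can be sketched in Python for a BST whose nodes carry parent pointers (the Node class is a hypothetical minimal implementation, not from the notes):

```python
# Successor sketch for a parent-linked BST.
class Node:
    def __init__(self, key, parent=None):
        self.key, self.parent = key, parent
        self.left = self.right = None

def successor(node):
    if node.right:                 # case 1: right once, then left
        node = node.right          # as far as possible
        while node.left:
            node = node.left
        return node
    parent = node.parent           # case 2: climb while we are a right
    while parent and node is parent.right:   # child, then up once more
        node, parent = parent, parent.parent
    return parent                  # None if node held the largest key
```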

11
Q

Deletion

Binary Search Tree with 2 children

A

Replace the element with its successor, then remove the successor

12
Q

Single rotation

Binary Search Tree

A

Promote the root of the larger subtree to the top; the old top element becomes its child and adopts the middle subtree

13
Q

Double Rotation

Binary Search Tree

A

Needed when the large subtree is in the middle. First rotate the large subtree to the outside, then do a normal single rotation

14
Q

AVL trees

A

A binary search tree where, for every node, the heights of the left and right subtrees differ by at most 1

15
Q

AVL tree implementation

A

Each node has a balance factor. If its absolute value exceeds 1, do rotations at the deepest unbalanced node.

16
Q

AVL additions

A

Iterate up the tree updating balance factors (increase if the new node is on the left). Stop when a balance factor becomes 0 or its absolute value exceeds 1 (then rotate).

17
Q

AVL deletion

A

Requires updating balance factors and doing rotations, but may need repeated rotations up the tree. Worst case O(log(n)) rotations

18
Q

Red-Black Trees

A
  • Black root
  • Children of red nodes are black
  • The number of black nodes on the route from the root to any exposed node (a node with fewer than 2 children) is the same for all routes
19
Q

Complete Tree

A

All levels are full except possibly the lowest, which is filled from the left.

20
Q

Heap

A

A complete tree where every child is >= its parent, so the root is the smallest element (a min-heap)

21
Q

Adding to Heap

A

Add the element to the next free space in the tree and swap it with its parent until the heap order is correct (percolate up)
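
Percolating up is short on the list representation, where the parent of index k is (k - 1) // 2; a sketch, not from the notes:

```python
# Min-heap insertion sketch: append at the next free slot, then swap
# with the parent until the heap order is correct (percolate up).
def heap_add(heap, x):
    heap.append(x)
    k = len(heap) - 1
    while k > 0:
        parent = (k - 1) // 2
        if heap[parent] <= heap[k]:   # parent already small enough
            break
        heap[k], heap[parent] = heap[parent], heap[k]
        k = parent
```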

22
Q

removeMin

heap priority queue

A
  1. Pop the root
  2. Replace it with the last node in the heap
  3. Swap it with its smallest child until the heap order is restored (percolate down)
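
The same steps on the array representation, with children of index k at 2k + 1 and 2k + 2; a sketch assuming a non-empty min-heap:

```python
# removeMin sketch: pop the root, move the last node up, then percolate
# down by swapping with the smallest child until the order is restored.
def remove_min(heap):
    minimum = heap[0]
    heap[0] = heap[-1]
    heap.pop()
    k, n = 0, len(heap)
    while True:
        left, right = 2 * k + 1, 2 * k + 2
        smallest = k
        if left < n and heap[left] < heap[smallest]:
            smallest = left
        if right < n and heap[right] < heap[smallest]:
            smallest = right
        if smallest == k:
            return minimum
        heap[k], heap[smallest] = heap[smallest], heap[k]
        k = smallest
```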
23
Q

Heap Array Implementation

A

Root stored at 0. Children of node at k are at 2k + 1 and 2k + 2

24
Q

Heap Sort

A

Add all elements to a heap, then remove them again in order. Worst case O(nlog(n)), as we do 2n add/remove operations and each is O(log(n))

25
B-tree
Balanced trees for keeping related data close together. Good for disk access time
26
M-way tree
Allows nodes to have many children, greatly reducing the depth of the tree and hence access time
27
B+ Tree
All data is stored in the leaves. Non-leaf nodes store between M/2 and M-1 keys, each key corresponding to a child subtree (plus one extra child, since children outnumber keys by one). All leaves hold between L/2 and L data entries. Each key is the smallest value in its corresponding subtree; the leftmost subtree has no key.
28
Trie
Multiway tree for storing sets of words. All words end with $. Uses a lot of memory
29
Internal one-way branching | Trie
Edges between non-leaf nodes, can be removed by condensing
30
External one-way branching | Trie
Edges that reach the leaf nodes, can be removed by condensing
31
Suffix Tree
Trie of all suffixes of a string, so every substring is represented. Good for checking whether a string occurs in the text
32
Content-addressable memory
Where we want to access data based on an object's contents rather than its address (use a hash table)
33
Turn Hash into address
Hash % tableSize
34
Hash Collision
Two objects map to same address
35
Separate Chaining | Hashing
Make a linked list to store any hash collisions
36
Open Addressing
Finding a new space in the hash table to put hash collisions
37
Linear Probing
Hash collisions are moved to the next available space. Can cause clustering, which slows lookups
38
Load Factor | hashing
Proportion of full entries in the table
39
Quadratic Probing | hashing
Hash collisions are placed in the next available space 1, 4, 9, ... spaces away (i^2). Reduces clustering, but may fail to place an element even when the table is not full
40
Double Hashing
Hash collisions are placed in the next available space at multiples of a second hash value away (from a different hash function). The table size should be prime so that all spaces can be checked.
41
Element Removal when open addressing | Hash Table
Can be done by replacing removed elements with a dummy entry, counted as full when searching but empty when placing
42
Union-Find
Groups elements together. Operations:
  • union (A, B) - combines the groups containing A and B
  • isConnected (A, B) - checks if A and B are in the same group
  • find (A) - returns the group A is in
43
Dynamic Connectivity | Data Structures
A property of a data structure such that it stores objects where some of them are connected. These connections can change. A Union-Find has dynamic connectivity
44
Equivalence class properties
A relation ∼ on a set of elements such that it is:
  • Reflexive - A ∼ A
  • Symmetric - if A ∼ B then B ∼ A
  • Transitive - if A ∼ B and B ∼ C then A ∼ C
A property of a union-find
45
Quick-Find
Uses a 1-to-1 array of elements and an array of ids (groups). Elements are in the same class (group) if their ids are the same. Allows O(1) find.
  • isConnected (A, B) - checks if A and B have the same id
  • find (A) - returns A's id
  • union (A, B) - changes the id of all elements in group A to the id of group B
46
Quick-Union
Elements are stored in trees; a root's parent is itself. The id of an element is its parent, so that's what the array of ids stores. Bad as O(n) worst case for all operations. Operations:
  • find (A) - returns the id of A's root
  • isConnected (A, B) - true if A and B's roots are the same
  • union (A, B) - makes A's root the child of B's root
47
Weighted (Balanced) Quick-Union
Store an additional size[] array storing the number of elements in each tree. For union (A, B), make the root of the smaller tree the child of the root of the larger tree. Allows for O(log(n)) find and union when doing path compression
48
Path Compression | Quick-Union
Improvement to quick union where elements visited when finding a root have their id set to the root
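
The weighted quick-union with path compression described above can be sketched in Python (the UnionFind class is illustrative, not from the notes):

```python
# Weighted quick-union with path compression sketch.
class UnionFind:
    def __init__(self, n):
        self.id = list(range(n))   # every element starts as its own root
        self.size = [1] * n

    def find(self, a):
        root = a
        while root != self.id[root]:
            root = self.id[root]
        while a != root:           # path compression: point every
            nxt = self.id[a]       # visited node straight at the root
            self.id[a] = root
            a = nxt
        return root

    def is_connected(self, a, b):
        return self.find(a) == self.find(b)

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:   # hang the smaller tree
            ra, rb = rb, ra                 # under the larger root
        self.id[rb] = ra
        self.size[ra] += self.size[rb]
```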
49
Stability | Sorting Algorithm
Stable if the order of elements with the same value does not change
50
In-Place | Sorting Algorithm
If memory used is O(1)
51
Insertion Sort Properties
Stable and in-place, but O(n^2) worst case time complexity. Very good for small arrays
52
Selection Sort
Search for the smallest unsorted element and swap it with the element in the first unsorted position
53
Selection Sort Properties
In-place but not stable, as swaps can change the order of equal elements. O(n^2) time complexity in all cases. Performs few swaps
54
Loop Invariant | Algorithms
Property of a loop that helps show correctness. Must show:
  • Initialisation - it is true before the first iteration
  • Maintenance - if it is true before an iteration, it remains true after
  • Termination - when the loop ends, the invariant shows the algorithm is correct
55
Merge Sort
Divide the array into halves until reaching 1-element arrays, then merge the sorted arrays together until 1 array remains. Stable, O(n) space and O(nlog(n)) time
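
A top-down Python sketch that returns a new list rather than sorting in place:

```python
# Merge sort sketch: split until single elements, then merge in order.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # <= keeps equal elements in
            merged.append(left[i])     # order, so the sort is stable
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```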
56
Merge Sort Improvement
Can use insertion sort on small sub-arrays, due to it being fast on small arrays
57
Quicksort
Choose a pivot and put smaller elements on its left and larger elements on its right, then recurse on each side until every element has been a pivot. Not stable but in-place; O(n^2) worst case but O(nlog(n)) on average
58
Choosing pivot well | quicksort
  • Choose randomly to avoid the worst case (can be done by shuffling the array at the start)
  • Choose the median of the first, middle and last elements
59
Quicksort improvement
Use insertion sort on the small arrays
60
Comparison-Based time complexity
Lower bound of Ω(nlog(n)), due to the minimum number of comparisons needed to distinguish all n! possible orderings
61
Quicksort Partitioning
Algorithm for splitting elements lower and higher than pivot
62
Lomuto Partitioning Scheme | Quicksort Partitioning
2 sections for higher and lower with each element sorted into one of them. O(n) time complexity
63
Hoare Partitioning Scheme | Quicksort Partitioning
Two pointers start at the ends of the array and move inward. When the element at a pointer is on the wrong side relative to the pivot, it is swapped with the offending element at the other pointer. Stop when the pointers cross; the pointer that started on the right marks the partition point. O(n) time complexity
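
A Python sketch of the scheme; taking the first element as pivot is an assumption for illustration:

```python
# Hoare partition sketch: pointers sweep inward, swapping elements on
# the wrong side of the pivot; j is the partition point on return.
def hoare_partition(a, lo, hi):
    pivot = a[lo]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            return j
        a[i], a[j] = a[j], a[i]
```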
64
Knuth Shuffle | Quicksort
Starting from the last element and working backwards, swap each element with a random one in the unshuffled section. O(n)
65
Multi-pivot Quicksort Benefits
Better performance due to reduced cache misses.
66
Radix Sort
Every element must be d characters long, with k possible characters. Sort the array by each character, starting from the last, using a stable sort. Θ(d(n + k)) when using counting sort, with d the number of digits and k the base
67
Counting Sort
Count the occurrences of each element. Turn the counts into a cumulative counter. Loop backwards through the unordered array, place each element in the position given by the cumulative array, then decrease its count by 1.
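
The three passes can be sketched in Python for non-negative integers below k (an assumed input range):

```python
# Counting sort sketch: count, make counts cumulative, then place
# elements in a backwards pass so the sort stays stable.
def counting_sort(a, k):
    count = [0] * k
    for x in a:
        count[x] += 1
    for i in range(1, k):              # cumulative counts = positions
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):
        count[x] -= 1
        out[count[x]] = x
    return out
```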
68
Counting Sort Properties
Stable but not in-place. O(n + k) time and O(n + k) space, with k the base. Good when n >> k; if k > n it is bad, as a large structure must be built
69
Adjacency Matrix | Graph
Represents a graph. A matrix with each node representing a row and column. Intersections between nodes have 1s if an edge is present and a 0 if not
70
Adjacency List | Graph
Represents a graph. Nodes are stored as a list, with each node's connected neighbours stored in its list
71
Euler Trail | Graph
Route through a graph where each edge is traversed exactly once. Requires at most 2 odd-degree nodes
72
Euler Circuit | Graph
Euler trail with the end node being the starting node. Requires 0 odd-degree nodes
73
Euler Circuit Construction | Graph
1. Travel along any adjacent unused edge
2. Repeat step 1 until the loop's start node is reached
3. Mark the loop and move along it until there is an adjacent unused edge, then go to step 1. If the start is reached first (every edge used), the circuit is done
74
Minimum Spanning Tree | Graph
A subgraph that spans all nodes of the parent graph while minimising the total weight of the edges used.
75
Kruskal's Algorithm | Graph
Method for finding a minimum spanning tree. Maintains a forest (set of trees); at each step, the safe edge with the lowest weight is added to the forest
76
Safe Edge | Graph
An edge that can be added without forming a cycle and without preventing a minimum spanning tree from being completed
77
Kruskal's Implementation | Graph
All nodes are initially stored as their own tree in a union-find. When two subtrees are connected, a union is done on them. O(elog(e))
78
Prim's Algorithm | Graph
Method for finding a minimum spanning tree. Contains a tree, each step the safe edge with the lowest weight connected to the tree is added to the tree
79
Prim's Implementation | Graph
Two arrays track the connected and unconnected nodes. When a node is connected to the tree, it is moved to the connected array. O(elog(e))
80
Dijkstra's | Graph
Method for finding the shortest route from a root node to every other node (requires non-negative edge weights)
81
Hamiltonian Path | Graph
A path that goes through each node exactly once
82
Dynamic Programming
The idea of combining smaller overlapping subproblems to solve a given problem. The subproblems are created recursively.
83
Dynamic Programming Steps
1. Find a recursive algorithm to solve the problem
2. Remove recalculation of subproblems using memoization/bottom-up
3. Store the actual solution, not just the values
84
Memoization | Dynamic Programming
The storage of results to sub-problems to prevent their future calculation
85
Bottom-up | Dynamic Programming
The calculation of subproblems starting at the base case then working up to the full problem, replacing the recursive algorithm with an iterative one
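
A classic illustration is Fibonacci: the bottom-up version fills a table from the base cases instead of recursing (a textbook example, not from the notes):

```python
# Bottom-up dynamic programming sketch for Fibonacci numbers.
def fib(n):
    table = [0, 1]                 # base cases F(0), F(1)
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```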
86
P | Complexity Theory
Problems that can be solved in polynomial time
87
NP | Complexity Theory
Problems that can have solutions verified in polynomial time
88
NP to P relationship | Complexity Theory
P ⊆ NP, as finding a solution also verifies it. Assume P ⊂ NP for our uses
89
NP-Hard | Complexity Theory
Problems in NP-Hard are at least as hard as all problems in NP
90
NP-Complete | Complexity Theory
Problems that are in NP-Hard and in NP. i.e. the hardest problems in NP
91
α-approximation algorithm
An algorithm that runs in polynomial time and produces a solution that is within a factor of α of the optimal solution
92
Satisfiability | Algorithms
Given a boolean formula in the structure A ∧ B ∧ C ∧ ... where each letter is a clause of the form A1 ∨ A2 ∨ A3 ∨ ..., is there an assignment that makes the formula true? It is NP-Complete. SAT if you reach an empty formula
93
DPLL Algorithm | Algorithms
Two rules:
  • If a clause has only one literal (a unit clause), that literal must be true
  • If a variable is pure (its negation does not appear), it can be set to true
94
Branch and Bound | Algorithms
When backtracking, instead of trying all possibilities, you can exclude branches that cannot possibly lead to a solution
95
Local Search | Algorithms
Used in optimisation problems. At each step, moves to a neighbouring output that improves in the wanted direction. A greedy strategy, so it can get stuck in local optima
96
Simulated Annealing | Algorithms
At each step, randomly choose a new neighbouring location. If it is better, move to it; if worse, move with a probability that is higher the less bad the move is and the higher the temperature. Probability given by e^(−∆/T)
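
The acceptance rule can be written directly from that formula (the function name is illustrative):

```python
import math

# Simulated annealing acceptance sketch: a move that worsens the cost
# by delta is accepted with probability e^(-delta / T).
def accept_probability(delta, temperature):
    if delta <= 0:                 # improving moves are always taken
        return 1.0
    return math.exp(-delta / temperature)
```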
97
Chromosome | Genetic Algorithms
A representation of an individual
98
Gene | Genetic Algorithms
A position in a chromosome
99
Allele | Genetic Algorithms
A possible value for a gene
100
Steps | Genetic Algorithms
1. Compute fitness of all individuals 2. Keep some individuals, favouring higher fitness 3. Mutate the individuals 4. Crossover individuals to replace dead ones