SWE Flashcards
(107 cards)
SWE: What does it mean for function f(n) to be big-O of g(n), or f(n) in O(g(n)), or f(n) = O(g(n))?
There exist a number n* and a constant c > 0 such that for all n > n*,
f(n) ≤ c·g(n)
SWE: When we use big-O, are we typically describing best-case, worst-case, or expected-case run time?
Worst case. (But it can be used for the other two.)
SWE: What does asymptotic analysis do for function f(n)? What is our primary tool for asymptotic analysis?
It describes the limiting behavior for function f(n). We use big-O.
SWE: If we have two arrays with lengths a and b respectively, what is big-O run time for nested loops iterating through all elements of one array for all elements of another?
O(ab)
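A minimal sketch of why this is O(ab) (the arrays and the per-pair work are illustrative assumptions):

# Hypothetical illustration: the inner work runs once per pair of elements,
# so the total work is len(a) * len(b), i.e. O(ab).
def count_pairs(a, b):
    count = 0
    for x in a:          # runs len(a) times
        for y in b:      # runs len(b) times per outer iteration
            count += 1   # constant-time work per pair
    return count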
SWE: What are the two main rules for calculating big-O of a function f(n)?
Drop any constants, either multiplicative or additive, and drop any additive terms that are asymptotically not the largest.
SWE: What is big-O of f(n) = 4*2^n + 3n^10 + 10?
O(2^n)
SWE: What “general pattern” emerges for O(log n) runtimes?
Continually dividing n by some constant, such as 2, until you reach 1. (Or, continually multiplying a constant by another constant until you reach n.)
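A small sketch of the pattern, assuming we count the halvings (the function name is illustrative):

# The loop runs roughly log2(n) times before n reaches 1, hence O(log n).
def halving_steps(n):
    steps = 0
    while n > 1:
        n //= 2      # divide by a constant each iteration
        steps += 1
    return steps

halving_steps(1024)  # -> 10, since 2**10 == 1024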
SWE: Why can string questions be approached as array questions, and vice versa?
Because a string is just an array of characters.
SWE: What is a (singly) linked list?
A list of nodes, where each node has a value and a pointer to the next node.
SWE: What is a doubly linked list?
A list of nodes, where each node has a value and pointers to both the previous and the next node.
SWE: What is an advantage of linked lists over traditional arrays? What is a disadvantage?
Advantage: You can add and remove elements at the head of the list in guaranteed constant time (and, with a tail pointer, append to the end in constant time), whereas appending to a resizable array is only amortized constant time.
Disadvantage: You can no longer use an index to find a value in constant time. If you want the kth element, you must iterate through k nodes.
SWE: How do you implement a linked list in Python, with add and remove methods?
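One possible sketch (the class and method names here are assumptions, not a canonical answer):

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def add(self, value):
        # Insert at the head so the operation is O(1).
        node = Node(value)
        node.next = self.head
        self.head = node

    def remove(self, value):
        # Remove the first node holding `value`, if any.
        prev, curr = None, self.head
        while curr is not None:
            if curr.value == value:
                if prev is None:
                    self.head = curr.next
                else:
                    prev.next = curr.next
                return True
            prev, curr = curr, curr.next
        return False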

SWE: Are linked list problems commonly recursive or iterative?
Recursive, because linked lists have a naturally recursive structure. (But there are many problems with iterative solutions too!)
SWE: What is the runner technique for linked lists?
You examine the linked list with two pointers: one “slow” pointer and one “fast” pointer that is generally farther along in the list and moves more quickly (for example, two nodes per step instead of one).
This is a pretty general technique, but be aware that it exists: examining with two pointers might be handy for many problems.
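A sketch of one classic use, finding the middle node; it assumes nodes with a next pointer like the Node class sketched above:

# The fast pointer moves two steps per iteration, the slow pointer one,
# so when fast reaches the end, slow is at the middle.
def find_middle(head):
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
    return slow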
SWE: What is a hash table used for, and what makes it so useful?
A hash table is a data structure that maps keys to values; you insert key-value pairs, and then search for a key to get its value.
It is useful because it enables fast lookup of elements, in essentially constant time as long as the underlying array is large enough.
SWE: What Python data structure implements the hash table?
Dictionaries (or defaultdicts, which are convenient)
SWE: How is a hash table implemented? What components are needed, and how are they used to insert a key-value pair, or retrieve a key-value pair? (There are multiple possible implementation strategies, we just focus on one in particular.)
Our hash table is an array of linked lists. So each entry in the array is its own linked list.
We use a hash function to map keys to an entry in the array, and this mapping is used for both insertion and retrieval.
To insert pair (key,val), get your hash value h(key), and then find your array index using h(key)%m, where m is the length of the array. We use modulus to keep the index within bounds. We then add our (key,value) pair to the linked list: if the key is already there, we update the value; otherwise, we add a new node with that key-value pair.
To check the value for a key, we get our index h(key)%m again, and look for our key in the linked list at that index. If it isn’t there, then that key hasn’t been inserted.
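A minimal chained hash table sketch along these lines (it uses Python lists as the per-cell “chains” for brevity, and the array size m = 8 is an illustrative choice):

class HashTable:
    def __init__(self, m=8):
        self.m = m
        self.buckets = [[] for _ in range(m)]   # one chain per array cell

    def _index(self, key):
        return hash(key) % self.m               # h(key) % m keeps the index in bounds

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                        # key already present: update its value
                bucket[i] = (key, value)
                return
        bucket.append((key, value))             # otherwise add a new pair to the chain

    def get(self, key):
        bucket = self.buckets[self._index(key)]
        for k, v in bucket:
            if k == key:
                return v
        raise KeyError(key)                     # key was never inserted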
SWE: What is a hash function? What in general makes a “good” hash function in the context of, say, a hash table?
A hash function deterministically maps a value to an integer. A good hash function “evenly distributes” values across the possible integers; in the context of a hash table, it should spread the keys evenly across the cells of the array. This is good because it leads to fewer long chains in the hash table.
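One textbook-style example for strings (the base 31 and the use of ord() are common illustrative choices, not a standard):

# Polynomial string hash: deterministic, and small changes to the input
# tend to produce very different integers.
def string_hash(s):
    h = 0
    for ch in s:
        h = h * 31 + ord(ch)
    return h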
SWE: When making a hash table, why do we need a linked list in every cell? Why can’t we just insert key-value pairs into the cell directly?
Because of collisions: if two keys map to the same cell, they collide, and we can’t store both pairs in the cell directly. Hence, each cell holds a linked list (a “chain”) of the pairs that land there.
SWE: When discussing hash tables, what is a load factor? What does it describe?
The load factor is the number of keys in the hash table divided by the length of the array: n/m, where n is the number of keys and m is the array length.
It describes how “full” the array is, or how long the chains are on average: if the load factor is well over 1, you have many collisions and longer chains.
SWE: What is the worst-case runtime for a lookup in a hash table, in terms of the number of keys n? In what sorts of situations would this happen, and how do these situations cause this runtime?
The worst-case runtime is O(n). This would happen if your array isn’t large enough, and as a result you have some long chains in some of the array indexes.
When looking for a key in a chain, the lookup is O(k), where k is the number of keys in the chain. If a chain holds a number of keys that is effectively “on the order of” n, say n/10, then you’re looking at linear time.
SWE: When using a hash table, how can we assure constant-time insertion and lookups?
You need to keep your load factor low: make the array large enough that there are few collisions, so chains stay at most a few elements long. In practice this means growing the array and rehashing when the load factor gets too high.
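A sketch of that resizing step, assuming the HashTable sketch above (the 0.75 threshold and doubling factor are illustrative choices; num_keys is passed in because that sketch doesn’t track its own count):

def maybe_resize(table, num_keys, max_load=0.75):
    if num_keys / table.m > max_load:
        old_buckets = table.buckets
        table.m *= 2                              # grow the array
        table.buckets = [[] for _ in range(table.m)]
        for bucket in old_buckets:                # rehash every existing pair
            for key, value in bucket:
                table.put(key, value)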
SWE: Is a stack first-in-first-out or first-in-last-out? Is a queue first-in-first-out or first-in-last-out?
A stack is first-in-last-out
A queue is first-in-first-out
SWE: What is a stack? How is it used at a high level?
A stack is a data structure which stores elements. You can add an element (or “push” it), and you can remove an element (or “pop” it). Elements are removed in the reverse order of how they were added, i.e., first-in-last-out. This is like a stack of blocks where you add a block to the top when you want to add, and remove a block from the top when you want to remove.
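A minimal sketch: a plain Python list already behaves as a stack via append and pop.

stack = []
stack.append(1)   # push 1
stack.append(2)   # push 2
stack.append(3)   # push 3
stack.pop()       # -> 3 (last in, first out)
stack.pop()       # -> 2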





















