Trees Flashcards

1
Q

What is a binary tree?

A

A tree whose elements have at most 2 children (left and right child)

2
Q

How are binary trees stored in memory?

A

Each node has a pointer to parent (or NULL/itself if root), pointer to left/right child (or NULL/itself if there isn’t one), and a value representing the item stored in the node

3
Q

How do you search a binary tree?

A

Start at the root and search the tree level-by-level (top to bottom, left to right)
This has time complexity O(n)
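A minimal sketch of this level-by-level search in Python (the Node class and function name are my own, not from the cards):

```python
from collections import deque

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def bfs_search(root, target):
    """Search level-by-level (top to bottom, left to right); O(n)."""
    queue = deque([root] if root else [])
    while queue:
        node = queue.popleft()
        if node.value == target:
            return node
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return None
```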

4
Q

What is a binary search tree?

A

A binary tree where for every node (with value v), all elements in the left sub-tree are smaller than v and all elements in the right sub-tree are bigger than v

5
Q

Preorder vs inorder vs postorder tree traversal

A

Preorder = print(root), preorder(left), preorder(right)
Starts with root, goes all the way left, ends with rightmost

Inorder = inorder(left), print(root), inorder(right)
Starts with leftmost, ends with rightmost, items in sorted order (for a BST)

Postorder = postorder(left), postorder(right), print(root)
Starts with leftmost, ends with root
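The three traversals side by side as a Python sketch (collecting values into a list instead of printing; names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder(node, out):
    if node:
        out.append(node.value)       # root first
        preorder(node.left, out)
        preorder(node.right, out)

def inorder(node, out):
    if node:
        inorder(node.left, out)
        out.append(node.value)       # root in the middle: sorted order for a BST
        inorder(node.right, out)

def postorder(node, out):
    if node:
        postorder(node.left, out)
        postorder(node.right, out)
        out.append(node.value)       # root last
```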

6
Q

Inserting into a BST

A

Call the .insert() function on the root
This function:
Compares the new value to the root
If it’s less than the root, recurses into the left subtree
If it’s greater than the root, recurses into the right subtree
If the relevant subtree doesn’t exist, it creates a new node with the value there
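A sketch of the recursive insert described above (class and method shape are assumptions):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

    def insert(self, value):
        if value < self.value:
            if self.left:
                self.left.insert(value)   # recurse into the left subtree
            else:
                self.left = Node(value)   # subtree missing: new node goes here
        else:
            if self.right:
                self.right.insert(value)  # recurse into the right subtree
            else:
                self.right = Node(value)
```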

7
Q

Searching a BST

A

Call the .lookup(n) function on the root
If this is a match, return it and its parent
If our target is smaller, recurse into the left sub-tree, otherwise recurse into the right sub-tree
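A sketch of lookup returning the match and its parent, as the card describes (names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def lookup(node, target, parent=None):
    """Return (matching node, its parent), or (None, None) if absent."""
    if node is None:
        return None, None
    if target == node.value:
        return node, parent
    if target < node.value:
        return lookup(node.left, target, node)   # target smaller: go left
    return lookup(node.right, target, node)      # otherwise: go right
```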

8
Q

Deleting from a BST

A

Call root.delete(n)

No children: simply remove n

One child: Replace n with its only child (splice n out)

Two children: Find the smallest node v that’s bigger than n, copy v’s data into n, and then delete v
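The three cases as one recursive sketch that returns the new subtree root (a common formulation; names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def delete(node, target):
    """Delete target from the subtree rooted at node; return the new root."""
    if node is None:
        return None
    if target < node.value:
        node.left = delete(node.left, target)
    elif target > node.value:
        node.right = delete(node.right, target)
    else:
        if node.left is None:            # no children, or only a right child
            return node.right
        if node.right is None:           # only a left child
            return node.left
        succ = node.right                # two children: smallest node bigger than n
        while succ.left:
            succ = succ.left
        node.value = succ.value          # copy v's data into n
        node.right = delete(node.right, succ.value)  # then delete v
    return node
```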

9
Q

Time complexity of searching, inserting and deleting

A

All three take O(h), where h is the height of the tree
Average: O(h) = O(log(n))
Worst (unbalanced, list-like tree): O(n)

10
Q

Finding the smallest element in a BST

A

Always go left

11
Q

Finding the smallest element bigger than a given node in a BST

A

Start at that node, go right once (assuming it has a right subtree) and then always go left. The last node before you hit NULL is the answer.
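Both of these lookups reduce to “always go left”; a sketch (helper names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def min_node(node):
    """Smallest element in the subtree: always go left."""
    while node.left:
        node = node.left
    return node

def successor_below(node):
    """Smallest element bigger than node, assuming node has a right subtree."""
    return min_node(node.right)   # right once, then always left
```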

12
Q

What is an AVL tree?

A

One where a given node’s subtrees do not differ in height by more than 1

13
Q

Tree.LeftRotate(x)

A

Let y = x.right
x.right = y.left (y’s left subtree becomes x’s right subtree)
y.left = x (x becomes y’s left child)
y takes x’s old place as the root of the subtree
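A pointer-based sketch of the left rotation (names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def left_rotate(x):
    """Rotate x down to the left; return the new root of this subtree."""
    y = x.right
    x.right = y.left        # y's left subtree becomes x's right subtree
    y.left = x              # x becomes y's left child
    return y                # y takes x's old place
```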

14
Q

What is trinode restructuring

A

A combination of rotations, applied to a parent, child, and grandchild (z, y, x) depending on whether the child and grandchild are on the same side

15
Q

Trinode restructuring: single rotation (child and grandchild on same side)

A

Assuming both right:
Tree.LeftRotate(z)

16
Q

Trinode restructuring: double rotation (child and grandchild on opposite sides)

A

Assuming right left:
Tree.RightRotate(y)
Tree.LeftRotate(z)
The middle one out of x,y,z always ends up as the root of the subtree
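A sketch of the right-left double rotation (rotation helpers and names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def right_rotate(z):
    y = z.left
    z.left = y.right
    y.right = z
    return y

def left_rotate(z):
    y = z.right
    z.right = y.left
    y.left = z
    return y

def restructure_right_left(z):
    """Double rotation for the right-left case; returns the new subtree root."""
    z.right = right_rotate(z.right)   # Tree.RightRotate(y)
    return left_rotate(z)             # Tree.LeftRotate(z)
```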

17
Q

Height of an AVL tree storing n keys

A

O(log(n))

18
Q

How to perform post-insertion fix-up

A

Walking up from the newly-inserted node p, find the lowest node which is unbalanced. If one exists, name it z (let y be its taller child and x be y’s taller child) and do trinode restructuring on z, y, and x.
You can stop when you reach a node whose height didn’t change

19
Q

How to perform post-deletion fix-up

A

Let p be the parent of the deleted node
From p, start searching for z (the first unbalanced node)
Let y be the taller child of z
Let x be the taller child of y (or if the children have the same height, on the same side as y)
Keep going up the tree, restructuring when you meet an unbalanced node
You can stop when you reach a node whose height didn’t change, or when restructuring results in the same height as before insertion/deletion

20
Q

AVL tree performance

A

Uses O(n) space to store n items
Restructuring = O(1)
Searching = O(log(n))
Insertion and removal = O(log(n)) for find + O(log(n)) for restructuring = O(log(n)) overall

21
Q

What is a priority queue

A

Data structure where each element is a tuple (key-value pair) and the key is an integer representing the priority of that item

22
Q

Priority queue operations

A

P.add(k, v): Insert an item with key k and value v
P.min(): Return the tuple (k, v) with the smallest key (i.e. the highest priority)
P.remove_min(): Remove P.min() from the queue and return it
P.is_empty()
len(P)

23
Q

What is a heap

A

A tree-shaped data structure stored in a flat array
The tree has to be complete: every level full except possibly the lowest, which is filled left to right
HeapSize(A) = number of elements in heap stored in A
Height of a heap = floor(log_2(n)), where n = HeapSize(A)

24
Q

Max-heap vs Min-heap

A

Max-heap: Every node’s value is less than or equal to its parent
Min-heap: Every node’s value is greater than or equal to its parent

25
Q

Inserting into a max-heap H

A

Step 1: Add the new node in the leftmost free position on the bottom level (so the tree stays complete)
Step 2 (sift up): If the new node’s value is bigger than its parent’s, swap their values and treat the parent as the new node. Repeat until the new node’s value is smaller than its parent’s (or it reaches the root)
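A sketch using the 1-based array layout from the next card (A[0] unused, root at A[1]; names are mine):

```python
def heap_insert(A, value):
    """Insert value into a max-heap stored in A (A[0] unused, root at A[1])."""
    A.append(value)                      # step 1: bottom-leftmost free slot
    i = len(A) - 1
    while i > 1 and A[i] > A[i // 2]:    # step 2: sift up while bigger than parent
        A[i], A[i // 2] = A[i // 2], A[i]
        i //= 2
```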

26
Q

Array indices

A

Root is in A[1]
i.left = A[2i]
i.right = A[2i + 1]
i.parent = A[floor(i/2)]

27
Q

Heapification

A

Heapify(A, v, n): A = array, v = index of the node to fix, n = heap size
Identify the largest out of the current node (v) and its children; let the largest element’s index be w. Exit if v is a leaf
If v =/= w, swap A[w] and A[v] and then recurse into position w (what used to be v’s value)
Running time: O(log(n))
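A sketch of Heapify with the same 1-based array layout (A[0] unused):

```python
def heapify(A, v, n):
    """Restore the max-heap property at index v; n = heap size, A[0] unused."""
    left, right = 2 * v, 2 * v + 1
    w = v                                # index of the largest seen so far
    if left <= n and A[left] > A[w]:
        w = left
    if right <= n and A[right] > A[w]:
        w = right
    if w != v:                           # a child is bigger: swap and recurse
        A[v], A[w] = A[w], A[v]
        heapify(A, w, n)
```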

28
Q

Building a heap from array A

A

for i = floor(len(A)/2) downto 1:
Heapify(A, i, n)
(nodes with index above floor(n/2) are leaves, so they are already heaps)
Running time: O(n)

29
Q

Extraction (deletion) from a heap

A

Elements are always deleted from the root of the heap
Step 1: Replace the root node’s value with the last node’s value so that H is still a complete binary tree
Step 2: Delete the last node
Step 3: Move the new root node’s value down until H satisfies the heap property

30
Q

HeapSort

A

Call BuildHeap on unsorted data
For i = Length(A) downto 2:
Swap A[i] and A[1]
Reduce HeapSize by 1
Heapify(A, 1, HeapSize(A))
Time complexity: O(n*log(n))
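Putting BuildHeap and Heapify together, a self-contained HeapSort sketch on a 1-based array (A[0] unused; names are mine):

```python
def heapify(A, v, n):
    """Restore the max-heap property at index v; n = heap size."""
    left, right, w = 2 * v, 2 * v + 1, v
    if left <= n and A[left] > A[w]:
        w = left
    if right <= n and A[right] > A[w]:
        w = right
    if w != v:
        A[v], A[w] = A[w], A[v]
        heapify(A, w, n)

def heapsort(A):
    """In-place HeapSort on A[1:] (A[0] unused); O(n log n) overall."""
    n = len(A) - 1
    for i in range(n // 2, 0, -1):       # BuildHeap: O(n)
        heapify(A, i, n)
    for i in range(n, 1, -1):            # repeatedly move the max to the end
        A[1], A[i] = A[i], A[1]
        heapify(A, 1, i - 1)             # heap size shrinks by 1
```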

31
Q

What is a decision tree?

A

A full binary tree (all nodes have either 0 or 2 children) representing all comparisons between n elements made by a given algorithm
Internal nodes are labelled i:j meaning elements i and j are compared
Downward edges are marked with < or >
Leaves are labelled with some permutation of {1, …, n} (n! leaves)
Correct sorting algorithms must be able to produce each permutation of input

32
Q

Lower bound for worst case using decision trees

A

The length of the longest path from the root to a leaf represents the worst case number of comparisons for that value of n

33
Q

Using decision trees, prove that any comparison-based algorithm requires Ω(n*log(n)) comparisons in the worst case

A

Consider a decision tree of height h with L leaves corresponding to a comparison sort of n elements
Since each of the n! permutations appears as a leaf, L >= n!
A binary tree of height h has at most 2^h leaves: L <= 2^h
n! <= L <= 2^h, therefore 2^h >= n!
h >= log(n!) = Ω(n*log(n))
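The last step can be made concrete without Stirling’s formula, since the top half of the factors in n! are each at least n/2:

```latex
h \ge \log_2(n!) = \sum_{i=1}^{n} \log_2 i
  \ge \sum_{i=\lceil n/2 \rceil}^{n} \log_2 i
  \ge \frac{n}{2} \log_2 \frac{n}{2}
  = \Omega(n \log n)
```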

34
Q

What is an adversary

A

A strategy that dynamically constructs a worst-case input while an algorithm runs, used to lower-bound the algorithm’s running time
When the algorithm queries its input (e.g. “is A[2] < A[3]?”), the adversary generates a response
It gives answers so that there’s always at least one input consistent with all answers so far
It helps us find the lower bound on the runtime of any algorithm for the problem

35
Q

Proof that finding the max element in an unsorted array takes n-1 comparisons

A

Each comparison creates at most one new “loser”. After only n-2 comparisons, at least 2 elements have never lost a comparison, therefore there is not enough info to tell which one is bigger

36
Q

Algorithm for finding the 2nd largest item in an unsorted list

A

Find the max with a knockout tournament in n-1 comparisons; there are ceil(log_2(n)) elements which lost directly to the max
The 2nd largest overall is the largest out of these
Finding the largest out of these takes ceil(log_2(n)) - 1 comparisons
Altogether, this takes n + ceil(log_2(n)) - 2 comparisons
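A sketch of the tournament approach (assumes distinct values; the bookkeeping by value and all names are mine):

```python
def second_largest(items):
    """Knockout tournament: n + ceil(log_2(n)) - 2 comparisons in total."""
    losers_to = {}                       # winner value -> values it beat directly
    survivors = list(items)
    while len(survivors) > 1:            # one tournament round per pass
        nxt = []
        for i in range(0, len(survivors) - 1, 2):
            a, b = survivors[i], survivors[i + 1]
            winner, loser = (a, b) if a > b else (b, a)
            losers_to.setdefault(winner, []).append(loser)
            nxt.append(winner)
        if len(survivors) % 2:           # odd one out gets a bye
            nxt.append(survivors[-1])
        survivors = nxt
    champ = survivors[0]
    return max(losers_to[champ])         # 2nd largest lost directly to the max
```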

37
Q

Number of comparisons needed to find kth largest in array of size n

A

V(n, k) >= n - k + ceil(log_2(n choose k-1))