Big O - Data Structures Flashcards

1
Q

What is the time complexity of SEARCH for an ARRAY?

A

O(n) - okay

2
Q

What is the time complexity of INSERTION for an ARRAY?

A

O(n) - okay

3
Q

What is the time complexity of DELETION for an ARRAY?

A

O(n) - okay

4
Q

What is the worst space complexity of an array?

A

O(n) - okay

5
Q

What is the time complexity of ACCESS of a STACK?

A

O(n) - okay

6
Q

What is the time complexity of SEARCH of a STACK?

A

O(n) - okay

7
Q

What is the time complexity of INSERTION of a STACK?

A

O(1) - great

8
Q

What is the average time complexity of DELETION of a STACK?

A

O(1) - great

9
Q

What is the worst space complexity of a STACK?

A

O(n) - okay

10
Q

What is the time complexity of ACCESS of a LINKED LIST?

A

O(n) - okay

11
Q

What is the time complexity of SEARCH of a LINKED LIST?

A

O(n) - okay

12
Q

What is the time complexity of INSERTION of a LINKED LIST?

A

O(1) - great

13
Q

What is the time complexity of DELETION of a LINKED LIST?

A

O(1) - great

14
Q

What is the worst space complexity of a LINKED LIST?

A

O(n) - okay

15
Q

What is the average time complexity of access, search, insertion, and deletion of a SKIP LIST?

A

O(log(n)) - good

16
Q

What is the worst time complexity of access, search, insertion, and deletion of a SKIP LIST?

A

O(n) - okay

17
Q

What is the worst space complexity of a SKIP LIST?

A

O(n log(n)) - bad!

18
Q

What is the time complexity of ACCESS of a HASH TABLE?

A

Not applicable: a hash table is keyed, not indexed by position, so there is no separate ACCESS operation; retrieving a value by key is the SEARCH operation.

19
Q

What is the average time complexity of SEARCH, INSERTION, and DELETION of a HASH TABLE?

A

O(1) - great!

20
Q

What is the worst time complexity of SEARCH, INSERTION, and DELETION of a HASH TABLE?

A

O(n) - okay

21
Q

What is the worst space complexity of a HASH TABLE?

A

O(n) - okay

22
Q

What is the average time complexity of access, search, insertion, and deletion of a BINARY SEARCH TREE?

A

O(log(n)) - good

23
Q

What is the worst time complexity of access, search, insertion, and deletion of a BINARY SEARCH TREE?

A

O(n) - okay

24
Q

What is the worst space complexity of a BINARY SEARCH TREE?

A

O(n) - okay

25
Q

What is the average time complexity of ACCESS for a CARTESIAN TREE or a SPLAY TREE?

A

Not applicable: these trees are not indexed by position, so there is no direct ACCESS operation; elements are found via SEARCH.

26
Q

What is the average time complexity of search, insertion and deletion for a CARTESIAN TREE?

A

O(log(n)) - good

27
Q

What is the worst time complexity of search, insertion and deletion for a CARTESIAN TREE?

A

O(n) - okay

28
Q

What is the worst space complexity of a CARTESIAN TREE?

A

O(n) - okay

29
Q

What is the time complexity of access, search, insertion and deletion for a B-TREE?

A

O(log(n)) - good

30
Q

What is the time complexity of access, search, insertion and deletion for a RED-BLACK TREE?

A

O(log(n)) - good

31
Q

What is the time complexity of search, insertion and deletion for a SPLAY TREE?

A

O(log(n)) amortized - good

32
Q

What is the time complexity of access, search, insertion and deletion for an AVL TREE?

A

O(log(n)) - good

33
Q

What is the worst space complexity of a B-TREE?

A

O(n) - okay

34
Q

What is the worst space complexity of a RED-BLACK TREE?

A

O(n) - okay

35
Q

What is the worst space complexity of a SPLAY TREE?

A

O(n) - okay

36
Q

What is the worst space complexity of an AVL TREE?

A

O(n) - okay

37
Q

What is an array?

A

A simple data structure consisting of a collection of elements, each identified by at least one array index or key (e.g. [3, 5, 6, 8, 2]).

38
Q

What is a stack?

A

An abstract data type: an ordered collection of elements with two principal operations, push (adds an element) and pop (removes the most recently added element).

39
Q

What data structures can be used to implement a STACK?

A

Arrays or linked lists.
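
Both options can be sketched in a few lines (class names are illustrative, not from a specific library): an array-backed stack and a linked-list-backed stack, each with O(1) push and pop.

```python
class ArrayStack:
    """Stack backed by a dynamic array (a Python list)."""
    def __init__(self):
        self._items = []

    def push(self, item):   # O(1) amortized: append at the end
        self._items.append(item)

    def pop(self):          # O(1): remove from the end
        return self._items.pop()


class _Node:
    def __init__(self, data, next_node):
        self.data = data
        self.next = next_node


class LinkedStack:
    """Stack backed by a singly linked list; push and pop at the head."""
    def __init__(self):
        self._head = None

    def push(self, item):   # O(1): new node becomes the head
        self._head = _Node(item, self._head)

    def pop(self):          # O(1): unlink the head
        node = self._head
        self._head = node.next
        return node.data
```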

40
Q

What is a linked list?

A

A data structure consisting of a group of nodes which represent a sequence. Each node contains data and a reference to the next node.
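
A minimal sketch of a node and an O(n) traversal (names are illustrative):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None  # reference to the next node; None at the tail

def to_list(head):
    """Walk the chain from head to tail, collecting each node's data: O(n)."""
    out = []
    while head is not None:
        out.append(head.data)
        head = head.next
    return out

# Build the sequence 1 -> 2 -> 3
head = Node(1)
head.next = Node(2)
head.next.next = Node(3)
```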

41
Q

What are advantages of linked lists?

A
  1. Dynamic data structure that allocates memory when needed 2. Insertion and deletion are easy to implement and FAST! 3. Can create stacks and queues easily. 4. Reduce access time and can expand in real time without memory overhead.
42
Q

What is a doubly linked list?

A

A linked list where each node contains a reference to the next and previous nodes, allowing forward and backward traversal.

43
Q

What is XOR-linking?

A

A memory-efficient doubly linked list technique: it uses the bitwise XOR operation to compress the next- and previous-node address information into a single link field per node.
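
A sketch of the idea. Python does not expose raw machine addresses, so this illustration uses indices into a list as stand-in "addresses", with index 0 playing the role of NULL; a real XOR list stores prev_addr XOR next_addr in each node.

```python
nodes = [None]  # slot 0 reserved as the NULL "address"

def add_node(data):
    nodes.append({"data": data, "link": 0})
    return len(nodes) - 1

# Build a <-> b <-> c; each link field is prev XOR next.
a, b, c = add_node("a"), add_node("b"), add_node("c")
nodes[a]["link"] = 0 ^ b   # head: NULL XOR next
nodes[b]["link"] = a ^ c
nodes[c]["link"] = b ^ 0   # tail: prev XOR NULL

def traverse(start):
    """Walk from either end: recover next = link XOR prev at each step."""
    out, prev, cur = [], 0, start
    while cur != 0:
        out.append(nodes[cur]["data"])
        prev, cur = cur, nodes[cur]["link"] ^ prev
    return out
```

Starting from `c` instead of `a` walks the same list backward, which is the point of the trick: one field, both directions.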

44
Q

What are some disadvantages of linked lists?

A
  1. Uses more memory because of pointers needing to store next node info. 2. Nodes need to be read in order from beginning. 3. Nodes stored incontiguously (entries are not stored next to each other in memory), which increases time of access. 4. Reverse traversal of singly-linked lists is cumbersome, and double linked lists take up a lot of space.
45
Q

What is a skip list?

A

A data structure that allows fast search within an ordered sequence of elements. Can be implemented with parallel linked lists.
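
A rough sketch of insert and search (MAX_LEVEL, the coin-flip level scheme, and all names are illustrative choices, not from a specific library):

```python
import random

class SkipNode:
    def __init__(self, value, level):
        self.value = value
        self.forward = [None] * (level + 1)  # one next-pointer per level

class SkipList:
    MAX_LEVEL = 4

    def __init__(self):
        self.head = SkipNode(None, self.MAX_LEVEL)  # sentinel head
        self.level = 0

    def _random_level(self):
        # Flip coins: each node is promoted to the next level with p = 1/2.
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, value):
        update = [None] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):   # descend, recording turn points
            while node.forward[i] and node.forward[i].value < value:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        if lvl > self.level:
            for i in range(self.level + 1, lvl + 1):
                update[i] = self.head
            self.level = lvl
        new = SkipNode(value, lvl)
        for i in range(lvl + 1):              # splice into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, value):
        node = self.head
        for i in range(self.level, -1, -1):   # skip ahead on high levels first
            while node.forward[i] and node.forward[i].value < value:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.value == value
```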

46
Q

What is a hash table?

A

A data structure that maps keys to values (dictionary, associative array). It uses a hash function that takes a key and assigns an index to it. Usually implemented with arrays.
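
A minimal sketch using separate chaining (bucket count and names are illustrative; real tables also resize as they fill):

```python
class ChainedHashTable:
    """Hash table where each bucket holds a list of (key, value) pairs."""

    def __init__(self, num_buckets=8):
        self._buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # The hash function assigns an array index to the key.
        return self._buckets[hash(key) % len(self._buckets)]

    def insert(self, key, value):      # average O(1)
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)   # overwrite existing key
                return
        bucket.append((key, value))

    def search(self, key):             # average O(1); O(n) if all keys collide
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):             # average O(1)
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket.pop(i)
                return
```

The worst case O(n) from the previous card corresponds to every key landing in the same bucket, which degrades each bucket scan to a linear search.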

47
Q

What is a binary search tree?

A

A family of data structures that keeps items (numbers, names, etc.) in sorted order in memory so that lookups can use binary search.
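
A minimal sketch of insert and search (names are illustrative):

```python
class BSTNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Insert key, keeping the invariant: left subtree < node < right subtree."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Binary search down the tree: O(log n) if balanced, O(n) if skewed."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None
```

Inserting already-sorted keys produces the skewed, O(n) shape from the worst-case card above, which is what the self-balancing variants (AVL, red-black) prevent.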

48
Q

What is a Cartesian tree?

A

A Cartesian tree is a tree data structure created from a sequence of data with the following properties:

  1. The tree obeys the min- (or max-) heap property: each node is less (or greater) than its children.
  2. An inorder traversal of the nodes yields the values in the same order in which they appear in the initial sequence.
  3. A Cartesian tree is not a height-balanced tree.
  4. The Cartesian tree of a sequence of distinct numbers is always unique.

Time complexity:
At first glance the construction described below seems to take O(n^2) time, since each new node may scan upward through many ancestors. In fact it takes O(n log n) time on average, degrading to O(n^2) for an already-sorted sequence.

Auxiliary space:
One implementation declares a structure for every node plus three extra arrays, leftchild[], rightchild[], and parent[], holding the indices of each value's left child, right child, and parent in the input array. That is O(4n) = O(n) extra space overall.

Applications of Cartesian trees

Cartesian tree sorting.

A range minimum query (RMQ) on a sequence is equivalent to a lowest common ancestor (LCA) query on the sequence's Cartesian tree; hence RMQ may be reduced to LCA via the Cartesian tree.

A treap, a balanced binary search tree structure, is a Cartesian tree of (key, priority) pairs: it is heap-ordered according to the priority values, and an inorder traversal gives the keys in sorted order.

The suffix tree of a string may be constructed from its suffix array and longest-common-prefix array; the first step is to compute the Cartesian tree of the longest-common-prefix array.

The Cartesian tree of a sequence of distinct numbers is always unique.
We prove this by induction. As a base case, the empty tree is unique. For the inductive step, assume that for every sequence of n' < n elements there is a unique Cartesian tree. Now take any sequence of n elements. Because a Cartesian tree is a min-heap, the smallest element of the sequence must be the root. Because an inorder traversal of the elements must yield the input sequence, all nodes to the left of the minimum must be in its left subtree, and similarly for the nodes to the right. Since the left and right subtrees are both Cartesian trees with at most n-1 elements (the minimum is at the root), by the induction hypothesis each has a unique Cartesian tree. Since all our choices were forced, the whole tree is unique, completing the induction.

How to construct a Cartesian tree?
A naive solution that repeatedly picks the extreme element and recurses on the two halves takes O(n^2) time.

An O(n log n) algorithm:
It is possible to build a Cartesian tree from a sequence of data in O(n log n) time on average. Beginning with the empty tree, scan the given sequence from left to right, adding new nodes as follows:

  1. Position the new node as the right child of the rightmost node.
  2. Scan upward from the node's parent toward the root until a node is found whose value is greater than the current value.
  3. If such a node is found, set its right child to be the new node, and set the new node's left child to be that node's previous right child.
  4. If no such node is found, make the new node the root, and set its left child to be the previous tree.

Source: https://www.geeksforgeeks.org/cartesian-tree/
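
A sketch of the left-to-right construction just described, in its max-heap variant (`CTNode`, `build_cartesian_tree`, and `inorder` are illustrative names). The explicit stack holds the tree's rightmost path, so each node is pushed and popped at most once:

```python
class CTNode:
    def __init__(self, value):
        self.value, self.left, self.right = value, None, None

def build_cartesian_tree(seq):
    """Left-to-right scan; max-heap variant (root is the maximum)."""
    root = None
    spine = []  # rightmost path of the tree; spine[0] is the root
    for value in seq:
        node = CTNode(value)
        last_popped = None
        # Scan upward until a node with a greater value is found.
        while spine and spine[-1].value < value:
            last_popped = spine.pop()
        node.left = last_popped       # the previous right child / previous tree
        if spine:
            spine[-1].right = node    # attach as the found node's right child
        else:
            root = node               # no greater node: new node becomes root
        spine.append(node)
    return root

def inorder(node):
    """Inorder traversal reproduces the original sequence."""
    return [] if node is None else inorder(node.left) + [node.value] + inorder(node.right)
```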

49
Q

What is a red-black tree?

A

A red-black tree is a kind of self-balancing binary search tree where each node has an extra bit, often interpreted as the colour (red or black). These colours are used to ensure that the tree remains approximately balanced during insertions and deletions. Although the balance is not perfect, it is good enough to keep search time around O(log n), where n is the total number of elements in the tree. The structure was introduced in 1972 by Rudolf Bayer.

It must be noted that as each node requires only 1 bit of space to store the colour information, these trees have a memory footprint nearly identical to that of the classic (uncoloured) binary search tree.

Rules That Every Red-Black Tree Follows:

Every node has a colour either red or black.

The root of the tree is always black.

There are no two adjacent red nodes (A red node cannot have a red parent or red child).

Every path from a node (including root) to any of its descendant NULL node has the same number of black nodes.

Why Red-Black Trees?

Most of the BST operations (e.g., search, max, min, insert, delete, etc.) take O(h) time, where h is the height of the BST. The cost of these operations may become O(n) for a skewed binary tree. If we make sure that the height of the tree remains O(log n) after every insertion and deletion, then we can guarantee an upper bound of O(log n) for all these operations. The height of a red-black tree is always O(log n), where n is the number of nodes in the tree.

Time complexities:

  1. Search: O(log n)
  2. Insert: O(log n)
  3. Delete: O(log n)

Comparison with AVL Tree:

AVL trees are more strictly balanced than red-black trees, but they may require more rotations during insertion and deletion. So if your application involves frequent insertions and deletions, red-black trees should be preferred; if insertions and deletions are less frequent and search is the more frequent operation, AVL trees should be preferred.

Source: https://www.geeksforgeeks.org/red-black-tree-set-1-introduction-2/?ref=leftbar-rightbar

50
Q

What is a splay tree?

A

Summary

A self-adjusting binary search tree with the property that recently accessed elements are quick to access again, by reorganizing the accessed element to the root.

Definition

A splay tree is a self-adjusting binary search tree invented by Sleator and Tarjan. Unlike an AVL tree (or a red-black tree), the structure of the splay tree changes even after a search operation. Every time we search for an item x or insert x, it moves x to the root of the tree so that the next access of x is quick. The goal of the splay tree is not to make every operation fast, but rather to make a sequence of operations fast. An individual operation can sometimes take O(n) time, making the worst-case running time linear. A sequence of operations, however, takes O(log n) amortized time per operation; in other words, a sequence of M operations takes O(M log n) time. Since the splay tree adjusts itself according to usage, it performs much more efficiently than other binary search trees when the access pattern is skewed.

Unlike an AVL or a red-black tree, which must maintain their balance invariants at all times, the splay tree can be in any arbitrary state (although it must always maintain the binary search tree invariant); during every operation, it restructures the tree to improve the efficiency of future operations.

The splay tree moves a node x to the root of the tree by performing a series of single and double tree rotations. Each double rotation moves x to its grandparent's place, and each single rotation moves x to its parent's place. We perform these rotations until x reaches the root of the tree. This process is called splaying. Besides moving x to the root, splaying also shortens the height of the tree, which makes the tree more balanced. There are two types of single rotations and four types of double rotations. Each is explained below.

Zig Rotation

Zig is a single rotation. We do a zig rotation on node x if x is a left child and x does not have a grandparent (i.e. x's parent is the root node). To make the zig rotation, we rotate x's parent to the right.

Zag Rotation

Zag rotation is the mirror of zig rotation. We do a zag rotation on node x if x is a right child and x does not have a grandparent. To make the zag rotation, we perform a left rotation at x's parent node.

Zig-Zig Rotation

Zig-zig is a double rotation. We do a zig-zig rotation on x when x is a left child and x's parent is also a left child. The zig-zig rotation is done by rotating x's grandparent to the right, followed by a right rotation at x's parent node.

Zag-Zag Rotation

Zag-zag rotation is the mirror of zig-zig rotation. We do a zag-zag rotation on x if x is a right child and x's parent is also a right child. To make the zag-zag rotation, we first do a left rotation at x's grandparent and then a left rotation at x's parent node.

Zig-Zag Rotation

Zig-zag rotation is also a double rotation. We perform a zig-zag rotation on x when x is a right child and x's parent is a left child. Zig-zag rotation is done by doing a left rotation at x's parent node, followed by a right rotation at x's grandparent (its new parent).

Zag-Zig Rotation

The last rotation is the zag-zig rotation. It is the mirror of zig-zag rotation. To do a zag-zig rotation on node x, we do a right rotation at x's parent node and then a left rotation at x's grandparent (its new parent).

Advantages and disadvantages of Splay Trees

Following are the advantages of splay trees over other binary search trees.

In an amortized sense, ignoring constant factors, they are never much worse than constrained structures, and since they adjust according to usage, they can be much more efficient if the usage pattern is skewed.

They need less space since no balance or other constraint information is stored.

Their access and update algorithms are conceptually simple and easy to implement.

Following are the disadvantages of splay trees.

They require more local adjustments, especially during accesses (look-up operations). (Explicitly constrained structures need adjusting only during updates, not during accesses.)

Individual operations within a sequence can be expensive, which may be a drawback in real-time applications.

Source: https://algorithmtutor.com/Data-Structures/Tree/Splay-Trees/
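
The single rotations (zig and zag) are ordinary BST rotations, and a full splay composes them into the double rotations described above. A minimal sketch of just the two single rotations (names are illustrative):

```python
class SNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Zig: y's left child x moves up into y's place; y becomes x's right child.
    x's old right subtree (keys between x and y) becomes y's new left subtree."""
    x = y.left
    y.left = x.right
    x.right = y
    return x  # new subtree root

def rotate_left(x):
    """Zag: the mirror image; x's right child y moves up into x's place."""
    y = x.right
    x.right = y.left
    y.left = x
    return y  # new subtree root
```

A zig-zig on x would be `rotate_right` at the grandparent followed by `rotate_right` at the parent; the other double rotations combine these two helpers analogously.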

51
Q

What is an AVL tree?

A

Adelson-Velsky and Landis tree: a self-balancing binary search tree that ensures the heights of the two child subtrees of any node differ by at most one. Tree rotations are applied after insertions and deletions to rebalance the heights and distribute children evenly.

Why AVL Trees?
Most of the BST operations (e.g., search, max, min, insert, delete, etc.) take O(h) time, where h is the height of the BST. The cost of these operations may become O(n) for a skewed binary tree. If we make sure that the height of the tree remains O(log n) after every insertion and deletion, then we can guarantee an upper bound of O(log n) for all these operations. The height of an AVL tree is always O(log n), where n is the number of nodes in the tree.

Source: https://www.geeksforgeeks.org/avl-tree-set-1-insertion/
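
The AVL invariant above can be checked directly. A minimal sketch of a checker (not the rotation logic itself; it recomputes heights for clarity rather than speed, and all names are illustrative):

```python
class TNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height of a subtree; the empty tree counts as height 0 here."""
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def is_avl(node):
    """True if every node's child subtree heights differ by at most one."""
    if node is None:
        return True
    if abs(height(node.left) - height(node.right)) > 1:
        return False
    return is_avl(node.left) and is_avl(node.right)
```

A right-skewed chain like 1 -> 2 -> 3 fails the check at the root, which is exactly the shape AVL rotations exist to repair.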

52
Q

What is the time complexity of ACCESS for an ARRAY?

A

O(1) - great