Commit ac2bbcc

Merge pull request #91 from inzva/muratbiberoglu-patch
Migrate bundle 04-graph-1
2 parents dbc10fc + a5119d9 commit ac2bbcc

11 files changed: +385 −4 lines

docs/data-structures/segment-tree.md

Lines changed: 8 additions & 4 deletions

@@ -103,6 +103,7 @@ Previously, update function was called to update only a single value in array. P

### Lazy Propagation Algorithm

We need a structure that can perform the following operations on an array $[1, N]$:

- Add $inc$ to all elements in the given range $[l, r]$.
- Return the sum of all elements in the given range $[l, r]$.

@@ -118,6 +119,7 @@ Trick is to be lazy i.e, do work only when needed. Do the updates only when you

Let’s be <i>lazy</i> as told: when we need to update an interval, we update a node and mark its children as having a pending update, doing that work only when it is needed. For this we need an array $lazy[]$ of the same size as the segment tree. Initially all elements of $lazy[]$ are $0$, meaning there is no pending update. If $lazy[k]$ is non-zero, then node $k$ of the segment tree must be updated before any query touches it, and $lazy[2\cdot k]$ and $lazy[2 \cdot k + 1]$ must be updated correspondingly.

To update an interval we keep 3 things in mind:

- If the current segment tree node has any pending update, first apply that pending update to the current node and push it down to its children.
- If the interval represented by the current node lies completely inside the interval to update, update the current node and set the $lazy[]$ entries for its children.
- If the interval represented by the current node partially overlaps the interval to update, recurse into the children as in the regular update function.
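The three rules above can be sketched as a range-add / range-sum tree. This is a minimal sketch, not the bundle's exact code; the names `tree`, `lazy`, `push`, `updateRange`, and `queryRange` are illustrative:

```cpp
#include <cassert>

const int N = 8;                     // array size (illustrative)
long long tree[4 * N], lazy[4 * N];  // start as all zeros

// Apply a pending addition to the node covering [start, end]
// and defer it to the children.
void push(int node, int start, int end) {
    if (lazy[node] == 0) return;
    tree[node] += lazy[node] * (end - start + 1);
    if (start != end) {              // not a leaf: mark children
        lazy[2 * node] += lazy[node];
        lazy[2 * node + 1] += lazy[node];
    }
    lazy[node] = 0;
}

// Add inc to every element in [l, r].
void updateRange(int node, int start, int end, int l, int r, long long inc) {
    push(node, start, end);                      // rule 1: apply pending update
    if (r < start || end < l) return;            // no overlap
    if (l <= start && end <= r) {                // rule 2: full overlap
        lazy[node] += inc;
        push(node, start, end);
        return;
    }
    int mid = (start + end) / 2;                 // rule 3: partial overlap
    updateRange(2 * node, start, mid, l, r, inc);
    updateRange(2 * node + 1, mid + 1, end, l, r, inc);
    tree[node] = tree[2 * node] + tree[2 * node + 1];
}

// Sum of all elements in [l, r].
long long queryRange(int node, int start, int end, int l, int r) {
    if (r < start || end < l) return 0;
    push(node, start, end);
    if (l <= start && end <= r) return tree[node];
    int mid = (start + end) / 2;
    return queryRange(2 * node, start, mid, l, r) +
           queryRange(2 * node + 1, mid + 1, end, l, r);
}
```

Both operations touch $\mathcal{O}(\log N)$ nodes, since the pending work is applied only along the paths a query or update actually visits.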
@@ -202,6 +204,7 @@ Notice that the only difference with the regular query function is pushing the l

## Binary Search on Segment Tree

Assume we have an array $A$ that contains elements between $1$ and $M$. We have to perform 2 kinds of operations:

- Change the value of the element at a given index $i$ to $x$.
- Return the value of the $k$th element of the array when sorted.

@@ -240,8 +243,9 @@ This is of course, slow. Let’s use segment tree’s to improve it. First we wi

<figure markdown = "span">
![segment tree updates](img/updated_segtree.png){ width="100%" }
<figcaption>Segment Tree After First Update</figcaption>
</figure>

```cpp
void update(int i, int x) {
    update(1, 1, M, A[i], --F[A[i]]); // Decrement frequency of old value
    A[i] = x;                         // Update A[i] to new value
```

@@ -263,15 +267,15 @@ int query(int k) {

If you look at the code above, you can see that each update takes $\mathcal{O}(\log M)$ time and each query takes $\mathcal{O}(\log^{2} M)$ time, but we can do better.

### How To Speed Up?

If you look at the segment tree solution in the preceding subsection, you can see that queries run in $\mathcal{O}(\log^{2} M)$ time. We can make this faster; in fact we can reduce the time complexity to $\mathcal{O}(\log M)$, matching the updates. The idea is to do the binary search while traversing the segment tree. Start from the root and look at the left child’s sum: if it is at least $k$, the answer lies in the left child’s subtree; otherwise it lies in the right child’s subtree, where we look for the $(k - \text{left sum})$th element. Following this rule until we reach a leaf gives the answer. Since we traverse only $\mathcal{O}(\log M)$ nodes (one per level), the time complexity is $\mathcal{O}(\log M)$. Look at the code below for better understanding.
<figure markdown = "span">
![solution of first query](img/query_soln.png){ width="100%" }
<figcaption>Solution of First Query</figcaption>
</figure>

```cpp
void update(int i, int x) {
    update(1, 1, M, A[i], --F[A[i]]); // Decrement frequency of old value
    A[i] = x;                         // Update A[i] to new value
```

@@ -289,4 +293,4 @@ int query(int node, int start, int end, int k) {

```cpp
int query(int k) {
    return query(1, 1, M, k); // Public interface for querying
}
```
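The descending search described above can be sketched end to end. This is a sketch under assumptions: the tree stores frequencies of values in $[1, M]$, and the names `update` and `kth` are illustrative rather than the bundle's exact code:

```cpp
#include <cassert>

const int M = 8;      // values lie in [1, M] (illustrative)
int tree[4 * M];      // tree[node] = count of values in the node's range

// Add delta to the frequency of value v.
void update(int node, int start, int end, int v, int delta) {
    if (start == end) { tree[node] += delta; return; }
    int mid = (start + end) / 2;
    if (v <= mid) update(2 * node, start, mid, v, delta);
    else          update(2 * node + 1, mid + 1, end, v, delta);
    tree[node] = tree[2 * node] + tree[2 * node + 1];
}

// Descend from the root to find the kth smallest stored value.
// One node per level, so O(log M) total.
int kth(int node, int start, int end, int k) {
    if (start == end) return start;          // leaf: this value is the answer
    int mid = (start + end) / 2;
    if (tree[2 * node] >= k)                 // answer is in the left subtree
        return kth(2 * node, start, mid, k);
    return kth(2 * node + 1, mid + 1, end, k - tree[2 * node]);
}
```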

docs/graph/binary-search-tree.md

Lines changed: 156 additions & 0 deletions

---
title: Binary Search Tree
tags:
  - Tree
  - Binary Search
  - BST
---

A binary tree is a tree data structure in which each node has at most two children, referred to as the left child and the right child.

For a binary tree to be a binary search tree, the values of all nodes in the left sub-tree of the root node must be smaller than the root node's value, and the values of all nodes in the right sub-tree must be larger than the root node's value; the same property must hold recursively in every sub-tree.

<figure markdown="span">
![a simple binary search tree](img/binarytree.png)
<figcaption>a simple binary search tree</figcaption>
</figure>

## Insertion Algorithm

1. Compare the values of the root node and the element to be inserted.
2. If the value of the root node is larger and a left child exists, repeat step 1 with root = current root's left child. Otherwise, insert the element as the left child of the current root.
3. If the value of the root node is smaller and a right child exists, repeat step 1 with root = current root's right child. Otherwise, insert the element as the right child of the current root.

## Deletion Algorithm

- Node with no children: simply remove the node from the tree.
- Node with one child: remove the node and replace it with its child.
- Node with two children: find the inorder successor of the node, copy the successor's contents into the node, and delete the successor.
  - Note: the inorder successor is the node with the minimum value in the right sub-tree of the node.
## Sample Code

```c
// C program to demonstrate the delete operation in a binary search tree
#include <stdio.h>
#include <stdlib.h>

struct node
{
    int key;
    struct node *left, *right;
};

// A utility function to create a new BST node
struct node *newNode(int item)
{
    struct node *temp = (struct node *)malloc(sizeof(struct node));
    temp->key = item;
    temp->left = temp->right = NULL;
    return temp;
}

// A utility function to do an inorder traversal of the BST
void inorder(struct node *root)
{
    if (root != NULL)
    {
        inorder(root->left);
        printf("%d ", root->key);
        inorder(root->right);
    }
}

/* Insert a new node with the given key into the BST */
struct node* insert(struct node* node, int key)
{
    /* If the tree is empty, return a new node */
    if (node == NULL) return newNode(key);

    /* Otherwise, recur down the tree */
    if (key < node->key)
        node->left = insert(node->left, key);
    else
        node->right = insert(node->right, key);

    /* return the (unchanged) node pointer */
    return node;
}

/* Given a non-empty binary search tree, return the node with the minimum
   key value found in that tree. Note that the entire tree does not
   need to be searched. */
struct node *minValueNode(struct node* node)
{
    struct node* current = node;

    /* loop down to find the leftmost leaf */
    while (current->left != NULL)
        current = current->left;

    return current;
}

/* Given a binary search tree and a key, delete the key
   and return the new root */
struct node* deleteNode(struct node* root, int key)
{
    // base case
    if (root == NULL) return root;

    // If the key to be deleted is smaller than the root's key,
    // then it lies in the left subtree
    if (key < root->key)
        root->left = deleteNode(root->left, key);

    // If the key to be deleted is greater than the root's key,
    // then it lies in the right subtree
    else if (key > root->key)
        root->right = deleteNode(root->right, key);

    // If the key equals the root's key, this is the node to be deleted
    else
    {
        // node with only one child or no child
        if (root->left == NULL)
        {
            struct node *temp = root->right;
            free(root);
            return temp;
        }
        else if (root->right == NULL)
        {
            struct node *temp = root->left;
            free(root);
            return temp;
        }

        // node with two children: get the inorder successor (smallest
        // in the right subtree)
        struct node* temp = minValueNode(root->right);

        // Copy the inorder successor's content to this node
        root->key = temp->key;

        // Delete the inorder successor
        root->right = deleteNode(root->right, temp->key);
    }
    return root;
}
```
## Time Complexity

The worst-case time complexity of the search, insert, and delete operations is $\mathcal{O}(h)$, where $h$ is the height of the binary search tree: in the worst case we may have to travel from the root to the deepest leaf. The height of a skewed tree can reach $N$, so search and insert can degrade to $\mathcal{O}(N)$, and building an unbalanced tree of $N$ nodes can take $\mathcal{O}(N^2)$ (for example, when the nodes are inserted in sorted order). With random input, however, the expected time to build the tree is $\mathcal{O}(N \log N)$.

However, you can keep the height logarithmic in the worst case by using a self-balancing binary search tree (which will be taught later). Popular data structures implementing this type of tree include:

- 2-3 tree
- AA tree
- AVL tree
- B-tree
- Red-black tree
- Scapegoat tree
- Splay tree
- Treap
- Weight-balanced tree

docs/graph/heap.md

Lines changed: 138 additions & 0 deletions

---
title: Heap
tags:
  - Heap
  - Priority Queue
---

<figure markdown="span">
![an example max-heap](img/360px-Max-Heap.png)
<figcaption>an example max-heap with 9 nodes</figcaption>
</figure>

A heap is a complete binary tree with $N$ nodes in which the value of every node is greater than (max-heap) or smaller than (min-heap) the values of all nodes in its left and right sub-trees.

In a heap, the highest (or lowest) priority element is always stored at the root. A heap is not a sorted structure and can be regarded as partially ordered: as visible in the diagram, there is no particular relationship among nodes on any given level, even among siblings. Because a heap is a complete binary tree, it has the smallest possible height; a heap with $N$ nodes has $\mathcal{O}(\log N)$ height. A heap is a useful data structure when you need to repeatedly remove the object with the highest (or lowest) priority.

## Implementation

Heaps are usually implemented in an array (fixed-size or dynamic) and do not require pointers between elements. After an element is inserted into or deleted from a heap, the heap property may be violated, and the heap must be rebalanced by internal operations.

The first (or last) element of the array contains the root. The next two elements contain its children, the next four contain the four children of those two nodes, and so on. Thus the children of the node at position $n$ are at positions $2n$ and $2n + 1$ in a one-based array, which allows moving up or down the tree with simple index computations. Balancing a heap is done by sift-up or sift-down operations (swapping elements that are out of order), so we can build a heap from an array without requiring extra memory.

<figure markdown="span">
![an example of a heap as an array](img/Heap-as-array.png)
<figcaption>an example of a heap as an array</figcaption>
</figure>
## Insertion

Add the new element at the end of the heap, then compare it with its parent: depending on whether it is a max-heap or a min-heap (a max-heap is one where parents are always greater), swap it with the parent if the heap property is violated. If a swap happened, repeat the same check at the parent's position.

## Deletion

To delete a node (the root or any other node, it does not matter),

1. Swap the node to be deleted with the last element of the heap, to maintain the complete-tree structure.
2. Delete the last element, which is now the node we wanted to delete in the first place.
3. The swapped element may now be in the wrong place. Check its left and right children; if one of them is greater than the element (or smaller, in a min-heap), swap it with the greatest child (or the smallest, in a min-heap).
4. The current node may still be in the wrong place, so repeat step 3 until it is not smaller than its children (not greater, in a min-heap).

<figure markdown="span" style="width: 40%;">
![](img/heap1.png)
![](img/heap2.png)
<figcaption>an example deletion on a heap structure</figcaption>
</figure>
```py
class BinHeap:
    def __init__(self):
        self.heapList = [0]  # index 0 unused so children sit at 2*i and 2*i+1
        self.currentSize = 0

    def percUp(self, i):
        # sift the element at index i up while it is smaller than its parent
        while i // 2 > 0:
            if self.heapList[i] < self.heapList[i // 2]:
                self.heapList[i], self.heapList[i // 2] = \
                    self.heapList[i // 2], self.heapList[i]
            i = i // 2

    def insert(self, k):
        self.heapList.append(k)
        self.currentSize = self.currentSize + 1
        self.percUp(self.currentSize)

    def percDown(self, i):
        # sift the element at index i down while it is larger than a child
        while (i * 2) <= self.currentSize:
            mc = self.minChild(i)
            if self.heapList[i] > self.heapList[mc]:
                self.heapList[i], self.heapList[mc] = \
                    self.heapList[mc], self.heapList[i]
            i = mc

    def minChild(self, i):
        # index of the smaller child of node i
        if i * 2 + 1 > self.currentSize:
            return i * 2
        if self.heapList[i * 2] < self.heapList[i * 2 + 1]:
            return i * 2
        return i * 2 + 1

    def delMin(self):
        retval = self.heapList[1]
        self.heapList[1] = self.heapList[self.currentSize]
        self.currentSize = self.currentSize - 1
        self.heapList.pop()
        self.percDown(1)
        return retval

    def buildHeap(self, alist):
        # heapify an arbitrary list by sifting down from the middle
        i = len(alist) // 2
        self.currentSize = len(alist)
        self.heapList = [0] + alist[:]
        while i > 0:
            self.percDown(i)
            i = i - 1

bh = BinHeap()
bh.buildHeap([9, 5, 6, 2, 3])

print(bh.delMin())  # 2
print(bh.delMin())  # 3
print(bh.delMin())  # 5
print(bh.delMin())  # 6
print(bh.delMin())  # 9
```

## Complexity

Insertion is $\mathcal{O}(\log N)$, delete-min is $\mathcal{O}(\log N)$, and finding the minimum is $\mathcal{O}(1)$. These operations depend on the heap's height, and since heaps are always complete binary trees, the height is $\mathcal{O}(\log N)$, where $N$ is the number of nodes.

## Priority Queue

Priority queues are a type of container adaptor, specifically designed so that their first element is always the greatest of the elements they contain, according to some strict weak ordering criterion.

While priority queues are often implemented with heaps, they are conceptually distinct from heaps. A priority queue is an abstract concept like "a list" or "a map"; just as a list can be implemented with a linked list or an array, a priority queue can be implemented with a heap or a variety of other methods, such as an unordered array.

```cpp
#include <iostream>   // std::cout
#include <queue>      // std::priority_queue
using namespace std;

int main() {
    priority_queue<int> mypq;

    mypq.push(30);
    mypq.push(100);
    mypq.push(25);
    mypq.push(40);

    cout << "Popping out elements...";
    while (!mypq.empty()) {
        cout << ' ' << mypq.top();  // largest remaining element
        mypq.pop();
    }
    cout << '\n';
    return 0;
}
```

docs/graph/img/360px-Max-Heap.png (20.6 KB)
docs/graph/img/Heap-as-array.png (13.1 KB)
docs/graph/img/binary-tree.png (22.6 KB)
docs/graph/img/binarytree.png (15.8 KB)
docs/graph/img/heap1.png (41.4 KB)
docs/graph/img/heap2.png (20.8 KB)

docs/graph/index.md

Lines changed: 3 additions & 0 deletions

@@ -9,6 +9,9 @@ title: Graph

### [Introduction](introduction.md)
### [Definitions](definitions.md)
### [Representing Graphs](representing-graphs.md)
### [Tree Traversals](tree-traversals.md)
### [Binary Search Tree](./binary-search-tree.md)
### [Heap](heap.md)
### [Depth First Search](depth-first-search.md)
### [Breadth First Search](breadth-first-search.md)
### [Cycle Finding](cycle-finding.md)
