recursion/README.md
@@ -34,7 +34,7 @@ func fibonacci(n int) int {
When formulating recursive algorithms, it is essential to consider the following four rules of recursion:
- 1. It is imperative to establish a base case, or else the program will terminate abruptly
+ 1. It is imperative to establish a base case, or else the program keeps recursing and terminates abruptly after running out of stack memory
2. The algorithm should progress toward the base case at each recursive call
3. Recursive calls are presumed effective; thus, traversing every recursive call and performing bookkeeping is unnecessary
4. Memoization, a technique that prevents redundant computation by caching previously computed results, can enhance the algorithm's efficiency
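The four rules above can be sketched in Go. This is a minimal illustration (the function name and signature here are my own, not the repository's `fibonacci`), showing a base case, progress toward it, and memoization:

```go
package main

import "fmt"

// fib computes the n-th Fibonacci number with memoization.
// The n < 2 check is the base case (rule 1), every call recurses
// on strictly smaller inputs (rule 2), and memo caches previously
// computed results to avoid redundant work (rule 4).
func fib(n int, memo map[int]int) int {
	if n < 2 {
		return n // base case: fib(0) = 0, fib(1) = 1
	}
	if v, ok := memo[n]; ok {
		return v // cached result, no recomputation
	}
	memo[n] = fib(n-1, memo) + fib(n-2, memo)
	return memo[n]
}

func main() {
	fmt.Println(fib(10, map[int]int{})) // 55
}
```

With the cache, each value of n is computed once, so the run time drops from exponential to linear in n.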
@@ -45,9 +45,9 @@ Recursions are often inefficient in both time and space complexity. The number o
There are a few different ways of determining the time complexity of recursive algorithms:
- 1. Recurrence Relations: This approach involves defining a recurrence relation that expresses the algorithm's time complexity in terms of its sub-problems' time complexity. For example, for the recursive Fibonacci algorithm, the recurrence relation is T(n) = T(n-1) + T(n-2) + O(1), where T(n) represents the time complexity of the algorithm for an input of size n.
- 2. Recursion Trees: This method involves drawing a tree to represent the algorithm's recursive calls. The algorithm's time complexity can be calculated by summing the work done at each level of the tree. For example, for the recursive factorial algorithm, each level of the tree represents a call to the function with a smaller input size, and the work done at each level is constant.
- 3. Master Theorem: This approach is a formula for solving recurrence relations that have the form T(n) = aT(n/b) + f(n). The Master Theorem can be used to quickly determine the time complexity of some [Divide-and-conquer](../dnc) algorithms.
+ 1. Recurrence Relations: This approach involves defining a recurrence relation that expresses the algorithm's time complexity in terms of its sub-problems' time complexity. For example, for the recursive Fibonacci algorithm, the recurrence relation is T(n) = T(n-1) + T(n-2) + O(1), where T(n) represents the time complexity of the algorithm for an input of size n
+ 2. Recursion Trees: This method involves drawing a tree to represent the algorithm's recursive calls. The algorithm's time complexity can be calculated by summing the work done at each level of the tree. For example, for the recursive factorial algorithm, each level of the tree represents a call to the function with a smaller input size, and the work done at each level is constant
+ 3. Master Theorem: This approach is a formula for solving recurrence relations that have the form T(n) = aT(n/b) + f(n). The Master Theorem can be used to quickly determine the time complexity of some [Divide-and-conquer](../dnc) algorithms
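The recurrence T(n) = T(n-1) + T(n-2) + O(1) can be checked empirically by counting the calls a naive Fibonacci makes. This is a sketch of my own (not the repository's implementation); the call count satisfies calls(n) = calls(n-1) + calls(n-2) + 1, mirroring the recurrence:

```go
package main

import "fmt"

// naiveFib returns the n-th Fibonacci number together with the
// total number of calls made. The call count obeys
// calls(n) = calls(n-1) + calls(n-2) + 1, the same shape as the
// recurrence T(n) = T(n-1) + T(n-2) + O(1), and grows exponentially.
func naiveFib(n int) (value, calls int) {
	if n < 2 {
		return n, 1 // base case: a single call, no recursion
	}
	v1, c1 := naiveFib(n - 1)
	v2, c2 := naiveFib(n - 2)
	return v1 + v2, c1 + c2 + 1
}

func main() {
	for _, n := range []int{10, 20} {
		v, c := naiveFib(n)
		fmt.Printf("fib(%d) = %d in %d calls\n", n, v, c)
	}
}
```

Running this shows the call count roughly doubling as n grows, which is the exponential blow-up the recurrence predicts.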
Recursion also affects space complexity: each recursive call pushes a new frame onto the stack, storing a copy of the function's state and local variables until that call returns.
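To make the stack cost concrete, here is a small sketch (the explicit depth parameter is illustrative, not part of the original code) threading a depth counter through a recursive factorial; the maximum depth, and hence the number of live stack frames, grows linearly with n:

```go
package main

import "fmt"

// factorial returns n! along with the maximum recursion depth
// reached. Each call keeps its own copy of n on the stack until
// the base case returns, so the stack holds n frames at the peak.
func factorial(n, depth int) (result, maxDepth int) {
	if n <= 1 {
		return 1, depth // base case: the deepest frame
	}
	r, d := factorial(n-1, depth+1)
	return n * r, d
}

func main() {
	r, d := factorial(5, 1)
	fmt.Println(r, d) // 120 5
}
```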