
Commit 76bedf6

Recursion review (#182)
1 parent b56bbde commit 76bedf6

File tree

1 file changed: +4 −4 lines changed


recursion/README.md (+4 −4)
@@ -34,7 +34,7 @@ func fibonacci(n int) int {
 
 When formulating recursive algorithms, it is essential to consider the following four rules of recursion:
 
-1. It is imperative to establish a base case, or else the program will terminate abruptly
+1. It is imperative to establish a base case, or else the program will keep recursing and terminate abruptly after running out of stack memory
 2. The algorithm should progress toward the base case at each recursive call
 3. Recursive calls are presumed effective; thus, traversing every recursive call and performing bookkeeping is unnecessary
 4. Using memoization, a technique that prevents redundant computation by caching previously computed results, can enhance the algorithm's efficiency.
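For reference, the four rules above can be sketched together in Go; the function and variable names here are illustrative, not part of the committed README:

```go
package main

import "fmt"

// fib computes the n-th Fibonacci number with memoization.
// Rule 1: the n < 2 check is the base case that stops the recursion.
// Rule 2: every recursive call shrinks n, moving toward that base case.
// Rule 4: memo caches results so no sub-problem is computed twice.
func fib(n int, memo map[int]int) int {
	if n < 2 {
		return n // base case
	}
	if v, ok := memo[n]; ok {
		return v // cached result, no redundant computation
	}
	memo[n] = fib(n-1, memo) + fib(n-2, memo)
	return memo[n]
}

func main() {
	fmt.Println(fib(10, map[int]int{})) // prints 55
}
```

Without the `n < 2` check, the calls would never bottom out and the program would crash once the stack is exhausted, which is exactly the failure mode rule 1 guards against.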
@@ -45,9 +45,9 @@ Recursions are often inefficient in both time and space complexity. The number o
 
 There are a few different ways of determining the time complexity of recursive algorithms:
 
-1. Recurrence Relations: This approach involves defining a recurrence relation that expresses the algorithm's time complexity in terms of its sub-problems' time complexity. For example, for the recursive Fibonacci algorithm, the recurrence relation is T(n) = T(n-1) + T(n-2) + O(1), where T(n) represents the time complexity of the algorithm for an input of size n.
-2. Recursion Trees: This method involves drawing a tree to represent the algorithm's recursive calls. The algorithm's time complexity can be calculated by summing the work done at each level of the tree. For example, for the recursive factorial algorithm, each level of the tree represents a call to the function with a smaller input size, and the work done at each level is constant.
-3. Master Theorem: This approach is a formula for solving recurrence relations that have the form T(n) = aT(n/b) + f(n). The Master Theorem can be used to quickly determine the time complexity of some [Divide-and-conquer](../dnc) algorithms.
+1. Recurrence Relations: This approach involves defining a recurrence relation that expresses the algorithm's time complexity in terms of its sub-problems' time complexity. For example, for the recursive Fibonacci algorithm, the recurrence relation is T(n) = T(n-1) + T(n-2) + O(1), where T(n) represents the time complexity of the algorithm for an input of size n
+2. Recursion Trees: This method involves drawing a tree to represent the algorithm's recursive calls. The algorithm's time complexity can be calculated by summing the work done at each level of the tree. For example, for the recursive factorial algorithm, each level of the tree represents a call to the function with a smaller input size, and the work done at each level is constant
+3. Master Theorem: This approach is a formula for solving recurrence relations that have the form T(n) = aT(n/b) + f(n). The Master Theorem can be used to quickly determine the time complexity of some [Divide-and-conquer](../dnc) algorithms
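The recurrence-relation approach can be made concrete by counting calls: a naive Fibonacci's call count obeys the same recurrence as its running time, C(n) = C(n-1) + C(n-2) + 1. This Go sketch (the counter and names are illustrative) demonstrates it:

```go
package main

import "fmt"

// calls counts how many times naiveFib is invoked; the count
// satisfies the same recurrence as the running time:
// C(n) = C(n-1) + C(n-2) + 1, with C(0) = C(1) = 1.
var calls int

func naiveFib(n int) int {
	calls++
	if n < 2 {
		return n
	}
	// two recursive sub-problems plus O(1) work per call
	return naiveFib(n-1) + naiveFib(n-2)
}

func main() {
	result := naiveFib(10)
	fmt.Println(result, calls) // 55 177: the call count grows exponentially in n
}
```

Unrolling C(n) shows why the naive version is exponential while the memoized one is linear: the same sub-problems are recomputed over and over.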
 
 The space complexity of recursive calls is affected by having to store a copy of the state and variables in the stack with each recursion.
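The stack cost of recursion can also be observed directly. This sketch (the depth counters are hypothetical instrumentation, not part of the README) tracks the peak recursion depth of a recursive factorial:

```go
package main

import "fmt"

var depth, maxDepth int

// factorial records the deepest point reached by the recursion.
// Each pending call holds one stack frame (its copy of n plus
// bookkeeping), so peak stack usage is proportional to n.
func factorial(n int) int {
	depth++
	if depth > maxDepth {
		maxDepth = depth
	}
	defer func() { depth-- }() // frame popped when this call returns
	if n <= 1 {
		return 1 // base case
	}
	return n * factorial(n-1)
}

func main() {
	fmt.Println(factorial(5), maxDepth) // 120 5: five frames were live at the peak
}
```

This is why a recursive algorithm of depth n typically costs O(n) space even when an iterative equivalent would run in O(1).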
