Question
Recursion vs. Iteration

All of you covered iteration back in CS I & II, and we covered recursion in this course, focusing on it in one module and seeing it used at various times in later modules. The main advantage of using recursion is that it works extremely well for problems that are recursive in nature. Recursion tends to provide very elegant, minimal code for those problems, while iterative solutions tend to be less elegant and much more verbose. The minimal code required by recursion fits very nicely with a minimalist philosophy of algorithm design, except for a couple of pitfalls.

First, if care is not taken, a recursive algorithm can quickly reach exponential running time by performing redundant computations. We saw an example of this with the recursive Fibonacci algorithm. Although the technique of dynamic programming can be used to offset this, dynamic programming requires additional space to store previously computed values.

Second, the stack space used to store the activation records of recursive calls is limited, which limits the depth to which recursion can reach. An exception to this is tail recursion, a special case of recursion in which no instructions follow the recursive call. With tail recursion, it is possible to reuse the current activation record instead of further deepening the stack by adding new activation records. Some compilers, in fact, are capable of transforming tail recursion into a loop. Whether either of these optimizations is applied cannot always be controlled, however, since the decision is usually made by the compiler.
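The redundant-computation pitfall and its dynamic-programming fix can be sketched with the Fibonacci example mentioned above (the function names here are my own; the memoized version uses Python's standard-library `lru_cache` as the table of previously computed values):

```python
from functools import lru_cache

def fib_naive(n):
    """Naive recursive Fibonacci: exponential time, because
    fib_naive(n - 2) is recomputed inside fib_naive(n - 1)."""
    if n < 2:                      # base cases checked first
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Memoized Fibonacci: each subproblem is computed once, at the
    cost of extra space to store previously computed values."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```

With memoization, `fib_memo(30)` makes only a linear number of distinct calls, while `fib_naive(30)` makes over a million.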
The big question, then, is: when should recursion be used? As with most questions of this nature, the answer isn't always clear-cut. Some general guidelines that prove helpful are:
If building an iterative solution to a problem is too difficult, use a recursive solution, at least for the time being. If an iterative solution can be worked out later, it can be swapped in for the recursive one.
If you know the number of recursive calls will be small enough to avoid overflowing the stack space, a recursive solution might be appropriate. Keep in mind, though, that a recursive solution can still take a while to hit the base case, with the call stack fluctuating in size as the recursive algorithm proceeds. One of the Fibonacci labs I presented back in Module 2 illustrates how the number of records on the call stack can fluctuate.
When you write a recursive algorithm, always try to check for the base cases first and the recursive cases last. If possible, write the algorithm so that it employs tail recursion. If the code is compiled by a compiler capable of optimizing tail recursion, you can write the algorithm recursively yet get iterative performance. If the base cases are checked last, tail recursion is impossible.
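The tail-recursive shape described above can be sketched as follows. Note that this is illustrative only: CPython does not perform tail-call optimization, so the recursive version still consumes stack space; the loop shows what a tail-call-optimizing compiler could produce.

```python
def factorial_tail(n, acc=1):
    """Tail-recursive factorial: the base case is checked first, and
    the recursive call is the last action -- its result is returned
    unchanged, so no work remains after the call."""
    if n <= 1:                             # base case checked first
        return acc
    return factorial_tail(n - 1, acc * n)  # tail position

def factorial_loop(n):
    """The loop a tail-call-optimizing compiler could produce: the
    accumulator replaces the stack of activation records."""
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc
```

The accumulator parameter is what makes the call a tail call: the multiplication happens before the recursive call rather than after it returns.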
Explanation / Answer
Iteration vs. Recursion
When a recursive method is called with a base case, the method simply returns a result. When it is called with a more complex problem, the method divides the problem into two or more conceptual pieces: a piece the method knows how to handle and a slightly smaller version of the original problem. Because this new problem resembles the original, the method launches a recursive call to work on the smaller problem.
For recursion to terminate, each time the method calls itself with a slightly simpler version of the original problem, the sequence of smaller and smaller problems must converge on the base case. When the method recognizes the base case, it returns a result to the previous method call, and a sequence of returns ensues all the way up the line until the original call eventually returns the final result.
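The divide-and-return pattern just described can be sketched with a hypothetical recursive sum (the function name is my own):

```python
def recursive_sum(values):
    """Sum a list by splitting it into a piece we know how to handle
    (the first element) and a slightly smaller version of the original
    problem (the rest of the list)."""
    if not values:                        # base case: empty list
        return 0
    head, *rest = values
    return head + recursive_sum(rest)     # returns propagate back up
```

Each call shortens the list by one element, so the sequence of problems converges on the empty-list base case.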
Both iteration and recursion are based on a control structure: iteration uses a repetition structure; recursion uses a selection structure.
Both iteration and recursion involve repetition: iteration explicitly uses a repetition structure; recursion achieves repetition through repeated method calls.
Iteration and recursion each involve a termination test: iteration terminates when the loop-continuation condition fails; recursion terminates when a base case is recognized.
Iteration and recursion can both proceed indefinitely: an infinite loop occurs with iteration if the loop-continuation test never becomes false; infinite recursion occurs if the recursion step does not reduce the problem in a way that converges on the base case.
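The points of comparison above can be sketched with two versions of the same computation (a hypothetical power function; the names are my own):

```python
def power_iterative(base, exp):
    """Repetition structure: repeat until the loop-continuation
    condition fails."""
    result = 1
    while exp > 0:              # termination: condition becomes false
        result *= base
        exp -= 1
    return result

def power_recursive(base, exp):
    """Selection structure: choose between the base case and the
    recursive step; each call adds an activation record."""
    if exp == 0:                # termination: base case recognized
        return 1
    return base * power_recursive(base, exp - 1)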
Recursion repeatedly invokes the mechanism, and consequently the overhead, of method calls. This can be expensive in both processor time and memory space.