

Summary

Recursion is a very powerful computing tool. Used appropriately, it can simplify the solution of a problem, often resulting in shorter, more easily understood source code. As usual in computing, tradeoffs are necessary: Recursive functions are often less efficient, in terms of both time and space, due to the overhead associated with many levels of function calls. How great this cost is depends on the computer system and the compiler.

A recursive solution to a problem must have at least one base case; that is, a case where the solution is derived nonrecursively. Without a base case, the function will recurse forever (or at least until the computer runs out of memory). The recursive solution also has one or more general cases that include recursive calls to the function. The recursive calls must involve a "smaller caller." One (or more) of the actual parameter values must change in each recursive call to redefine the problem to be smaller than it was on the previous call. Thus each recursive call leads the solution of the problem toward the base case(s).
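
For example (a minimal sketch, not taken from the chapter), a factorial function shows both parts: the base case returns its result directly, and the general case makes a recursive call with a smaller argument, so every chain of calls eventually reaches the base case.

int Factorial(int number)
{
    if (number == 0)          // Base case: solved nonrecursively
        return 1;
    else                      // General case: "smaller caller"-the argument shrinks
        return number * Factorial(number - 1);
}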

A typical implementation of recursion involves the use of a stack. Each call to a function generates an activation record to contain its return address, parameters, and local variables. The activation records are accessed in a last in, first out manner. Thus a stack is the natural choice of data structure.

Recursion can be supported by systems and languages that use dynamic storage allocation. The function parameters and local variables are not bound to addresses until an activation record is created at run time. Thus multiple copies of the intermediate values of recursive calls to the function can be supported, as new activation records are created for them.

With static storage allocation, in contrast, a single location is reserved at compile time for each parameter and local variable of a function. No place is provided to store any intermediate values calculated by repeated nested calls to the same function. Therefore, systems and languages with only static storage allocation cannot support recursion.

When recursion is not possible or appropriate, a recursive algorithm can be implemented nonrecursively by using a looping structure and, in some cases, by pushing and popping relevant values onto a stack. This programmer-controlled stack explicitly replaces the system's run-time stack. While such nonrecursive solutions are often more efficient in time and space, they usually sacrifice some of the elegance of the recursive solution.
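
As an illustrative sketch (using the standard library stack rather than any class defined in the text), printing the elements of a linked list in reverse order can be done nonrecursively: a loop pushes each value onto a programmer-controlled stack, and a second loop pops and prints them.

#include <iostream>
#include <stack>

struct Node
{
    int   info;
    Node* next;
};

// Nonrecursive "print in reverse": a loop plus an explicit stack
// replaces the system's run-time stack.
void PrintReversed(Node* list)
{
    std::stack<int> values;
    while (list != nullptr)          // Push the values in list order
    {
        values.push(list->info);
        list = list->next;
    }
    while (!values.empty())          // Pop last in, first out
    {
        std::cout << values.top() << std::endl;
        values.pop();
    }
}

Because the stack pops values in the opposite order from which they were pushed, the output matches what the recursive version would print, without any recursive calls.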


