In this column, we will discuss memoization – a technique to speed up recursive computations.

Recursion

In mathematics and computer science, recursion is a way of specifying something (usually a mathematical object or part of a computer program) by reference to itself. More precisely (and to dispel the appearance of circularity in the definition), "complicated" instances are defined in terms of "simpler" instances, and the "simplest" instances are given explicitly. One interesting "application" of recursion is the definition of the set of natural numbers. We can define a natural number recursively as follows:

  1. 0 is a natural number.
  2. If n is a natural number, then n + 1 is a natural number.
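This definition translates directly into code. The following small sketch (the function name is my own, purely for illustration) decides whether an integer is "natural" by following the two rules literally: a number is natural if it is 0, or if the number one below it is natural.

```cpp
// Illustrative sketch: membership in the naturals, following the
// recursive definition literally.
bool is_natural(long n) {
    if (n < 0) return false;   // never produced by the definition
    if (n == 0) return true;   // base case: 0 is natural by definition
    return is_natural(n - 1);  // n = m + 1 is natural if m is natural
}
```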
So, 0 is natural by definition. 1 is natural, because 0 is natural and 1 = 0 + 1. 2 is natural because 2 = 1 + 1 and 1 is natural, and so on.

Fibonacci numbers

A canonical example of recursion is the computation of Fibonacci numbers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597, 2584, 4181, 6765, 10946 ... This sequence can be defined recursively as follows (F(n) is the n-th Fibonacci number):

  F(0) = 0
  F(1) = 1
  F(n) = F(n - 1) + F(n - 2) for n > 1
You can easily follow this definition and generate some Fibonacci numbers for yourself – you will get precisely the sequence given above. How would we generate this sequence in code? It's quite simple – recursive definitions are easily reflected in recursive function calls. Here is the C++ code that returns any Fibonacci number (constrained by execution time and the limitations of the C++ long type):

  long fib_rec(long index)
  {
      if (index < 2)
          return index;
      else
          return fib_rec(index - 1) + fib_rec(index - 2);
  }

Note how gracefully the mathematical definition is translated to the code of fib_rec. This is the beauty and elegance of recursion. Unfortunately, everything has its price. While often being the most natural way to express algorithms, recursion can suffer from performance problems. For example, finding the 40th Fibonacci number (which is, by the way, 102,334,155) using this routine takes about 4 seconds on my machine. The 42nd number (267,914,296) takes 11 seconds to compute, and the time grows very quickly (the 45th, which is 1,134,903,170, takes 47 seconds, etc.). One reason for this is the cost of function calls (of which there are many in the recursive solution). When a function is called, there is always a certain amount of overhead. For small functions, this overhead can be comparable to the time required to execute the function itself. This results in a performance hit. However, this is not the main reason for recursion being slow for the computation of Fibonacci numbers. The principal cause, in this case, is the vast amount of repetition involved. To demonstrate this, let's try to trace a sample execution of the recursive computation fib_rec, taking a call with index set to 5 as an example:

  fib_rec(5)
      fib_rec(4)
          fib_rec(3)
              fib_rec(2)
                  fib_rec(1)
                  fib_rec(0)
              fib_rec(1)
          fib_rec(2)
              fib_rec(1)
              fib_rec(0)
      fib_rec(3)
          fib_rec(2)
              fib_rec(1)
              fib_rec(0)
          fib_rec(1)

When fib_rec(5) is called, it calls fib_rec with 4 and fib_rec with 3.
Each of those makes the appropriate calls, et cetera. What you see above is the complete call tree that results from the call to fib_rec(5). You can generate it yourself by inserting a tracing printout at the beginning of the function. Now, do you notice anything funny about this call tree? It shouldn't be hard to spot the scandalous number of times the same calls are made. For instance, the call fib_rec(1) is made 5 times. The result of fib_rec(1) surely doesn't change between calls (the first Fibonacci number is, by definition, 1), so why is there a need for so much repetition? This, in fact, is the reason for the unfortunate inefficiency of recursive algorithms for many computations. So can we really write nice recursive algorithms and not be daunted by their performance problems? The answer to this question is fortunately positive!
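As a first taste of where this is going, the repetition can be eliminated by caching each result the first time it is computed and reusing it afterwards – which is precisely what memoization means. The following sketch (the cache layout and names are my own choices, not a prescribed implementation) stores computed Fibonacci numbers in a table, so each index is computed only once:

```cpp
#include <unordered_map>

// Sketch of a memoized Fibonacci: a cache maps index -> F(index).
// Each index is computed at most once; later calls are table lookups,
// so the exponential repetition of fib_rec disappears.
long fib_memo(long index)
{
    static std::unordered_map<long, long> cache;
    if (index < 2)
        return index;                       // base cases need no caching
    auto it = cache.find(index);
    if (it != cache.end())
        return it->second;                  // already computed: reuse it
    long result = fib_memo(index - 1) + fib_memo(index - 2);
    cache[index] = result;                  // remember for next time
    return result;
}
```

With this version, fib_memo(40) returns 102,334,155 and fib_memo(45) returns 1,134,903,170 essentially instantly, in contrast to the seconds-long runtimes quoted above for fib_rec.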
