Now, we're going to look at a performance pitfall with recursion that's often ignored, but it's very important, and it will also lead us to another important programming scheme. Let's start with the Fibonacci numbers. We've used them as an example before, and you're certainly familiar with them. It's the sequence defined by F(0) = 0, F(1) = 1, and from then on, each number in the sequence is computed by adding the previous two. So one plus one is two, one plus two is three, two plus three is five, three plus five is eight, five plus eight is 13, and so forth. Those are the Fibonacci numbers.

Now, they're actually widely found in art and architecture, and we'll look at a couple of applications. Fibonacci originally studied the sequence as a model of rabbit populations, and we'll see plenty of other examples; it's also very interesting to study just within discrete mathematics. For example, it's known that the ratio between consecutive Fibonacci numbers approaches the golden ratio, (1 + sqrt(5))/2, about 1.618. Renaissance and classical art and architecture incorporate this golden ratio, and we'll see some examples of that. Here's another way of looking at it: if you keep adding a square whose side is the next Fibonacci number, you can arrange the squares in this kind of spiral sequence, and if you draw a quarter circle through every square, you get an interesting curve. There are lots of facts that I'm just throwing up on this slide to show you the ubiquity of the Fibonacci numbers. The Parthenon exhibits those ratios, and they appear in nature: a nautilus shell, a sunflower, a galaxy, the Mona Lisa herself. Here is our friend Pascal's triangle: if you sum its entries along the shallow diagonals, you get the Fibonacci numbers. Kind of amazing. Even Darth Vader involves the golden ratio and the Fibonacci numbers. So, a very interesting sequence.

In the present context, what we're interested in is computing them. Somebody might say, "Well, I've got a computer. What's the 60th Fibonacci number?" and a naive programmer might say, "Okay, that's obvious. You can write a recursive program for that," and come up with a program like the one sketched below. The recursive program will use long, because the result is probably going to be a big number: if n = 0, return zero; if n = 1, return one; otherwise, recursively compute F(n-1) and F(n-2) and add them. Looks pretty good. Take the number we want from the command line, and then just call this recursive function. Sure enough, it works pretty well at first: we get numbers that we recognize, and the 12th one is 144. So let's go for it: what's the 50th one? On your computer, it's going to take a little while to compute that one, and if you try 60, now the programmer's going to say, "Wait a minute, there's something wrong with my computer." There's nothing wrong with the computer; what's wrong is not understanding the cost of recursion properly.
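Here's a sketch of the naive program just described, in Java as used throughout the course (the class name and the use of Integer.parseInt are my choices, not necessarily the lecture's):

    public class Fibonacci {
        // Naive recursive computation of the nth Fibonacci number.
        public static long fib(int n) {
            if (n == 0) return 0;                  // base case F(0) = 0
            if (n == 1) return 1;                  // base case F(1) = 1
            return fib(n - 1) + fib(n - 2);        // two recursive calls per invocation
        }

        public static void main(String[] args) {
            int n = Integer.parseInt(args[0]);     // take n from the command line
            System.out.println(fib(n));
        }
    }

For small n this returns the familiar values immediately; the trouble only shows up around n = 50.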
So, let's look at the recursive call tree for the Fibonacci numbers, say when we're computing F(6). To compute F(6), we have to compute F(5) and F(4), and so forth; that's what the tree looks like. Now let's take a look at the recursive call tree for F(60). We'll define C(n) to be the number of times that F(n) is called. If we're computing F(60), then F(60) gets called just once; note that 1 is F(1), the first Fibonacci number. F(59) also gets called just once; that's F(2), the second Fibonacci number. F(58) gets called twice, and 2 is F(3). F(57) gets called three times, and 3 is F(4). Once you see the pattern, you can actually prove that the number of times F(n) is called is C(n) = F(61 - n).

This count goes up with the Fibonacci numbers, so we have an increasing sequence of Fibonacci numbers counting the number of times each value is called. When we get down to F(0), it gets called F(61) times, and that's a problem, because F(61) is a huge number. It's easy to see why: if we've already computed F(58), at whatever cost, over here we're computing it all over again. Similarly, it's exponentially wasteful to recompute all the other values. That's trillions of calls on F(0), not to mention the calls on all the other ones. These are values that we already know, and we threw that information away. Remember, F(n) grows as the golden ratio to the nth power, so we make an exponential number of calls just to return the value F(0) = 0. Obviously, that's not an efficient way to proceed, and if you engage in waste like this, you will not be able to solve a large problem. Giving this lecture in the 1970s, on a computer that could do maybe a million instructions a second, it would take minutes to compute F(30), hours to compute F(50), and years to compute F(60). Nowadays, even though we have machines that are 10,000 times faster, that just means we can get a little bit further. We used to say in the 70s that this program's not going to compute F(60) before you graduate; now we just say it's not going to compute F(80) before you graduate. So the point is, you don't want to engage in exponential waste.

Now, the thing is, it's very easy to avoid, with a technique called memoization: we just remember all the computed values. We use an array memo, and if we've already remembered a value, we just return it; otherwise, we compute it, remember it, and then return it. This is a general technique that can be applied to lots of programs. We do know that the numbers are going to get huge, and F(100) is not going to fit in a long, so that's a bound on the size of the array. Now, if we want to compute F(n), we can still have our base cases for zero and one, but otherwise we look to see if we've computed that value before. If we have not, the memo entry for it will be zero, so we go ahead and do the recursive calls and save away the value. Then, when we're done, we return it. (A sketch of this version appears below.) That simple change all of a sudden gives an extremely fast program. It's a very easy way to avoid exponential waste: we can do 50, 60, or 80 with that simple change, so you want to know that.

Now, in some of the cases that we did, like the towers of Hanoi, the H-tree, and the Brownian bridge, the running time was exponential in our parameter n, but we weren't wasting anything: we actually wanted to compute all those pixels or all those moves. But in a case like this, which is also typical, you cannot assume that your computer is fast enough to beat exponential waste. This is also important because it's a simple example of a very important programming technique called dynamic programming. It's related to recursion; it's like recursion with memoization, or doing the same computation bottom-up without the recursion at all, and that's what we're going to look at next.
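Here's a sketch of the memoized version just described (again in Java; the class name is mine, the array size of 100 follows the "F(100) won't fit in a long" bound above, and zero as the "not yet computed" marker follows the description):

    public class FibonacciMemo {
        // memo[n] holds F(n) once it has been computed; 0 means "not yet computed".
        // F(92) is the largest Fibonacci number that fits in a long, so 100 entries is plenty.
        private static long[] memo = new long[100];

        public static long fib(int n) {
            if (n == 0) return 0;                      // base case F(0) = 0
            if (n == 1) return 1;                      // base case F(1) = 1
            if (memo[n] == 0)                          // not computed before?
                memo[n] = fib(n - 1) + fib(n - 2);     // compute it once and remember it
            return memo[n];                            // every later call is just a lookup
        }

        public static void main(String[] args) {
            int n = Integer.parseInt(args[0]);         // take n from the command line
            System.out.println(fib(n));
        }
    }

Each value F(n) is now computed at most once, so the number of recursive calls is linear in n instead of exponential.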
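And as a preview of the bottom-up style just mentioned, here's a minimal sketch (my own illustration, not the lecture's code) that computes the same values with a loop and no recursion at all:

    public class FibonacciBottomUp {
        public static long fib(int n) {
            long prev = 0, curr = 1;              // F(0) and F(1)
            if (n == 0) return prev;
            for (int i = 2; i <= n; i++) {        // build up from the base cases
                long next = prev + curr;          // F(i) = F(i-1) + F(i-2)
                prev = curr;
                curr = next;
            }
            return curr;
        }

        public static void main(String[] args) {
            int n = Integer.parseInt(args[0]);
            System.out.println(fib(n));
        }
    }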