The iterative, discrete-time view of chaos is powerful because it allows us to see how a system evolves step by step. Most nonlinear systems are chaotic only under certain circumstances, and a discrete-time analysis can help pin down what those circumstances are. An example is the flow of water, or of any fluid. As long as it is allowed to flow at a reasonable speed along a course free of obstacles, fluid flow is nice and predictable. As the speed of flow increases, however, or as obstacles are added in its path, the flow starts to become somewhat unpredictable. Eventually, under certain conditions, the fluid no longer behaves in a predictable way at all; this condition is called turbulence.
Turbulence is a good deal more complicated and less understood than classic chaos, but the point is that our system changes its qualitative behavior depending on the specific parameters we assign to it. We expect that using different starting values will give us different results, but we also naturally tend to expect that those results, while quantitatively different, will be qualitatively similar. We might expect that doubling the mass of a moving particle would halve its acceleration, given the same amount of force. We would probably also expect the particle to still get to where it was headed initially, although it might take longer. In a chaotic system, however, doubling the mass might cause the particle to reverse direction, stop, oscillate between two or more values, or exhibit any number of qualitatively different behaviors.
The point at which a system changes from one fundamental type of behavior to another is called a bifurcation. An important question then is "for what values of our system's parameters does bifurcation occur?" Applied to our system of moving water, the question is "at what speed does the water flow become turbulent?" Answering this question and others like it is of great importance if you are designing boats, testing aircraft, trying to understand the fluctuations of the stock market, or trying to predict how populations of wild animals rise and fall.
Let's take a look at one specific iterative function, or map, to see bifurcation and chaos in action. The function we will investigate, often called the logistic map, represents a highly simplified model of population fluctuations. It takes an initial population level and tells you what the population will be after some fixed interval of time, or time step. The time step can be as long or as short as you care to make it, depending on what species you are studying. For our purposes, we'll just make it some arbitrary quantity representing a generation. The equation for the population p_{n+1} one time step after a population p_n is:
p_{n+1} = r p_n (1 - p_n)
In this equation, the parameters that we can modify are the growth rate, r, and the initial population, p_0. In particular, we would like to know how the growth rate affects the overall behavior of the system.
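In code, iterating the map is nothing more than a loop. Here is a minimal sketch in Python (the function names are our own, chosen for illustration):

```python
def logistic_step(p, r):
    """One time step of the logistic map: p_{n+1} = r * p_n * (1 - p_n)."""
    return r * p * (1.0 - p)

def iterate(p0, r, n):
    """Population after n time steps, starting from initial population p0."""
    p = p0
    for _ in range(n):
        p = logistic_step(p, r)
    return p
```

Feeding the output of one step back in as the input to the next is exactly the "iterative, discrete-time" view described above.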
For example, if r is less than 1, p_n goes to zero as n goes to infinity. This means that the population diminishes to the point of extinction.
If r is between 1 and 3, the population eventually settles at some steady-state value. Although the population may wobble a bit over short time spans, the long-term behavior after many iterations is for the system to settle on one population size.
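That steady-state value can be computed directly: a population that no longer changes must satisfy p = r p (1 - p), and solving for p gives p = 1 - 1/r. A quick numerical check (a sketch; the function names are our own):

```python
def fixed_point(r):
    """Steady state of the logistic map, found by solving p = r*p*(1 - p)."""
    return 1.0 - 1.0 / r

def iterate(p0, r, n):
    """Population after n time steps of p -> r*p*(1 - p)."""
    p = p0
    for _ in range(n):
        p = r * p * (1.0 - p)
    return p

# For r between 1 and 3, a long run of the map lands on 1 - 1/r.
```

For instance, with r = 2.5 the map settles on 1 - 1/2.5 = 0.6 regardless of the starting population (as long as it starts between 0 and 1).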
If we let r = 3, we see a surprising change in the system's behavior. Instead of settling on one value, the population oscillates between two different values forever. For our population this would mean, for example, that boom years are followed directly by bust years and vice versa. This change in behavior is a bifurcation from steady-state values to oscillations of period 2. We say "period 2" because it takes two iterations to return to the original value.
As r increases beyond 3, more interesting behavior emerges. We start to see more bifurcations, and they become more frequent. Each time, the period of oscillation doubles.
The population oscillates first with period 2 when r = 3. When r = 3.449, the period doubles to period 4, indicating that it now takes four iterations for the population to return to a value that it has had before. The period continues to double from 4 to 8 to 16, each time at a successively smaller increment of increase in r. Eventually, when r = 3.569946, the period becomes infinite. This means that the population fluctuates wildly, never regularly returning to any previous value.
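We can watch these doublings numerically: run the map long enough for the transient to die away, then count how many distinct values the orbit keeps visiting. A sketch (the function and its parameter choices are ours):

```python
def attractor(r, p0=0.2, transient=5000, samples=400, tol=1e-6):
    """Iterate the logistic map past its transient, then collect the
    distinct values the orbit still visits; the count is the period."""
    p = p0
    for _ in range(transient):
        p = r * p * (1.0 - p)
    values = []
    for _ in range(samples):
        p = r * p * (1.0 - p)
        if not any(abs(p - v) < tol for v in values):
            values.append(p)
    return sorted(values)
```

At r = 3.2 this returns two values, and at r = 3.5 it returns four, mirroring the period doublings described above.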
These period-doubling bifurcations are quite fascinating. Why does a population that is stable when r = 2.999999 start swinging between two different values when r = 3? And why do these doublings occur more and more rapidly as the r-value approaches the magic number of 3.569946? Furthermore, what happens if we let r get bigger than 3.569946?
It is tempting to think that the population simply becomes more and more chaotic as r increases, but the actual behavior is much more varied than that: the logistic map shows a whole range of behaviors. Above the magic number, the population becomes chaotic, never settling onto a fixed value and never falling into any periodicity. This is the same sort of behavior that we saw earlier in Lorenz's weather simulations.
There are certain "windows" of r-values, above the magic number, that give oscillating populations. It seems that the system bifurcates both into and out of chaos, depending on what r-values one chooses.
We can see the global behavior of the logistic map by looking at what is known as an orbit diagram. This type of diagram is different from the ones we have previously seen in this unit. Those previous diagrams showed how a population evolved in time, step by step. An orbit diagram shows how the long-term behavior of the system changes depending on the r-value. It's a way to see the long-range, global behavior of the entire system at a glance.
Looking at this diagram, we see r represented along the horizontal axis and a general p-value along the vertical axis. This tells us which values of p are accessible for a given value of r. For r between 1 and 3, p settles on one value (not shown in the graph). At 3, the graph bifurcates into an oscillation between two values. Our graph picks up at r = 3.4, showing the two values between which p oscillates. A little further along, at a little more than 3.4, we see each of those values bifurcate into two more values. This indicates that the population now cycles among four different values before it returns to where it started.
A little further along, we can see the system double, double again, and then double yet again. Eventually, around r = 3.6, it gets really messy. This is chaos, but notice that it does not last forever. As r continues to increase, we see the messiness clear up, at least for small windows of clean oscillations.
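The orbit diagram itself is straightforward to generate: for each r, iterate past the transient and record the values that remain. A sketch (the helper names are our own; plotting the resulting (r, p) pairs with any graphing library reproduces the diagram):

```python
def orbit_slice(r, p0=0.2, transient=5000, samples=200):
    """The long-run p-values at growth rate r: one vertical slice
    of the orbit diagram."""
    p = p0
    for _ in range(transient):
        p = r * p * (1.0 - p)
    out = []
    for _ in range(samples):
        p = r * p * (1.0 - p)
        out.append(p)
    return out

def count_distinct(values, tol=1e-6):
    """Number of distinct values visited, up to a tolerance."""
    reps = []
    for v in values:
        if not any(abs(v - q) < tol for q in reps):
            reps.append(v)
    return len(reps)

# Chaotic r-values give clouds of points; in the periodic "windows"
# the cloud collapses back down to just a few values.
```

At r = 3.6 a slice contains a cloud of distinct values (chaos), while inside the well-known period-3 window near r = 3.83 it collapses to just three.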
There are many different maps like the logistic map that show bifurcations and chaotic behavior. In addition to the surprising mixture of order and chaos revealed in the logistic map, there is a more-deeply-hidden surprise awaiting when the bifurcation behavior of all such maps is examined. This surprise was one of the first footholds that mathematicians established in the seemingly hopeless world of chaos.
Mitchell Feigenbaum was a fixture at Los Alamos National Laboratory in the 1970s. Known for his breadth of knowledge, he was a trusted resource when a colleague needed to bounce around ideas from any number of challenging fields. One of Feigenbaum's many interests was the bifurcation behavior of different maps. Specifically, he looked at the intervals at which successive bifurcations occur. In the logistic map, we saw that bifurcations did not occur at some steady rate, but rather tended to cluster together. In other words, a system might take a long time to evolve from steady-state values to oscillating behavior, but not nearly as long to have a period-doubling bifurcation. Feigenbaum was interested in the pattern behind these bifurcations, if there was any. Because these bifurcations occur before the onset of chaos, they can be thought of as "the road to chaos" in some sense. Feigenbaum felt that if he could understand the bifurcations, he would have made an inroad into understanding chaos.
He began by looking at the intervals between bifurcations. Although he found no regularity in the intervals themselves, he found an astonishing pattern in the differences between the intervals. For example, if one bifurcation occurred at 3, and the next occurred at 3.4, and the next at 3.5, the successive differences would be 0.4 and 0.1 respectively. When he looked at the ratio of these differences, he found that they tended toward a certain irrational number, the first few digits of which are 4.669. What is remarkable is that this number is the same no matter which map one looks at, as long as it has only one parameter, as does the logistic map.
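We can check this against published numerical approximations of the first few bifurcation points of the logistic map (the r-values below are standard approximations, not figures taken from this text):

```python
# Approximate r-values of the first period-doubling bifurcations
# of the logistic map (standard published approximations).
bifurcations = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Gaps between successive bifurcations, and the ratios of those gaps.
gaps = [b - a for a, b in zip(bifurcations, bifurcations[1:])]
ratios = [g1 / g2 for g1, g2 in zip(gaps, gaps[1:])]
# The ratios head toward Feigenbaum's constant, 4.669...
```

Printing `ratios` gives roughly 4.75, 4.66, and 4.67, already closing in on 4.669.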
Feigenbaum's constant, 4.669…, can be thought of as the limiting ratio of successive bifurcation intervals in a system. It can be used to predict the onset of chaos in a system before chaos ever shows up. So, even though a chaotic system is fundamentally unpredictable, one can predict when the system will reach the chaotic state. This concept, known as universality, was an important step in the understanding of chaotic behavior.
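Here is how that prediction works in practice. Because successive gaps between bifurcations shrink by a factor of roughly 4.669, the remaining gaps form a geometric series, and summing it extrapolates the point where chaos begins. A sketch (the two bifurcation values used below are standard published approximations):

```python
DELTA = 4.669  # Feigenbaum's constant, first few digits

def predict_onset(r_prev, r_curr, delta=DELTA):
    """Extrapolate where a period-doubling cascade accumulates.
    Successive gaps shrink by a factor of delta, so the remaining gaps
    sum to a geometric series: r_inf = r_curr + (r_curr - r_prev) / (delta - 1)."""
    return r_curr + (r_curr - r_prev) / (delta - 1.0)

# Using the r-values where period 16 and period 32 first appear:
onset = predict_onset(3.564407, 3.568759)
# onset comes out very close to 3.569946, the magic number from the text.
```

From just two bifurcation points, the extrapolation recovers the onset of chaos to about six decimal places.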
Feigenbaum's work showed that the study of chaos was more than just an exercise in rationalizing our inability to predict certain phenomena. He showed that the onset of chaos itself could be predicted and thus, hopefully, better controlled. Furthermore, because of the notion of sensitive dependence, if chaos can be controlled, perhaps it can be manipulated to achieve some desirable end, instead of simply imposing a barrier to impede our ability to predict the future. In our final section, we will see how the concepts of chaos can be used to our benefit.
Next: 13.7 Fly Me to the Moon