Prologue: The Rolling of the Golden Apple

In 1776, a year in which political rebels in Philadelphia were proclaiming their independence and freedom, a physicist in Europe was proclaiming total dependence and determinism. According to Pierre-Simon Laplace, an intelligence that knew the precise state of the universe at one moment could calculate its entire future.

The Laplacian universe is just a giant pool table. If you know where the balls were, and you hit and bank them correctly, the right ball will always go into the intended pocket. Laplace's hubris in his ability (or that of his "intelligence") to forecast the future was completely consistent with the equations and point of view of classical mechanics. Laplace had not encountered nonequilibrium thermodynamics, quantum physics, or chaos.

Today some people are frightened by the very notion of chaos. (I have explored this at length in an essay devoted to chaos from a philosophical perspective; much the same is true with respect to the somewhat related mathematical notion of chaos.) But today there is no justification for a Laplacian point of view. At the beginning of this century, the mathematician Henri Poincaré, who was studying planetary motion, began to get an inkling of the basic problem.
He began to realize that "deterministic" isn't what it's often cracked up to be, even leaving aside the possibility of other, nondeterministic systems. An engineer might say to himself: "I know where a system is now. I know the location of this (planet, spaceship, automobile, fulcrum, molecule) almost precisely. Therefore I can predict its position X days in the future with a margin of error precisely related to the error in my initial observations." Yeah. Well, that's not saying much. The prediction error may explode off to infinity at an exponential rate (read the discussion of the Lyapunov exponent later in this article). The distant future? You'll know it when you see it, and that's the first time you'll have a clue. (This statement will be slightly modified when we discuss a system's attractor.)
I Meet Chaos

I first came across something called "dynamical systems"
while I was at the University of California at Berkeley. But I hadn't paid
much attention to them. I went through Berkeley very fast, and didn't have
time to screw around. But when I got to Harvard for grad school, I bought
René Thom's book Structural Stability and Morphogenesis. Right there in Thom's book was a photo of a steel ball that had been dropped into molten lead, along with the reactive splash of the molten liquid. The lead splash was a morphology, a form.

The word "morphogenesis" refers to the forms things take
when they grow: bugs grow into a particular shape, as do human organs. I
had read a number of books on general systems theory by Ervin Laszlo and
Ludwig von Bertalanffy, which discuss the concepts of morphogenesis, so I
was familiar with the basic ideas. Frequent references were made to
biologist D'Arcy Thompson's book On Growth and Form.

Anyway, moving along: in grad school I was looking at the
forms taken by asset prices, foreign exchange rates in particular. A
foreign exchange rate is the price that one fiat currency trades for
another. But I could have been looking at stock prices, interest rates, or
commodity prices—the principles are the same. Here the assumption is that
the systems generating the prices are nondeterministic (stochastic,
random)—but that doesn’t prevent there being hidden form, hidden order, in
the shape of the resulting price distributions.

Reading up on price distributions, I came across some
references to Benoit Mandelbrot. Mandelbrot, an applied mathematician, had
made a splash in economics in the early 1960s with some heretical notions
of the probabilities involved in price distributions, and had acquired as
a disciple Eugene Fama [1] at the University of Chicago. But then Fama
abandoned this heresy (for alleged empirical reasons that I find
manifestly absurd), and everyone breathed a sigh of relief and returned to
the familiar world of least squares, and price distributions that were assumed to be normal.

In economics, when you deal with prices, you first take logs, and then look at the changes between the logs of prices [2]. The changes between these log prices are what are often referred to as the (logarithmic) rates of return.

I went over to the Harvard Business School library to read Mandelbrot's early articles. The business school library was better organized than the library at the Economics Department, it had a better collection of books and journals, and it was extremely close to where I lived on the Charles River in Cambridge. In one of the articles, Mandelbrot said that the ideas therein were first presented to an economic audience in Hendrik Houthakker's international economics seminar at Harvard.

Bingo. I had taken international finance from Houthakker and went to talk to him about Mandelbrot. Houthakker had been a member of Richard Nixon's Council of Economic Advisors, and was famous for the remark: "[Nixon] had no strong interest in international economic affairs, as shown by an incident recorded on the Watergate tapes where Haldeman comes in and wants to talk about the Italian lira. His response was '[expletive deleted] the Italian lira!'"

Houthakker told me he had studied the distribution of cotton futures prices and didn't believe they had a normal distribution. He had given the same data to Mandelbrot. He told me Mandelbrot was back in the U.S. from a sojourn in France, that he had seen him a few weeks previously, and that Mandelbrot had a new book he was showing around.

I went over to the Harvard Coop (that's pronounced "coupe" as in "a two-door coupe", no French accent) and found a copy of Mandelbrot's book. Great photos! That's when I learned what a fractal was, and I ended up writing two of the three essays in my PhD thesis on fractal price distributions [3]. Fractals led me back into chaos, because maps (graphics) of chaos equations create fractal patterns.
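Before moving on, the log-price procedure mentioned above is easy to illustrate. Here is a small Python sketch of my own (not from the original article): take logs of a price series, then difference the logs to get logarithmic returns. The price numbers are invented for the example.

```python
import math

def log_returns(prices):
    """Logarithmic returns: the changes between the logs of successive prices."""
    logs = [math.log(p) for p in prices]
    return [logs[i + 1] - logs[i] for i in range(len(logs) - 1)]

# A hypothetical price series (illustrative numbers only).
prices = [100.0, 110.0, 121.0, 108.9]
for r in log_returns(prices):
    print(round(r, 6))
```

Note that two successive 10 percent rises (100 to 110, then 110 to 121) give exactly equal log returns, which is one reason economists work with logs rather than raw price changes.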

Preliminary Pictures and Poems

The easiest way to begin to explain an elephant is to first show someone a picture. You point and say, "Look. Elephant." So here's a picture of a fractal called the Sierpinski carpet [4]:
Notice that it has a solid blue square in the center, with 8 additional smaller squares around the center one.
Each of the 8 smaller squares looks just like the original square. Multiply each side of a smaller square by 3 (increasing the area by 3 x 3 = 9), and you get the original square. Or, doing the reverse, divide each side of the original large square by 3, and you end up with one of the 8 smaller squares. At a scale factor of 3, all the squares look the same (leaving aside the discarded center square). You get 8 copies of the original square at a scale factor of 3. Later we will see that this defines a fractal dimension of log 8 / log 3 = 1.8927.

Each of the smaller squares can also be divided up the same way: a center blue square surrounded by 8 even smaller squares, and so on, at ever smaller scales.

Meanwhile, without realizing it, we have just defined a fractal dimension D. Scaling the object by a factor r yields N copies of the original: N = r^D. Or, taking logs, we have D = log N / log r. The same things keep appearing when we scale by r, because the object we are dealing with has a fractal dimension of D.
Okay. So much for a preliminary look at fractals. Let’s take a preliminary look at chaos, by asking what a dynamical system is.
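As a quick numerical aside before we do: the formula D = log N / log r can be checked in a few lines. This is an illustrative Python sketch of my own, not part of the original article.

```python
import math

def fractal_dimension(n_copies, scale_factor):
    """Similarity dimension D = log N / log r: scaling the object down by a
    factor r produces N copies of itself."""
    return math.log(n_copies) / math.log(scale_factor)

# The square described above: 8 self-similar copies at scale factor 3.
print(round(fractal_dimension(8, 3), 4))  # 1.8928

# Compare an ordinary (non-fractal) filled square: 9 copies at scale factor 3.
print(round(fractal_dimension(9, 3), 4))  # 2.0
```

The filled square comes out with dimension 2, as it should; keeping only 8 of the 9 sub-squares drops the dimension to about 1.89, strictly between a line and a plane.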
Dynamical Systems

What is a dynamical system? A dynamical system is a rule that maps the current state of a system into its next state. Suppose, for example, that Johnny grows 2 inches a year. Then Johnny's height next year, x(n+1), is his height this year, x(n), plus 2 inches:

x(n+1) = x(n) + 2. See? Isn't math simple? If we plug Johnny's current height of x(n) = 38 inches into the right side of the equation, we get Johnny's height next year, x(n+1) = 40 inches:

x(n+1) = x(n) + 2 = 38 + 2 = 40. Going from the right side of the equation to the left is called an iteration. This is a deterministic dynamical system: given this year's height, next year's height is completely determined. We could instead make the system stochastic by adding a random term e(n):

x(n+1) = x(n) + 2 + e(n), where e(n) is a random shock, positive or negative. Let's return to the original deterministic equation. The
original equation, x(n+1) = x(n) + 2, is linear: each variable is multiplied only by a constant, and the terms are added together. The equation

z(n+1) = z(n) + 5 y(n) – 2 x(n) is linear, for example. But if you square a variable, or multiply two variables together, the equation becomes nonlinear. The equation

x(n+1) = x(n)^2 is nonlinear because x(n) is squared. The equation

z = xy is nonlinear because two variables, x and y, are multiplied together.

Okay. Enough of this. What is chaos? Here is a picture of
chaos. The lines show how a dynamical system (in particular, a Lorenz system) moves through time, tracing out a trajectory in its state space.
Notice also that the system keeps looping around two
general areas, as though it were drawn to them. The set of points toward which the system is drawn in this way is called the attractor of the system.
Here’s an equation whose attractor is a single point, zero:
x(n+1) = .9 x(n). No matter what value you start with for x(n), the next
value, x(n+1), is only 90 percent of that. If you keep iterating the
equation, the value of x(n+1) approaches zero. Since the attractor in this
case is only a single point, it is called a point attractor, or fixed-point attractor.

Some attractors are simple circles or odd-shaped closed loops—like a piece of string with the ends connected. These are called limit cycles.

Other attractors, like the Lorenz attractor above, are really weird. Strange. They are called strange attractors.

Okay. Now let's define chaos.
What is Chaos?

What are the characteristics of chaos? First, chaotic systems are deterministic and nonlinear, and they are sensitive to initial conditions. I am going to repeat some things I said in the previous section. Déjà vu. But a little repetition won't hurt.

Classical systems of equations from physics were typically linear, and linear equations remain the staple of economics and finance. Consider, for example, the simple linear equation

E(R) = a + b E(Rm). It says the expected return on a stock, E(R), is a linear function of the expected return on the market as a whole, E(Rm). Equations which do not have this simple form are nonlinear.

The equation z = 5 + 3x – 4y – 10z is linear, because each variable is multiplied only by a constant, and the terms are added together. If we multiply this last equation by 7, it is still linear: 7z = 35 + 21x – 28y – 70z. If we multiply it by the variable x, however, it becomes nonlinear: xz = 5x + 3x^2 – 4xy – 10xz.

The science of chaos looks for characteristic patterns
that appear in complex systems. Unless these patterns were exceedingly
simple, like a single equilibrium point ("the equilibrium price of gold is
$300"), or a simple closed or oscillatory curve (a circle or a sine wave,
for example), the patterns are referred to as Such patterns are traced out by self-organizing systems.
Names other than strange attractor may be used in different areas of
science. In psychology, for example, archetypes have been described as strange attractors [5].

The main feature of chaos is that simple deterministic
systems can generate what appears to be random behavior. Think of what
this means. On the good side, if we observe what appears to be
complicated, random behavior, perhaps it is being generated by a few
deterministic rules. And maybe we can discover what these are. Maybe life
isn't so complicated after all. On the bad side, suppose we have a simple
deterministic system. We may think we understand it; it looks so simple. But it may turn out to have
exceedingly complex properties. In any case, chaos tells us that whether a
given random-appearing behavior is at basis random or deterministic may be
undecidable. Most of us already know this. We may have used random number
generators (really pseudo-random number generators: deterministic algorithms whose output merely looks random).
I’m Sensitive—Don’t Perturb Me

Chaotic systems are highly sensitive to initial conditions. To see what this means, consider the logistic equation:

x(n+1) = 4 x(n) [1-x(n)]. The input is x(n). The output is x(n+1). The system is nonlinear, because if you multiply out the right-hand side of the equation, there is an x(n)^2 term. Suppose the input is x(n) = .75. Then the output is
4 (.75) [1- .75] = .75. That is, x(n+1) = .75. If this were an equation
describing the price behavior of a market, the market would be in
equilibrium, because today’s price (.75) would generate the same price
tomorrow. If x(n) and x(n+1) were expectations, they would be
self-fulfilling. Given today's price of x(n) = .75, tomorrow's price will
be x(n+1) = .75. The value .75 is called a fixed point of the system.

But suppose the market starts out at x(0) = .7499. The output is
4 (.7499) [1-.7499] = .7502 = x(1). Now using the previous day's output x(1) = .7502 as the next input, we get as the new output:
4 (.7502) [1-.7502] = .7496 = x(2). And so on. Going from one set of inputs to an output is
called an iteration.

Finally, we repeat the entire process, using as our first input x(0) = .74999. These results are also shown in Table 1.

Look at iteration number 20. If you started with x(0) = .75, you have x(20) = .75. But if you started with x(0) = .7499 or x(0) = .74999, the value of x(20) is far from .75: tiny differences in the starting point rapidly produce large differences in the outcome.

A meteorologist named Edward Lorenz discovered this phenomenon in 1963 at MIT [6]. He was rounding off the values in his weather-prediction equations from six decimals to three, because his printed output only had three decimals. Suddenly he realized that the entire sequence of later numbers he was getting was different. Starting from two nearby points, the trajectories diverged from each other rapidly. This implied that long-term weather prediction was impossible. He was dealing with chaotic equations.
Table 1: First One Hundred Iterations of the Equation x(n+1) = 4 x(n) [1-x(n)]
The different solution trajectories of chaotic equations
form patterns called strange attractors.

Why Chaos?

Why chaos? Does it have a physical or biological function? The answer is yes. One role of chaos is the prevention of entrainment: the locking together of the cycles of different parts of a system.
A chaotic world economic system is desirable in itself. It prevents the development of an international business cycle, in which national business cycles become harmonized and many economies go into recession at the same time. Macroeconomic policy coordination through G7 (G8, whatever) meetings, for example, risks the creation of economic entrainment, thereby making the world economy less robust in absorbing shocks.

"A chaotic system with a strange attractor can actually dissipate disturbance much more rapidly. Such systems are highly initial-condition sensitive, so it might seem that they cannot dissipate disturbance at all. But if the system possesses a strange attractor which makes all the trajectories acceptable from the functional point of view, the initial-condition sensitivity provides the most effective mechanism for dissipating disturbance" [7]. In other words, because the system is so sensitive to initial conditions, the initial conditions quickly become unimportant; it is the strange attractor itself that delivers the benefits.

Ary Goldberger of the Harvard Medical School has argued that a healthy heart is chaotic [8]. This conclusion comes from comparing electrocardiograms of normal individuals with those of heart-attack patients. The ECGs of healthy patients have complex irregularities, while those of patients about to have a heart attack show much simpler rhythms.
How Fast Do Forecasts Go Wrong?—The Lyapunov Exponent

We saw that a small change in the initial conditions of the logistic equation (Table 1) resulted in widely divergent trajectories after a few iterations. How fast these trajectories diverge is a measure of our ability to forecast.

For a few iterations, the three trajectories of Table 1 look pretty much the same. This suggests that short-term prediction may be possible. A prediction of "x(n+1) = .75", based solely on the first trajectory, starting at x(0) = .75, will serve reasonably well for the other two trajectories also, at least for the first few iterations. But, by iteration 20, the values of x(n+1) are quite different among the three trajectories. This suggests that long-term prediction is impossible.

So let's think about the short term. How short is it? How fast do trajectories diverge due to small observational errors, small shocks, or other small differences? That's what the Lyapunov exponent tells us.

Let e denote the error in our initial observation, or the difference in two initial conditions. In Table 1, it could represent the difference between .75 and .7499, or between .75 and .74999. Let R be a distance (plus or minus) around a reference trajectory, and suppose we ask the question: how quickly does a second trajectory, which includes the error e, get outside the range R?
The answer is a function of the number of steps n, and is governed by the Lyapunov exponent λ:

R = e · exp(λ n). For example, it can be shown that the Lyapunov exponent of the logistic equation is λ = log 2 = .693147 [9]. So in this instance, we have R = e · exp(.693147 n).

So, let's do a sample calculation, and compare with the results we got in Table 1.
Sample Calculation Using a Lyapunov Exponent

In Table 1 we used starting values of .75, .7499, and .74999. Suppose we ask the question: how long (at what value of n) does it take a perturbed trajectory to get outside a range R = .01 around the reference trajectory, that is, outside the interval (.74, .76)?

For the second trajectory, with a starting value of .7499, the change in the initial condition is e = .75 − .7499 = .0001. So we solve

.01 = .0001 exp (.693147 n). Solving for n, we get n = 6.64. Looking at Table 1, we see that for n = 7 (the 7th iteration), the value is x(7) = .762688, and this is the first value that has gone outside the interval (.74, .76).

Similarly, for the third trajectory, with a starting value of .74999, the change in the initial condition is e = .75 − .74999 = .00001. So we solve

.01 = .00001 exp (.693147 n). This solves to n = 9.96. Looking at Table 1, we see that for n = 10 (the 10th iteration), we have x(10) = .739691, and this is the first value outside the interval (.74, .76) for this trajectory.

In this sample calculation, the trajectories diverge because the Lyapunov exponent is positive. If the Lyapunov exponent were negative, λ < 0, then exp(λ n) would get smaller with each step, and nearby trajectories would converge rather than diverge. So λ > 0 is required for a system to be chaotic.

Note also that the particular logistic equation, x(n+1) =
4 x(n) [1-x(n)], which we used in Table 1, is a simple equation with only
one variable, namely x(n). So it has only one Lyapunov exponent. In
general, a system with n variables will have n Lyapunov exponents.

The Lyapunov exponent for an equation x(n+1) = f(x(n)) can be calculated empirically: iterate the equation, take the log of the absolute value of the derivative of f at each point along the trajectory, and average these logs. For example, the derivative of the right-hand side of the logistic equation

x(n+1) = 4 x(n)[1-x(n)] = 4 x(n) – 4 x(n)^2 is

4 – 8 x(n). Thus for the first iteration of the second trajectory in Table 1, where x(n) = .7502, we have |4 – 8 (.7502)| = 2.0016, and log 2.0016 = .6939, already close to the theoretical value λ = .693147.

Table 2: Empirical Calculation of Lyapunov Exponent from the Logistic Equation
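Both calculations above can be scripted. This Python sketch (my own illustration, not the article's code) solves R = e · exp(λ n) for the prediction horizon n, and estimates the Lyapunov exponent of the logistic equation empirically by averaging log |4 − 8 x(n)| along a trajectory, in the spirit of Table 2:

```python
import math

LAMBDA = math.log(2)  # theoretical Lyapunov exponent of the logistic equation

def horizon(R, e, lam=LAMBDA):
    """Solve R = e * exp(lam * n) for n: the number of steps before an
    initial error e grows to the size R."""
    return math.log(R / e) / lam

def empirical_lyapunov(x0, n_iter=100000, transient=100):
    """Estimate the Lyapunov exponent of x(n+1) = 4 x(n)[1 - x(n)] by
    averaging log |4 - 8 x| along a trajectory starting at x0."""
    x = x0
    for _ in range(transient):        # discard a short transient
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(4.0 - 8.0 * x))
        x = 4.0 * x * (1.0 - x)
    return total / n_iter

print(round(horizon(0.01, 0.0001), 2))    # 6.64
print(round(horizon(0.01, 0.00001), 2))   # 9.97
print(round(empirical_lyapunov(0.6), 3))  # close to log 2 = 0.693
```

The two horizons reproduce the n = 6.64 and n = 9.96 figures of the sample calculation (the second differs in the last digit only because the exact value 9.9658 is rounded rather than truncated), and the empirical average converges to the theoretical λ = log 2.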
Enough for Now

In the next part of this series, we will discuss fractals some more, which will lead directly into economics and finance. In the meantime, here are some exercises for eager students.
Finally, here is a nice fractal graphic for you to enjoy:
Notes

[1] Eugene F. Fama, "Mandelbrot and the Stable Paretian Hypothesis," Journal of Business, 36, 1963.

[2] If you really want to know why, read J. Aitchison and J.A.C. Brown, The Lognormal Distribution, Cambridge University Press, 1957.

[3] J. Orlin Grabbe, PhD thesis, Harvard University.

[4] The Sierpinski Carpet graphic and the following one, the Lorenz attractor graphic, were taken from the web site of Clint Sprott: http://sprott.physics.wisc.edu/ .

[5] Ernest Lawrence Rossi, "Archetypes as Strange Attractors."

[6] E. N. Lorenz, "Deterministic Non-periodic Flow," Journal of the Atmospheric Sciences, 20, 1963.

[7] M. Conrad, "What is the Use of Chaos?", in Arun V. Holden, ed., Chaos, 1986.

[8] Ary L. Goldberger, "Fractal Variability Versus Pathologic Periodicity: Complexity Loss and Stereotypy in Disease."

[9] Hans A. Lauwerier, "One-dimensional Iterative Maps," in Arun V. Holden, ed., Chaos, 1986.

J. Orlin Grabbe is the author of International Financial Markets, and is an internationally recognized derivatives expert. He has recently branched out into cryptology, banking security, and digital cash. His home page is located at http://www.aci.net/kalliste/homepage.html .

-30-
from The Laissez Faire City Times, Vol 3, No 22, May 31, 1999