...Examples of catastrophic and systemic changes have been gathering in a variety of fields, typically in specialized contexts with little cross-connection. Only recently have we begun to look for generic patterns in the web of linked causes and effects that puts disparate events into a common framework—a framework that operates on a sufficiently high level to include geologic climate shifts, epileptic seizures, market and fishery crashes, ....
The main themes of this framework are twofold: First, these are all complex systems of interconnected and interdependent parts. Second, they are nonlinear, non-equilibrium systems that can undergo rapid and drastic state changes.
... there is emerging agreement that ignoring the seemingly incomprehensible meshing of counterparty obligations and mutual interdependencies (an accountant’s nightmare, more recursive than Abbott and Costello’s “Who’s on first?”) prevented real pricing of risk premiums, which helped to propagate the current crisis.
A parallel situation exists in fisheries, where stocks are traditionally managed one species at a time. Alarm over collapsing fish stocks, however, is helping to create the current push for ecosystem-based ocean management. ... Though the geological record tells us that global temperatures can change very quickly, the models consistently underestimate that possibility. This is related to the next property, the nonlinear, non-equilibrium nature of systems.
Most engineered devices, consisting of mechanical springs, transistors, and the like, are built to be stable. That is, if stressed from rest, or equilibrium, they spring back. Many simple ecological models, physiological models, and even climate and economic models are built on the same assumption: a globally stable equilibrium. A related simplification is to see the world as consisting of separate parts that can be studied linearly, one piece at a time, and then summed independently to make the whole. Researchers have developed a very large tool kit of analytical methods and statistics based on this linear idea, and it has proven invaluable for studying simple engineered devices. But many of the complex systems that interest us are not linear, yet we persist with these tools and models. It is a case of looking under the lamppost because the light is better, even though we know the lost keys are in the shadows. Linear systems produce nice stationary statistics (constant risk metrics, for example). Because their statistics do not vary through time, one can subsample such a process to get an idea of what the larger universe of possibilities looks like. This characteristic of linear systems appeals to our normal heuristic thinking.
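To make the subsampling point concrete, here is a minimal sketch of my own (it is not from Sugihara's article; the AR(1) model and its parameters are purely illustrative). A stable linear process is stationary, so any sizeable slice of its history tells roughly the same story as the whole record:

    import numpy as np

    rng = np.random.default_rng(0)

    # A stable linear system: AR(1) with |phi| < 1 always relaxes back
    # toward its equilibrium, so its statistics do not drift over time.
    phi, n = 0.5, 50_000
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()

    # Because the process is stationary, a subsample gives roughly the
    # same picture of the "universe of possibilities" as the full record.
    print(f"std full={x.std():.3f}  "
          f"early={x[:5_000].std():.3f}  "
          f"late={x[-5_000:].std():.3f}")

The early, late, and full-sample standard deviations agree, which is exactly why subsampling feels safe, right up until the system is not linear.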
Nonlinear systems, however, are not so well behaved. They can appear stationary for a long while, then, without anything changing, exhibit jumps in variability: so-called "heteroscedasticity." For example, if one looks at the range of economic variables over the past decade (daily market movements, GDP changes, and so on), one might guess that variability, and the universe of possibilities, are very modest. That assumption was the modus operandi of normal risk management. As a consequence, under those stationary statistics the string of large moves we saw on so many consecutive days in 2008 should have occurred less than once in the age of the universe.
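The excerpt names the phenomenon without giving a model, so here is one standard way to produce it (a GARCH(1,1) process, my choice rather than anything Sugihara specifies): the model's parameters never change, yet variability clusters and jumps:

    import numpy as np

    rng = np.random.default_rng(1)

    # GARCH(1,1): fixed parameters, yet volatility clusters and bursts.
    omega, alpha, beta = 0.05, 0.15, 0.80
    n = 5_000
    r = np.zeros(n)                    # simulated "returns"
    var = omega / (1 - alpha - beta)   # start at the long-run variance
    for t in range(n):
        r[t] = rng.normal(scale=np.sqrt(var))
        var = omega + alpha * r[t] ** 2 + beta * var

    # Standard deviation per block: long quiet stretches punctuated by
    # bursts of variability, with nothing in the parameters changing.
    window = 250
    blocks = [r[i:i + window].std() for i in range(0, n - window, window)]
    print("rolling std per window:", np.round(blocks, 2))

Long quiet stretches make the universe of possibilities look modest; then a burst arrives that the quiet window said was nearly impossible.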
Our problem is that the scientific desire to simplify has taken over, something that Einstein warned against when he paraphrased Occam: "Everything should be made as simple as possible, but not simpler." Thinking of natural and economic systems as essentially stable and decomposable into parts is a good initial hypothesis, but current observations and measurements do not support it; hence our continual surprise. Just as we like the idea of constancy, we are stubborn about changing our minds. The 19th-century American humorist Josh Billings perhaps put it best: "It ain't what we don't know that gives us trouble, it's what we know that just ain't so."
So how do we proceed? There are a number of ways to approach this tactically, including new data-intensive techniques that model each system uniquely but look for common characteristics. However, a more strategic approach is to study these systems at their most generic level, to identify universal principles that are independent of the specific details that distinguish each system. This is the domain of complexity theory.
Among these principles is the idea that there might be universal early warning signs for critical transitions, diagnostic signals that appear near unstable tipping points of rapid change. The recent argument for early warning signs is based on the following: 1) that both simple and more realistic, complex nonlinear models show these behaviors, and 2) that there is a growing weight of empirical evidence for these common precursors in varied systems.
A key phenomenon known for decades is so-called “critical slowing” as a threshold approaches. That is, a system’s dynamic response to external perturbations becomes more sluggish near tipping points. ... Another related early signaling behavior is an increase in “spatial resonance”: Pulses occurring in neighboring parts of the web become synchronized. Nearby brain cells fire in unison minutes to hours prior to an epileptic seizure, for example, and global financial markets pulse together. ...
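A minimal sketch of critical slowing, again my own illustration rather than anything in the article, using the standard linearized picture near equilibrium: an AR(1)-style relaxation whose recovery rate k decays toward the tipping point at k = 0:

    import numpy as np

    rng = np.random.default_rng(2)

    # Critical slowing: as the recovery rate k decays toward the tipping
    # point at k = 0, perturbations die out ever more slowly, so lag-1
    # autocorrelation creeps toward 1 and variance inflates.
    dt, steps = 0.1, 2_000
    for k in (1.0, 0.5, 0.2, 0.05):
        x = np.zeros(steps)
        for t in range(1, steps):
            # linearized dynamics near equilibrium, plus small shocks
            x[t] = x[t - 1] - k * x[t - 1] * dt + 0.1 * rng.normal()
        ac1 = np.corrcoef(x[:-1], x[1:])[0, 1]
        print(f"k={k:<4}  lag-1 autocorr={ac1:.3f}  var={x.var():.3f}")

As k shrinks, the printed lag-1 autocorrelation approaches 1 and the variance grows: the sluggish response to perturbations that these early warning diagnostics look for.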
My point in the above excerpts was not to convey the cogency of his arguments but to give a sense of his approach, so click on the hyperlink above and read the whole article.
What you will not find in the article (though he verges on it at moments) is how gravely he UNDERESTIMATES the complexity of what he is trying to analyze. Notice some of those big words; he may be waving them around like a torch in a tribal dance. But for cutting-edge science, this thinker (George Sugihara, a theoretical biologist and the McQuown Chair in Natural Science at Scripps) has done an outstanding job.