"Die-off and, in its final form, die-out, is a phenomenon common in the history of zoology and botany, and the dodo and the passenger pigeon are not exceptional. There is, for example, the everyday but suggestive experience of yeast cells introduced into a wine vat. Enormously successful as a species, they gobble up nutrients from the sugary crushed grapes around them and expand their population without a thought to the consequences of drawdown; within weeks, however, the 'pollution' they produce—alcohol and carbon dioxide, which of course is what the fermentation is all about—has so filled their environment that they are unable to survive. The resulting crash, in that vat at least, means an acute die-off and then extinction.
"Where along this ecological trajectory can we locate the modern—the theoretically sapient—human?" [Sale, 1991, p. 26]
WHERE WILL IT END?
by Jay Hanson
Three groups of scientists, using different methodologies and data sets, have found that we may have as little as 35 years before our socio-economic systems destroy our life-support system. Moreover, global oil production could peak in as little as four years! Yet our political leaders are calling for even more of the same: "With the . . . plan for economic growth, our economy will achieve its full potential with 3.5 percent—or higher—growth per year, putting our country back on the right track and giving every American family the chance to achieve the American Dream." Does this sound rational to you?
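The arithmetic behind that promise deserves a check. Here is a minimal sketch (in Python; the 3.5 percent figure comes from the quotation above, the rest is standard compound-growth arithmetic) of how quickly such growth doubles an economy:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity to double at a fixed compound growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# At the promised 3.5 percent per year, the economy doubles roughly
# every 20 years, and doubles again every 20 years after that.
years = doubling_time(0.035)
print(round(years, 1))  # roughly 20 years
```

If material and energy throughput grows along with the economy, the question "Where will it end?" is asking when the doublings must stop.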
Plato, the most celebrated, honored and revered of all the Western philosophers, was born in Athens shortly after its Golden Age, which lasted from 445 to 431 B.C. His writings mark the first use of "reason" to decide moral questions.
From Plato to our present society, we can trace the ascendance of reason (as an organizing principle) through the ideas of Aristotle, Bacon, Descartes, and especially the English philosopher John Locke. In 1664, Locke argued that there is a natural law governing humans and that it can be known by human reason: "And reason . . . teaches all mankind who will but consult it, that being all equal and independent, no one ought to harm another in his life, health, liberty or possessions."
Locke's powerful call for reason justified the English Revolution of 1688 that ended the absolutism of the British monarchy and gave power to the people. Moreover, his words were pivotal in our own Declaration of Independence and Revolution of 1776, and the French Revolution of 1789.
In 1776, reason was placed at the heart of our present economic system by Adam Smith. Smith said that laissez-faire ("leave alone") economics would allow rational, self-interested individuals to raise the wealth of the working class automatically, as if by an "Invisible Hand."
Smith claimed that if people concentrated on rational self-interest, certain innate human attributes would regulate economic activity: "Every man, as long as he does not violate the laws of justice, is left perfectly free to pursue his own interests in his own way, and to bring both his industry and capital into competition with those of any other man, or order of men." But what if Smith's self-interest was not as rational as he thought it was? What then?
In fact, Locke and Smith were wrong—people are not "rational". For example, they routinely fail to make inferences according to Bayes' Theorem, which is a formula used to calculate the probability that a particular event will occur given the evidence at hand. People give recently presented information undue importance [Hamm; also see Forrester], thereby producing answers that are irrational (out of sight, out of mind).
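Bayes' Theorem itself takes only a few lines to state. The sketch below (Python; the numbers are hypothetical, chosen purely for illustration) computes the posterior probability that people routinely fail to compute when they overweight the most recent evidence:

```python
def bayes_posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' Theorem: combine the base rate (prior) with evidence."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Hypothetical example: a condition with a 1 percent base rate and evidence
# that is 90 percent reliable either way.  Intuition says a positive result
# means "probably yes"; the theorem says the probability is still under 10%.
posterior = bayes_posterior(prior=0.01,
                            p_evidence_given_h=0.90,
                            p_evidence_given_not_h=0.10)
print(round(posterior, 3))  # -> 0.083
```

The counterintuitive result (a positive piece of evidence still leaves the probability low) comes from the base rate, exactly the kind of background information that recency-biased judgment throws away.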
Can irrational people sustain Democracy in a world that is now far over carrying capacity? In AN INQUIRY INTO THE HUMAN PROSPECT, Robert Heilbroner considered the damage that our economy is inflicting upon our life-support system and projected continuing (but gradually slowing) economic growth until approximately the year 2005. At that time, he sees the need for highly authoritarian governments to control the transition to worldwide economic decline [p. 2, Daly & Cobb, 1989].
So on the eve of our self-destruction, we find ourselves with socio-economic systems designed for a model of humanity that has never existed—led by people whose "mental system is failing to comprehend the modern world." Where will it end?
"In the end," says the Grand Inquisitor in Dostoevsky's parable, "in the end they will lay their freedom at our feet and say to us, 'Make us your slaves, but feed us.'" . . . and then the meek shall inherit the Earth . . .
Judge for Thyself Who is Right
by Jay Hanson
Dostoevsky's parable is set in sixteenth-century Seville—at the height of the Inquisition. On the day after a magnificent bonfire, in which nearly one hundred heretics were burned alive, Jesus descends and is immediately recognized. The cardinal—the Grand Inquisitor—has Him promptly arrested and thrown in prison. That evening, the door of Jesus' cell opens and the old, ascetic Inquisitor enters to confront Him. For a few minutes there is silence, then the Inquisitor delivers the most profound and terrible attack against Christianity.
The Inquisitor charges Jesus with betrayal of mankind, for deliberately rejecting the only ways in which men might have been happy. This singular moment occurred when "the wise and dread spirit, the spirit of self-destruction and non-existence," tempted Jesus in the wilderness by asking Him three questions.
First, the spirit asked Jesus to turn stones into bread. Jesus refused because He wanted mankind free, and what would obedience be worth if it were bought with bread? Thus, He denied men their deepest craving—to find someone who would take away the awesome burden of freedom.
Then, the spirit asked Jesus to throw Himself from the pinnacle of the temple, "for it is written: the angels shall hold Him up lest He fall." Again Jesus refused, rejecting miracles because He wanted faith given freely. But the Inquisitor explains that man cannot live without miracles, for if he is deprived of them, he immediately creates new ones. Man is weaker and baser by nature than Jesus thought. "By showing him so much respect, Thou didst … cease to feel for him…."
Jesus' last temptation was to rule the world, to unite all mankind "in one unanimous and harmonious ant-heap, for the craving for universal unity is the third and last anguish of men…." He refused once again, and thereby rejected the only ways in which men might have been made happy.
The Inquisitor explains "We are not working with Thee but with him [the spirit]…. We have taken the sword of Caesar, and in taking it, of course, have rejected Thee and followed him. Oh, ages are yet to come of the confusion of free thought, of their science and cannibalism…. [But] we have corrected Thy work and have founded it upon miracle, mystery and authority. And men rejoiced that they were again led like sheep, and that the terrible gift that had brought them such suffering, was, at last, lifted from their hearts…. And all will be happy, all the millions of creatures except the hundred thousand who rule over them. For only we, who guard the mystery, shall be unhappy…. Peacefully they will die, peacefully they will expire in Thy name, and beyond the grave they will find nothing but death."
"And we alone shall feed them…." the Inquisitor continues, "Oh, never, never can they feed themselves without us! No science will give them bread so long as they remain free. In the end they will lay their freedom at our feet, and say to us, 'Make us your slaves, but feed us.'"
In concluding his terrible accusation, the Inquisitor tells Jesus that He will not be allowed to bring unhappiness to mankind a second time, that Jesus Himself will burn at the stake tomorrow!
To all of this Jesus has listened in silence, then He suddenly approaches the old man and softly kisses him on his thin, bloodless lips. The Grand Inquisitor shudders, goes to the door and opens it: "Go, and come no more . . . come not at all, never, never!" And the prisoner goes out into the night.
The mind is a squadron of simpletons. It is not unified, it is not rational, it is not well designed—or designed at all. It just happened, an accumulation of innovations of the organisms that lived before us. The mind evolved, through countless animals and through countless worlds.
Like the rest of biological evolution, the human mind is a collage of adaptations (the propensity to do the right thing) to different situations. Our thought is a pack of fixed routines—simpletons. We need them. It is vital to find the right food at the right time, to mate well, to generate children, to avoid marauders, to respond to emergency quickly. Mental routines to do so have evolved over millions of years and developed in different periods in our evolution, as Rumi noted.
We don't think of ourselves as being of such humble origins. The triumphs that have occurred in the short time since the Industrial Revolution have completely distorted our view of ourselves. Hence, the celebrated triumph of humanity is its rationality: the ability to reason through events and act logically, to organize business, to plan for the future, to create science and technology. One influential philosopher, Daniel Dennett, wrote recently: "When a person falls short of perfect rationality … there is no coherent … description of the person's mental states."
Yet to characterize the mind as primarily rational is an injustice; it sells us short, it makes us misunderstand ourselves, it has perverted our understanding of our intelligence, our schooling, our physical and mental health. Holding up rationality, and its remorseless deliberation, as the model of the mind has, more important, set us along the wrong road to our future. Instead of the pinnacle, rationality is just one small ability in a compound of possibilities.
The mind evolved great breadth, but it is shallow, for it performs quick and dirty sketches of the world. This rough-and-ready perception of reality enabled our ancestors to survive better. The mind did not evolve to know the world or to know ourselves. Simply speaking, there has never been, nor will there ever be, enough time to be truly rational.
Rationality is one component of the mind, but it is used rarely, and in a very limited area. Rationality is impossible anyway. There isn't time for the mind to go through the luxurious exercises of examining alternatives. Consider the standard way of examining evidence, the truth table, a checklist of information about whether propositions are correct or not. To know whether Aristotle is a hamburger, you would look up "Aristotle" or "hamburger" in this table. Now think of the number of issues you immediately know well—what Yugoslavia is, whether skateboards are used at formal dinners, how chicken sandwiches should taste, what your spouse wore this morning—and you will see that your own truth table, if entered randomly, would have millions of entries just waiting! [pp. 2-3]
A mind built up with countless specific adaptations can never be rational. We piece together the results of a small set of probes to judge the world, picking up a few signals and making quick assessments of what is outside, in the case of marauders, and inside, in the case of memories and dreams. Such a mind will never be rational; but it will always try to adapt. And it cannot always be correct either. If we consider a mind that has evolved to meet most situations adequately, say 95 percent of them, we may have a better idea of what being correct is. [p. 221]
Since the mind evolved to select a few signals and then dream up a semblance, whatever enters our consciousness is overemphasized. It does not matter how the information enters, whether via a television program, a newspaper story, a friend's conversation, a strong emotional reaction, a memory—all is overemphasized. We ignore other, more compelling evidence, overemphasizing and overgeneralizing from the information close at hand to produce a rough-and-ready reality. [p. 258]
The [mental] system we recruited had the primary aim of reacting quickly to immediate danger—those who did lived long enough to produce us. Those who acted more thoughtfully and with due deliberation of the proper course, who could avoid panic when confronted by mild threats—who acted rationally, that is—probably lived shorter, and thus less generative, lives. The survival argument against rationality in primeval conditions is that payoff is very lopsided: Fail to respond to a real danger, even if that danger would kill you only 1/10,000 as often, and you will be dead. A few years later, you will be deader in evolutionary terms, for fewer of your genes will be around. However, an overreaction to danger produces only a little hysteria, a little stress, and maybe a little embarrassment—probably little or no loss of reproductive ability. Maybe the excitement would even recruit a little more reproductive effort!
Running from every snake or tiger or loud noise probably doesn't disrupt life too much. Not running, while it might kill you only slightly more often, can eventually produce major changes in the population. The same numbers hold in this example as for the height difference cited earlier. If panic in response to a threat in all cases improved survival by even 1/10,000, those who panicked would be 484 million times more populous than those who did not. And so it was good to respond emotionally and quickly to the average dangers threatening most of our ancestors. Rationality is a great idea and ideal, but we never had the time for it; we don't have time for it now, and thus we don't have the mind for it. [p. 262]
THE EVOLUTION OF CONSCIOUSNESS, Robert Ornstein; Prentice Hall, 1991, ISBN 0-13-587569-2
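Ornstein's 484-million figure is compound interest applied to reproduction. A hedged sketch (Python; the generation count here is inferred backward from his ratio, since the "height difference" passage he refers to is not reproduced above):

```python
import math

def generations_for_ratio(advantage_per_generation, target_ratio):
    """Generations needed for a small per-generation survival advantage
    to compound into a given population ratio: (1 + s) ** n = ratio."""
    return math.log(target_ratio) / math.log(1 + advantage_per_generation)

# A 1-in-10,000 edge must compound for roughly 200,000 generations
# to yield a 484-million-fold difference in descendants.
n = generations_for_ratio(1e-4, 484e6)
print(round(n))
```

On evolutionary timescales, in other words, even tiny survival advantages dominate completely, which is Ornstein's point about why quick panic beat slow deliberation.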
NEW WORLD NEW MIND
Robert Ornstein and Paul Ehrlich
IT ALL SEEMS to be happening at once. A small group of terrorists murder a few Americans far away—and fear of getting murdered changes the traveling habits of millions. But Americans continue to slaughter more people each day with handguns than all the people the terrorists have killed up to the writing of this book. No one does anything about it.
People swamp AIDS testing centers, desperate and anxious to know if they are carrying the virus. If they have it, it will likely kill them. Can society even care for AIDS victims?
Meanwhile populations explode, stockpiles of nuclear weapons grow, budget deficits mount, our education becomes more and more obsolete, and the environment—on which our very existence depends—deteriorates. But most people's attention is fixed upon eye-catching "images," such as the taking of the Iran hostages, horrible murders, airplane crashes, changes in stock prices, and football scores. Cancer terrifies us, yet we keep on smoking. Oliver North testifies that he lied—yet his good looks and smooth talk lead many people to propose that he run for President.
And the President operates the same way. Ronald Reagan, by his own admission, perverted an important U.S. global policy because his mind was similarly fixed on another set of hostages. He said, "I let my preoccupation with the hostages intrude into areas where it didn't belong. The image, the reality of Americans in chains, deprived of their freedom and families so far from home, burdened my thoughts. And this was a mistake." [italics ours]
Why does the growing budget deficit attract relatively little attention while the comparatively meaningless stock market "crash" makes headlines? Why do many popular writers yearn for a return to an education suitable for Oxford men before World War I, when the world has changed in critical ways to a greater extent since World War II than it changed between the birth of Christ and that war? Why do the numbers of nuclear weapons expand astronomically but largely unheralded, while a small girl trapped in a well commands the front pages? Why do we collectively spend billions on medical care while neglecting the simple preventative actions that, if we took them, would save many times the lives?
We believe it is no accident.
All these things are happening now, and are happening all at once, in part because the human mental system is failing to comprehend the modern world. So events will, in our opinion, continue to be out of control until people realize how selectively the environment impresses the human mind and how our comprehension is determined by the biological and cultural history of humanity. These unnoticed yet fundamental connections to our past, and how we can retrain ourselves for a "new world" of the future, one filled with unprecedented threats, are what this book is all about. [pp. 1-3]
Let's look more clearly at the nature of caricatures of thought.
Suppose you have decided that your new car will be an Italian "Provolone." You have read the automobile magazines; kept track of frequency-of-repair, recall, and resale-value statistics; and looked up the latest Consumer Reports, which cites statistics that the Provolone is a safe, reliable, good-handling, and adequately powered car. On these well-researched facts, you are on your way out the door to the Provolone dealership when your next-door neighbor drops by. He tells you in vivid detail about his new Provolone, saying, "It's a lemon!" You immediately change your mind about the Provolone and buy another car instead. Have you made a reasonable decision? Your decision not to buy the car is clearly not based on the real evidence. You make an immediate vivid caricature of reality and accept it in contrast to other reports with data from a large sample (the frequency-of-repair and other statistics).
This caricature of mind violates any reasonable approach to decision making. Since one default is to focus on the small world around us, we ignore the fact that large amounts of information summarized in statistics are more reliable than single personal experiences. The neighbor's story is the most recent and most available one in memory, and it is more emphatic than a published report, but this is not a good reason to give it more weight.
We often encounter problems where we must make decisions under conditions of uncertainty. We do not have full information, and there may be no single, correct answer, but the information that we have is probabilistic. How good are we at making such decisions? Not very, because the same kind of illusions that fool the visual system also fool the judgment system.
Psychologists Daniel Kahneman and Amos Tversky were among the first to study these "cognitive illusions" that demonstrate how easily we are misled. These effects can be particularly pronounced when decisions involve serious risks. One of their problems is this:
Imagine that the government is preparing for an outbreak of a rare disease that is expected to kill six hundred people. Two programs are available. If Program A is adopted, then two hundred people will be saved. If Program B is adopted, then there is a one-third chance that six hundred people will be saved, and a two-thirds chance that nobody will be saved. Which program should be adopted?
When the issue is framed this way, most people prefer Program A over Program B. They want to avoid the two-thirds risk that nobody will be saved. Now consider a similar problem involving the same disease and the same expectation that if nothing is done, six hundred people will die:
Two programs are available. If Program C is adopted, then four hundred people will die. If Program D is adopted, then there is a one-third chance that nobody will die, and a two-thirds chance that six hundred people will die.
When the issue is framed this way, most people prefer Program D over Program C, avoiding the certain consequence that four hundred people will die. This might seem reasonable to you until you realize that Programs A and C have the same outcomes. In both, of the six hundred people at risk, two hundred will live and four hundred will die. Programs B and D are also precisely the same; a one-third chance of six hundred people being saved is the same as a one-third chance of nobody dying. Illusions originate from just the wording of problems!
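The equivalence of the four programs is easy to verify. A minimal sketch (Python) of the expected outcomes in the Kahneman-Tversky disease problem:

```python
def expected_saved(outcomes):
    """Expected survivors; outcomes is a list of (probability, number_saved)."""
    return sum(p * saved for p, saved in outcomes)

AT_RISK = 600
program_a = expected_saved([(1.0, 200)])                        # 200 saved for sure
program_b = expected_saved([(1/3, 600), (2/3, 0)])              # gamble on saving all
program_c = expected_saved([(1.0, AT_RISK - 400)])              # 400 die for sure
program_d = expected_saved([(1/3, AT_RISK - 0), (2/3, AT_RISK - 600)])

# All four programs have the same expected outcome: 200 live, 400 die.
print(program_a, program_b, program_c, program_d)
```

Every framing is the same gamble; only the wording differs, and the wording alone flips most people's preference.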
Decisions are often based on our beliefs about the relative likelihood of things happening—the price of real estate next year, the availability of jobs for engineers in the year you expect to graduate, whether the romance of the moment will last through marriage, and so on. People estimate the likelihood of such things by relying on oversimplified caricatures of reality.
Every student needs to be taught how the mind cuts corners to make decisions. These "shortcuts" probably result in more efficient decision making overall, but they also lead to systematic caricatures that prevent us from being objective in certain kinds of judgments. Knowing about these common biases may help students keep the biases from distorting their judgments.
They should learn that when people are asked to judge the relative frequency of different causes of death, they overestimate the frequency of well-publicized causes such as homicide, tornadoes, and cancer, and they underestimate the frequency of such less exceptional causes as diabetes, asthma, and emphysema. And they should know that this tendency causes disproportionate funds to go to the dramatic highly visible causes and relative neglect of the search for solutions to chronic problems.
In an important experiment that should be demonstrated in the curriculum, Tversky and Kahneman read to students lists of names of well-known people of both sexes. In each list the people of one sex were more famous than those of the other sex. When the subjects were asked to estimate the proportion of men and women on the lists, they overestimated the proportion of the sex with more famous people on the list. For example, if the list contained very famous women (such as Elizabeth Taylor) and only moderately well-known men (such as Alan Ladd), then subjects overestimated the proportion of women on the list. In both these cases, people's judgments were biased by how easily they could recall specific examples.
Students can learn how easily they caricature other people. Kahneman and Tversky told subjects to read the following passage. "This description of Tom W. was written by a psychologist when Tom was in his senior year of high school":
Tom W. is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to have little feel and little sympathy for other people and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense.
The subjects were then told that Tom W. is now a graduate student, and were asked to rank the following categories in order of the likelihood that each is Tom's area of graduate specialization:
- Business administration
- Computer science
- Humanities and education
- Law
- Library science
- Medicine and physical and life sciences
- Social science and social work
Most students probably will choose computer science or engineering as Tom's most likely area of specialization, as did the subjects in the experiment, who thought that humanities and education and social science and social work were least likely. The character description probably fits the caricature of what "typical" computer science or engineering students are like. Using this kind of caricature to simplify judgments leads you to think these are likely categories for his field of study. But there are many more graduate students in humanities, education, social science, and social work than there are in computer science or engineering. Even people who know this fact, and who have very little faith in the predictive value of character sketches, disregard such proportions in making their predictions. Everybody, including experts, is strongly influenced by his or her caricatures.
An imaginary coin is all that is needed to teach students that caricaturing things on the basis of their representativeness is also responsible for another common type of error. Ask them which of the following sequences would be more likely to occur if they tossed a coin six times in a row, A or B:
A: Head, Head, Head, Tail, Tail, Tail.
B: Head, Tail, Tail, Head, Tail, Head.
Most will choose B because it looks more like our caricature of a random sequence than does A. They can easily be shown, however, that A and B are equally likely. With any given proportion of heads and tails, any one sequence is just as likely as any other.
This caricature is partly responsible for another common error. Suppose that you and a friend are tossing coins and you can bet on each coin toss. Assume that the coin is fair, and so over the long run will come down 50 percent heads, 50 percent tails. During one run of tosses, the coin has come down heads twenty times in a row. What's your bet on the twenty-first toss: Is it more likely to come down heads or tails? Most people feel strongly that tails are far more likely because of the "law of averages": Because the chances of heads and tails in the long run are equal, we expect a shift after a long run so that things will balance out. But the long-run averages have nothing to do with any one particular event. Even though heads came up twenty times in a row, the twenty-first toss is still an independent event, with a fifty-fifty chance of being heads or tails. This error is known as the gambler's fallacy: the belief that a forthcoming event will turn out a particular way because it's "past due."
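Both coin-toss errors can be checked directly. A minimal sketch (Python; the streak is shortened from twenty heads to three so a simulation can find enough streaks to count):

```python
import random

# Any specific sequence of six fair tosses has probability (1/2)**6 = 1/64,
# whether it "looks random" (HTTHTH) or not (HHHTTT).
p_sequence = 0.5 ** 6

# Test the gambler's fallacy: after a run of three heads, is the next toss
# biased toward tails?  Independence says no.
random.seed(0)
tosses = [random.choice("HT") for _ in range(100_000)]
after_streak = [tosses[i + 3] for i in range(len(tosses) - 3)
                if tosses[i:i + 3] == ["H", "H", "H"]]
freq_heads = after_streak.count("H") / len(after_streak)
print(p_sequence, round(freq_heads, 2))  # next-toss frequency stays near 0.50
```

The coin has no memory: the simulated frequency of heads after a streak hovers at one half, exactly as on any other toss.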
Teachers can demonstrate another critical feature of the mind's caricatures—insensitivity to sample size. Students can be presented with a problem like this one:
A town has two hospitals, a large one and a small one. About fifty babies are born every day in the large hospital, and ten babies every day in the small hospital. As everyone knows, about 50 percent of all babies born are boys, but of course the exact percentage will fluctuate from day to day, sometimes being higher than 50 percent, sometimes lower. In one particular year both hospitals kept a record of the days on which more than 60 percent of the babies born were boys. Which hospital was more likely to record such days?
About half of the college students who were actually given this problem by Tversky and Kahneman judged that the likelihood was the same for both hospitals, while the other half split evenly between choosing the larger and the smaller hospitals. The correct judgment is that the smaller hospital is much more likely to have such deviant days. This becomes explicit in the extreme case of a really small hospital that records just one birth a day. If half the babies born in a year are boys, then on half of the days of the year this hospital would record 100 percent boys! If a hospital had two babies born a day, then (assuming the odds on each birth were exactly fifty-fifty) on one quarter of the days of a year it would record 100 percent boys (on half of the days of the year, on average, one boy and one girl would be born; on the other half, two boys or two girls). [pp. 209-215]
NEW WORLD NEW MIND, Robert Ornstein and Paul Ehrlich; Touchstone, 1989; ISBN 0-671-69606-8
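The hospital problem above yields to a short exact calculation with the binomial distribution. A sketch (Python; "more than 60 percent" is read strictly, so a 6-of-10 day does not count):

```python
from math import comb

def prob_more_than_60_percent_boys(n):
    """Exact P(boys make up more than 60% of n births), fair odds each birth."""
    favorable = sum(comb(n, k) for k in range(n + 1) if 10 * k > 6 * n)
    return favorable / 2 ** n

p_small = prob_more_than_60_percent_boys(10)   # about 0.17
p_large = prob_more_than_60_percent_boys(50)   # about 0.06
print(round(p_small, 3), round(p_large, 3))
```

The small hospital records such a day roughly three times as often: deviant proportions are common in small samples, which is precisely the sample-size insensitivity the experiment exposes.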
See also: THE MORAL ANIMAL
Berger & Luckmann, 1966: THE SOCIAL CONSTRUCTION OF REALITY; Anchor Books, ISBN 0-385-05898-5
"Reification is the apprehension of human phenomena as if they were things, that is, in non-human or possibly supra-human terms. Another way of saying this is that reification is the apprehension of the products of human activity as if they were something else than human products—such as facts of nature, results of cosmic laws, or manifestations of divine will. Reification implies that man is capable of forgetting his own authorship of the human world, and further, that the dialectic between man, the producer, and his products is lost to consciousness. The reified world is, by definition, a dehumanized world. It is experienced by man as a strange facticity, an opus alienum over which he has no control rather than as the opus proprium of his own productive activity." [p. 89]
Miller, 1985: POPPER SELECTIONS: The Problem With Induction, Princeton Paperbacks; ISBN 0-691-02031-0
Sir Karl Popper wrote: "Hume's negative result establishes for good that all our universal laws or theories remain forever guesses, conjectures, hypotheses." [p. 111]
Murphy, 1994: RATIONALITY AND NATURE, Westview Press; ISBN 0-8133-2169-7
Nature's Equilibrium Disrupted by Human Rationality
"Although humans believe that they are rationalizing nature—ordering it according to human ends—they may be destroying more order than they construct. In their attempt to rationalize nature, humans upset the order nature has achieved, thereby creating disorder. There may be a social equivalent to the second law of thermodynamics: the disorder (entropy) of our planet is increasing as a result of human action. Humans have created local pockets of order, but their actions have had global disordering effects. Coal has powered an ordered industrial infrastructure, but the strip-mining to acquire it has destroyed vegetation, and the factories using it are disordering nature's atmospheric equilibrium. Rivers were diverted in the Soviet Union to provide irrigation for a rational production of rice, and the consequence was the destruction of the Aral Sea. Humans mine Uranium-235 in order to produce energy, thereby contaminating the planet with intensely radioactive waste, whereas in the natural environment the 99.3 percent Uranium-238 in Uranium ore blocks the fission reaction of the 0.7 percent Uranium-235 from starting (Commoner 1976: 85-6). Ballast water taken on by cargo ships in one ecosystem and released in another has introduced species (e.g. mussels in the Great Lakes) having no predators or reproductive controls. Economically rational shipping is breaking down nature's bioregional equilibrium of local ecosystems, thereby turning the planet into one big ecosystem, with ecologically (and perhaps long-term economically) irrational results.
"The notion of transforming planet earth into 'spaceship Earth' implies the replacement of the regulatory mechanisms of nature by those of human planning. Humans have come to believe that progress consists of their socially constructed projects taking over the functions previously undertaken by nature. But it is unclear how this constitutes progress rather than merely an expression of human vanity. Ecologists warn that human planning may well be inferior to nature's regulation. In his discussion of the dangers of nuclear power and the advantages of using solar energy, Commoner (1976: 125) reminds us that "the sun is a huge, essentially eternal nuclear reactor, assembled by the play of cosmic forces rather than by the hand of man." Better, concludes Commoner, to use the reactor perfected by nature than to construct our own. Nature has established over time through photosynthesis and metabolic oxidation a finely-tuned equilibrium of twenty percent oxygen in the atmosphere that sustains oxygen-using organisms (Commoner 1976: 42). Less than this amount and present oxygen-using organisms could not survive. More and the planet would be ravaged by terrible fires. There is as yet no indication that human reason could achieve an atmospheric equilibrium on such a planetary scale.
"By destroying the capacity of nature to maintain the human-sustaining natural environment, humans would be forced to do it themselves in place of nature, a laborious, probably impossible, task. As the World Commission on Environment and Development (1987: 32-3) concluded, nature is 'fragile and finely balanced. There are thresholds that cannot be crossed without endangering the basic integrity of the system.' Ecologists have demonstrated that nature has its own requirements, and if humans attempt to reshape nature as if it were plastic, they do so at their peril." [pp. 18-19]
Gleick, 1987: CHAOS, Penguin; ISBN 0-14-009250-1; Phone: 800-332-4624; FAX: 212-366-2666
"Watch two bits of foam flowing side by side at the bottom of a waterfall. What can you guess about how close they were at the top? Nothing. As far as standard physics was concerned, God might just as well have taken all those water molecules under the table and shuffled them personally. Traditionally, when physicists saw complex results, they looked for complex causes. When they saw a random relationship between what goes into a system and what comes out, they assumed that they would have to build randomness into any realistic theory, by artificially adding noise or error. The modern study of chaos began with the creeping realization in the 1960s that quite simple mathematical equations could model systems every bit as violent as a waterfall. Tiny differences in input could quickly become overwhelming differences in output—a phenomenon given the name 'sensitive dependence on initial conditions.' In weather, for example, this translates into what is only half-jokingly known as the Butterfly Effect—the notion that a butterfly stirring the air today in Peking can transform storm systems next month in New York." [p. 8]
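Gleick's "quite simple mathematical equations" can be shown in a few lines. A minimal sketch (Python) using the logistic map, a one-line textbook example of chaos discussed at length in Gleick's book, to demonstrate sensitive dependence on initial conditions:

```python
def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the logistic map x -> r * x * (1 - x), a simple chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = logistic_trajectory(0.200000000, 60)
b = logistic_trajectory(0.200000001, 60)

# ...diverge to a macroscopic gap within a few dozen iterations.
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(max_gap)
```

The invisible difference in input is doubled, on average, at every step, so after a few dozen iterations the two trajectories bear no resemblance to each other: the Butterfly Effect in miniature.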
Catton, 1982: OVERSHOOT, University of Illinois Press, 800-545-4703, Fax 217-244-8082; ISBN 0-252-00988-6
"It was thus becoming apparent that nature must, in the not far distant future, institute bankruptcy proceedings against industrial civilization, and perhaps against the standing crop of human flesh, just as nature had done many times to other detritus-consuming species following their exuberant expansion in response to the savings deposits their ecosystems had accumulated before they got the opportunity to begin the drawdown… Having become a species of superdetritovores, mankind was destined not merely for succession, but for crash." [p. 172]