The World Is Falling Apart. Blame the Flukes.

What chaos theory has to teach us about human events

The 21st century has been defined by unexpected shocks—major upheavals that have upended the world many of us have known and made our lives feel like the playthings of chaos. Every few years comes a black swan–style event: September 11, the financial crisis, the Arab Spring, Brexit, the election of Donald Trump, the coronavirus pandemic, wars in Ukraine and Gaza. Even daily life can feel like a roll of the dice: With regularity, some Americans go to school, the grocery store, church, a concert, or the movies and get gunned down in a random act of mass murder.

Many of these events were triggered by flukes: small, chance happenings that were arbitrary, even random, and could easily have turned out otherwise. The Arab Spring started after one vegetable vendor in central Tunisia set himself on fire, sparking a conflagration that toppled tyrants and set the region ablaze. Trump may have decided to run for president after Barack Obama humiliated him with a joke at the White House Correspondents’ Dinner in 2011. And whatever the origin story of COVID-19, a single virus, infecting a single individual in Wuhan, China, jumbled the lives of billions of people—for years. One fluke can change everything, everywhere, all at once.

The world feels like it’s falling apart—faster and more unexpectedly than ever before. The frenetic uncertainty of modern life requires new words, such as doomscrolling, to describe the passive, addictive consumption of bad news about a seemingly never-ending supply of calamity. The pace of shocks seems to be accelerating. Economists, politicians, pundits, and political scientists offer few explanations and seem just as walloped as everyone else. Understanding why this is happening—and what to do about it—calls for a combination of science and social science, drawing lessons from chaos theory, evolutionary biology, and physics.

Edward Lorenz was a weatherman during World War II, tasked with forecasting cloud cover before American bombing raids in the Pacific. But meteorology in those days was largely guesswork and produced only crude predictions. After the war ended, Lorenz decided to try to unlock the secrets of the weather using more sophisticated methods and harnessing the nascent power of computing. He created a simplified, miniature world on his LGP-30 computer: Instead of the millions of different variables that affect weather systems in the real world, his model had just 12 variables.

One day, Lorenz decided to rerun a simulation he’d done earlier. To save time, he decided to start midway through, plugging in the data points from the prior snapshot. He figured that so long as he set the variables at the same levels, the weather patterns would be repeated just as they were before: same conditions, same outcomes.

But something strange happened instead. The weather in his rerun simulation was different in every way. After a lot of scowling over the data, Lorenz realized what had happened. His computer printouts had rounded data to three decimal places. If, for example, the exact wind speed was 3.506127 miles an hour, the printout displayed it as 3.506 miles an hour. When he plugged the slightly truncated values from the printouts back into the simulation, he was always off by a tiny amount (in this case, just 0.000127 miles an hour). These seemingly meaningless alterations—these tiny rounding errors—were producing major changes.

That observation led Lorenz to a breakthrough discovery. Minuscule changes could make enormous differences: Raising the temperature one-millionth of a degree could morph the weather two months later from clear blue skies into a torrential downpour, even a hurricane. Lorenz’s findings were the origin of the “butterfly effect” concept—the notion that a butterfly flapping its wings in Brazil could trigger a tornado in Texas—and, ultimately, of chaos theory. They also explain why meteorologists are still unable to forecast the weather beyond a short time frame with much accuracy; if any calculation is off by a tiny amount, the longer-term forecast will be useless.
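For readers who want to see the mechanism rather than take it on faith, the divergence is easy to reproduce. The sketch below is not Lorenz's 12-variable weather model; it uses the simpler three-equation system he published later, stepped forward with a crude Euler integrator in Python, and the starting values and step size are illustrative choices, not his. The two runs begin with values that differ only past the third decimal place, the same kind of truncation his printouts introduced, and after a few thousand steps they have nothing in common.

```python
# A rough sketch of Lorenz's accident: integrate the three-equation
# Lorenz system twice, once with a "true" starting value and once with
# that value truncated to three decimal places, and watch them diverge.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one small Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x0, steps=3000):
    x, y, z = x0, 1.0, 1.05   # illustrative starting point
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

exact = run(3.506127)    # the full value
rounded = run(3.506)     # the same value as the printout showed it

print("exact  :", exact)
print("rounded:", rounded)  # by now the two runs bear no resemblance
```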

Chaos theory is employed almost exclusively in science and in the study of dynamical systems: the unpredictable motion of particles, the arbitrary movement of smoke, or seemingly random turbulence in the oceans. But humans are subject to the same laws of physics, so chaos theory affects societies and lives, not just weather. A close look at any major historical event—or at the history of the species—reveals instantly that humans are the puppets of small, seemingly arbitrary or accidental events.

On October 30, 1926, Mr. and Mrs. Henry L. Stimson stepped off a steam train in Kyoto, Japan, and checked into room number 56 at the nearby Miyako Hotel. They strolled through the city, soaking up its autumnal explosion of color as the Japanese maples turned crimson and the ginkgo trees burst into gold. Over their six-day vacation, the Stimsons fell in love with Kyoto.

Nineteen years later, in July 1945, Henry Stimson was America’s secretary of war, and he received a memo that alarmed him. The U.S. Target Committee had agreed that the first atomic bomb would be dropped on a strategically important target: Kyoto.

Stimson tried to save the city from destruction. The generals from the Target Committee were unmoved. (They didn’t know about the Miyako Hotel, the majestic Japanese maples, or the golden ginkgo trees.) Finally, Stimson went to the top: He met twice with President Harry Truman, demanding that Kyoto be removed from the list. Truman relented. The first bomb was dropped on Hiroshima instead. One hundred thousand people lived in one city and died in another because of a vacation that one couple had taken 19 years earlier.

The second bomb was to be dropped on the city of Kokura. But as the B-29 bomber approached the city, cloud cover made the ground below difficult to see. The problem was unexpected, as a team of Army meteorologists had predicted clear skies. (Whether Lorenz, who was a meteorologist in the Pacific at the time, was involved in this forecasting is unknown.) The bomber went to the secondary target instead. Nagasaki was destroyed. To this day, the Japanese refer to “Kokura’s luck” whenever someone unknowingly escapes from disaster. Chaos theory in action.

Flukes haven’t defined just modern history. Sixty-six million years ago, an oscillation in a distant reach of space—the Oort cloud—flung a gargantuan space rock toward Earth. It wiped out the dinosaurs, which allowed mammals to thrive. If that asteroid had been even slightly delayed, humans would not exist. And if not for an evolutionary accident, perhaps humans would lay eggs: New findings suggest that the origin of the placenta—and, by extension, live births—comes from a single shrewlike creature that evidently got infected with a single retrovirus about 100 million years ago.

Most people like to imagine that we can understand, predict, and control the world. Humans crave a rational explanation to make sense of the chaos of life. The world isn’t supposed to be a place where hundreds of thousands of people live or die because of one couple’s decades-old nostalgia for a pleasant vacation, or because clouds flitted across the sky at just the right moment. We don’t want our existence to be predicated on an infected shrewlike creature. But that’s how the world works.

The power of seemingly random events to sway trajectories is therefore not new. But modern society has amplified this contingency, making apparently insignificant changes more likely than ever to upend everything in an unexpected instant. Black swans are becoming more frequent, and human life is more vulnerable to them.

Western modernity is defined by an unquenchable thirst for optimization and efficiency. But physics provides us with a cautionary tale about the perils of such endless optimization, in the form of the “sandpile model,” a subset of a realm of complexity science known as “self-organized criticality.” Despite the sophisticated name, the general principle is simple. If you add grains of sand to a pile one by one, eventually the pile will reach a state of criticality, in which a single grain of sand can cause an avalanche. The system teeters on a precipice—changing anything can change everything. By contrast, a slightly smaller sandpile runs a much lower risk that one additional grain will cause a collapse.
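A toy version of that sandpile takes only a few lines to simulate. The sketch below follows the standard Bak, Tang, and Wiesenfeld rules on a small grid; the grid size and the number of grains dropped are arbitrary choices for illustration, not anything from the essay. Grains land one at a time, any cell holding four grains topples onto its neighbors, and once the pile reaches criticality a single grain can do nothing at all or set off an enormous avalanche.

```python
import random

# A toy Bak-Tang-Wiesenfeld sandpile: grains land one at a time, and any
# cell holding 4 or more grains topples, passing one grain to each
# neighbor, which can trigger further topplings.

SIZE = 20  # arbitrary grid size, for illustration only

def drop_grain(grid):
    """Add one grain at a random cell and return the avalanche size."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    toppled = 0
    unstable = [(r, c)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        toppled += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < SIZE and 0 <= nj < SIZE:  # grains at the edge fall off
                grid[ni][nj] += 1
                unstable.append((ni, nj))
    return toppled

grid = [[0] * SIZE for _ in range(SIZE)]
sizes = [drop_grain(grid) for _ in range(20000)]

# At criticality, most grains do nothing, but occasionally one grain
# sets off a huge avalanche.
print("largest avalanche from a single grain:", max(sizes))
print("grains that caused no avalanche:", sizes.count(0))
```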

Modern social systems are designed to push the sandpile to its limit. Interconnection and interdependence create conditions where a single mistake in one part of the system can instantly produce devastating ripples far away. In 2021, a ship got hit by a strong gust of wind, twisted sideways, and got stuck in the Suez Canal. One estimate suggests that the impact of the event was $54 billion in trade loss—and a reduction of global GDP by up to 0.4 percent, all from one boat. Similarly, on May 6, 2010, a single rogue trader in London decided to manipulate the stock market for fun. He wiped out a trillion dollars of value in five minutes. The combination of chaos and criticality is dangerous—a breeding ground for black swans.

For most of the 250,000 or so years that Homo sapiens have graced the planet, things ticked along more or less the same way from one generation to the next. Day to day, however, life was dangerous and unpredictable. Childbirth was a death trap. Starvation was a constant threat, as crops might inexplicably fail, or animals that were once abundant were suddenly nowhere to be found. Most of the human story is one of local instability but global stability. Where the next meal would come from wasn’t always clear, but parents and children lived in the same kind of world, generation after generation.

Today the dynamic is inverted. Most people in rich, industrialized societies live according to routines, patterns, and a rigid sense of daily order. In one study, researchers using geolocation data from cellphones found that they could predict, with 93 percent accuracy, where a given person would be at any specific time of day. But the familiar routines take place within a superstructure that is constantly shifting. Children now teach parents how to use technology, not the reverse. Three decades ago, few people had heard of the internet; now no one can function without it. Ours is the opposite of our ancestors’ situation: local stability, but global instability. In this upside-down world, Starbucks remains unchanged while rivers dry up and democracies collapse.

The human brain evolved not to apprehend such a complex reality but to detect straightforward patterns of cause and effect in a simpler world. Our brains have now become mismatched to modern life in a social system of 8 billion people that is too complex to fully comprehend. In an effort to make sense of this, modern industrialized societies are built on an endless array of models that seek to separate “signal” from “noise.” These efforts reduce the world to a fun-house-mirror version of itself, in which a few key variables—always involving big, obvious factors—determine what happens next. But the noise matters: It’s where the black swans come from.

Nonetheless, relying on ever more sophisticated models, forecasters, pundits, and policy makers have developed a dangerous hubris about their ability to control the world. They are constantly proved wrong but rarely learn the lesson. Looking at the world through such a distorted prism conveys an illusion of control, in which just one policy intervention with the right variable might be enough to slay risk and tame an untameable world.

Consider, by contrast, what might follow from accepting the uncertainty of a world where one couple’s vacation determines whether some 100,000 people live or die 19 years later, or where one vegetable vendor can set an entire region on fire. Appreciating the power of flukes teaches an important lesson: A slightly smaller sandpile produces fewer catastrophes. Resilience above optimization.

When catastrophe comes, people instinctively search for straightforward patterns and clear-cut explanations. “Everything happens for a reason” isn’t just a mantra stitched on pillows; it’s also an assumption underlying a good deal of social research, including in economics and political science. There’s just one problem: It’s not true. Some things … just happen.

In the 1980s, a relatively obscure evolutionary biologist named Motoo Kimura challenged the conventional wisdom in his field, demonstrating that a significant amount of change happening at the molecular level wasn’t because of natural selection but rather was neutral. Many changes were neither positive nor negative but were driven by random drift. The noise mattered. Kimura’s findings reshaped how scientists understood change in the natural world.
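Kimura's point, that chance alone can decide which variants persist, can be made concrete with a toy simulation. The sketch below is a bare-bones Wright-Fisher model with a made-up population size and no selection whatsoever; the parameters are illustrative, not Kimura's. A mutation that confers no advantage still occasionally spreads through the entire population, and usually vanishes, purely by the luck of which individuals happen to reproduce.

```python
import random

# A toy Wright-Fisher simulation of neutral drift: a mutation with no
# fitness effect either spreads to the whole population or disappears,
# purely by chance.

def neutral_drift(pop_size=100, start_copies=1):
    """Return True if the neutral mutation eventually takes over."""
    copies = start_copies
    while 0 < copies < pop_size:
        # Each individual in the next generation inherits the mutation
        # with probability equal to its current frequency.
        freq = copies / pop_size
        copies = sum(1 for _ in range(pop_size) if random.random() < freq)
    return copies == pop_size

runs = 10000
fixed = sum(neutral_drift() for _ in range(runs))
# With no selection, a single new mutation takes over about 1/pop_size
# of the time -- here roughly 1 percent -- and is lost the rest of the time.
print(f"mutation took over in {fixed} of {runs} runs")
```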

But that isn’t the only lesson Kimura left about chaos, randomness, and the arbitrary movement of events. In August 1945, Kimura was a student at Kyoto University. If Mr. and Mrs. H. L. Stimson had vacationed somewhere other than Kyoto in 1926, he and his ideas would likely have been obliterated in a blinding flash of atomic light.

This essay is adapted from Brian Klaas’s new book, Fluke: Chance, Chaos, and Why Everything We Do Matters.