Niall Ferguson

Nobody Knows How Long Inflation Will Last. That’s Life.

Have we just passed peak inflation? That was the question economists were debating last week, when the US Labor Department published the latest consumer price inflation rate. The index in June was 9.1% above the level a year before — the highest figure since December 1981.

Is that the top? We are all entitled to guess, of course. But the idea that the average economist could know the answer to this question is laughable. Only a handful — former Treasury Secretary Larry Summers, most famously, but also my Hoover Institution colleague Michael Bordo and my old friend and fellow Bloomberg Opinion columnist Mohamed El-Erian — correctly foresaw in early 2021 that inflation was about to take off. And even they didn’t venture to predict that inflation would exceed 9% by now.

Mainstream economists, as well as central bankers, had come to believe that inflation was driven not by the growth of the money supply and the velocity of circulation but by the expectations of consumers — which in turn could be “anchored” by a credible inflation target. If the Federal Reserve said that inflation would be 2%, then it pretty much would be. In any case, the problem for most of the last 20 years was inflation’s tendency to be below, not above, that goal — hence the innovation of an “average inflation target,” which would implicitly allow inflation to be a little above 2% for a time, to compensate for having been a little below it for a time.

“Frankly we welcome slightly higher … inflation,” declared Federal Reserve Chairman Jerome Powell in January 2021. “The kind of troubling inflation people like me grew up with seems unlikely in the domestic and global context we’ve been in for some time.” This was what his staff economists were telling him. This was what their models told them.

But the models used by economists turned out to be as good at forecasting inflation in 2022 as they were at forecasting growth in 2009, which even after the failure of Lehman Brothers Holdings Inc. they predicted would not turn negative. “I don’t think we’ve seen a significant change in the basic outlook,” reported Fed chief economist David J. Stockton to the Federal Open Market Committee on Sept. 16 (the day after Lehman declared bankruptcy), “and certainly the story behind our forecast is … that we’re still expecting a very gradual pickup in GDP growth over the next year.” In fact, the economy shrank by 2.6%. A major recession was already well underway.

To understand why the models fail, we need to accept that they are designed to simulate processes that are mind-blowingly complex. To do so, they must engage in deliberate simplification. But consider for a moment what we are implicitly asking when we pose the question: Has inflation peaked? We are not only asking about the supply of and demand for 94,000 different commodities, manufactures and services. We are also asking about the future path of interest rates set by the Fed, which — despite the much-vaunted policy of “forward guidance” — is far from certain. We are asking about how long the strength of the dollar will be sustained, as it is currently holding down the price of US imports.

But there’s more. We are at the same time implicitly asking how long the war in Ukraine will last, as the disruption caused since February by the Russian invasion has significantly exacerbated energy and food price inflation. We are asking whether oil-producing countries will respond to pleas from Western governments to pump more crude. We are asking how much damage President Xi Jinping’s policy of “Zero Covid” will do to the Chinese economy, and hence to East Asian demand for oil and other commodities.

We should probably also ask ourselves what the impact on Western labor markets will be of the latest Covid omicron sub-variant, BA.5. UK data indicate that BA.5 is 35% more transmissible than its predecessor BA.2, which in turn was over 20% more transmissible than the original omicron.

Good luck adding all those variables to your model. It is in fact just as impossible to be sure about the future path of inflation as it is to be sure about the future path of the war in Ukraine and the future path of the Covid pandemic.

Subconsciously, if not consciously, we would all like these three phenomena to be “transitory.” Our cognitive bias in favor of things going back to normal has been exacerbated by the near-universal attention deficit disorder of the TikTok era. Not only are inflation, war in Ukraine and Covid nasty; we are also bored of them — so bored, in the case of Covid, that we no longer pay much attention to the latest wave currently sweeping the US (until we ourselves test positive).

The reality is that all three forms of disorder — economic, public health and geopolitical — seem likely to be protracted, not just for months but potentially for years. And the longer they last, the more disruption we shall see.

Where will be the next Sri Lanka, where economic crisis has led to political chaos? Albania? Argentina? Kenya? Panama? Which political leader will be next to follow British Prime Minister Boris Johnson through the exit door? (It was nearly Italy’s Mario Draghi.) Who will be next to fall to an assassin’s bullet, as the former Japanese Prime Minister Shinzo Abe did on July 8? Feel free to send me your guesses. Just don’t dignify them by calling them forecasts or predictions.

The central problem is that the world we have built has, over time, become an increasingly complex system prone to all kinds of stochastic behavior, nonlinear relationships and “fat-tailed” distributions. When I am asked about the future path of inflation, or war, or plague, my answer does not begin, “It’s complicated.” My answer begins, “It’s complex.”

Complexity is a term now widely used by natural scientists as well as computer scientists to make sense of a wide range of different systems, such as the spontaneously organized behavior of half a million ants or termites, which allows them to construct complex hills and nests; the production of human intelligence from the interaction of a hundred billion neurons in the “enchanted loom” of the central nervous system; the action of the antibodies in the human immune system to combat alien bacteria and viruses; the “fractal geometry” whereby simple water molecules form themselves into intricate snowflakes, with myriad variants of sixfold symmetry; and the elaborate biological order that knits together multiple species of flora and fauna within a rain forest.

There is every reason to think that man-made economies, societies and polities share many of the features of such complex adaptive systems. Economists such as W. Brian Arthur have been arguing along these lines for more than 20 years, going beyond Adam Smith’s 18th-century idea that an “invisible hand” caused markets to work through the interaction of profit-maximizing individuals, or Friedrich von Hayek’s later critique of economic planning and demand management.

For Arthur, a complex economy is characterized by the dispersed interaction of multiple agents, a lack of any central control, multiple levels of organization, continual adaptation, the incessant creation of new niches, and an absence of general equilibrium. In this version of economics, Silicon Valley is a complex adaptive system. So is the internet itself.

Researchers at the Santa Fe Institute, where Arthur is an external faculty member, have for years labored to see how such insights can be applied to other aspects of collective human activity.

Consider the following features that are characteristic of complex systems. First, a small input can produce major changes. Second, causal relationships are often (though not always) nonlinear, so conventional methods of generalizing from observations to theory about their behavior, such as trend analysis and sampling, are of little use. Indeed, some theorists of complexity would go so far as to say that complex systems are wholly nondeterministic. When complex systems experience disruption, the scale of the disruption is therefore well-nigh impossible to predict.

A complex system operates somewhere between order and disorder — “on the edge of chaos,” in the phrase of the computer scientist Christopher Langton. The system can operate for an extended period very nicely, apparently in equilibrium, in fact adapting all the time. However, there can come a moment when the system reaches a critical state. A very small catalyst can trigger a “phase transition” from one state to another.

The best-known illustration of complexity in action is the weather. Edward Lorenz, the pioneer of chaos theory, famously suggested that the flapping of a butterfly’s wings in Brazil could set off a tornado in Texas. Even a tiny disturbance, he argued, could have huge effects in a complex system governed by nonlinear relationships.

Lorenz discovered the butterfly effect in 1961, when he was experimenting at the Massachusetts Institute of Technology with a computer model he had designed to simulate weather patterns. He was repeating a simulation he had run before, but he had rounded off one variable from 0.506127 to 0.506. To his amazement, this tiny change drastically transformed the simulated weather generated by the computer.
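Lorenz’s discovery is easy to reproduce. The sketch below is illustrative, not his actual 1961 experiment: it integrates the three-variable system he later made famous (with the standard parameters), running two trajectories whose starting points differ only by a rounding-sized perturbation, and tracks how far apart they drift.

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the classic Lorenz system.
    x, y, z = state
    return (x + sigma * (y - x) * dt,
            y + (x * (rho - z) - y) * dt,
            z + (x * y - beta * z) * dt)

# Two runs differing only in the rounding of one coordinate,
# echoing Lorenz's 0.506127 -> 0.506 truncation.
a = (1.0, 1.0, 1.050127)
b = (1.0, 1.0, 1.05)

max_gap = 0.0
for _ in range(6000):  # integrate out to t = 30
    a, b = lorenz_step(a), lorenz_step(b)
    max_gap = max(max_gap, max(abs(p - q) for p, q in zip(a, b)))

print(f"largest separation between the two runs: {max_gap:.2f}")
```

A perturbation in the fourth decimal place ends up separating the trajectories by a distance comparable to the size of the attractor itself — the computational face of the butterfly effect.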

Almost no one read Lorenz’s pathbreaking paper on the subject when it was published in the Journal of the Atmospheric Sciences as “Deterministic Nonperiodic Flow.” It was not until a decade later that he translated his insight into layman’s language in a lecture with the title, “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?”

“Two particular weather situations,” he argued, “differing by as little as the immediate influence of a single butterfly, will generally after sufficient time evolve into two situations differing by as much as the presence of a tornado.”

Lorenz, however, added an important caveat: “If the flap of a butterfly’s wings can be instrumental in generating a tornado, it can equally well be instrumental in preventing a tornado.” In Lorenz’s view, this was what made long-range weather prediction so very difficult.

The same applies to economic forecasting. In 1966, the Nobel Prize-winning economist Paul Samuelson joked that declines in US stock prices had correctly predicted “nine out of the last five recessions.” Economic forecasters are far worse at their jobs than weather forecasters. Of 469 downturns in national economies between 1988 and 2019, the International Monetary Fund predicted only four by the spring of the year before they began. As for the global financial crisis of 2008-9, only a handful of economists foresaw it with any real precision. Most, as Her Majesty the Queen pointed out, did not “see it coming.”

Both the weather and the economy are complex systems — and, in the case of the economy, the system has been growing steadily more complex since the Industrial Revolution. Such systems are not governed by the same rules as individual members of our species, which is one reason we find them hard to think about.

For example, we human beings at adulthood are all roughly the same height. A histogram of human stature is a classic bell curve, with most of us somewhere between five and six feet tall and nobody shorter than about two feet or taller than about eight. There are no ant-size people and no human skyscrapers.

But now consider the realm of international politics. Two mega-states — China and India — account for 36% of the world’s population. Then come 11 big states, from the US down to the Philippines, each with more than 100 million people, accounting for just over a quarter of the world’s population. Seventy-five medium-size states have between 10 million and 100 million inhabitants: another third of the world’s population. But then there are 71 with between one million and 10 million (5% of humanity), 41 states with between 100,000 and a million (0.2%), and a further 33 with fewer than 100,000 residents. Each of these states has a seat in the United Nations General Assembly. In the real world, however, what happens in the mega-states affects vastly more people than what happens in the small fry.
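The contrast between the two kinds of distribution can be made concrete with a toy simulation. The numbers below are purely illustrative (heights drawn from a bell curve; “populations” drawn log-uniformly across the rough range of UN member states), not real demographic data:

```python
import random
import statistics

random.seed(0)

# Illustrative only: 193 "adult heights" from a normal distribution
# (mean 170 cm, sd 8 cm), and 193 "state populations" spread
# log-uniformly from about 10**4 up to about 1.4 * 10**9 people.
heights = [random.gauss(170, 8) for _ in range(193)]
populations = [10 ** random.uniform(4, 9.15) for _ in range(193)]

for name, data in [("height", heights), ("population", populations)]:
    ratio = max(data) / statistics.median(data)
    print(f"{name}: largest / median = {ratio:,.1f}")
```

The tallest “person” is barely a tenth taller than the median; the largest “state” dwarfs the median by orders of magnitude. Bell-curve intuitions, honed on quantities like stature, fail badly in the heavy-tailed world of states.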

Just as the sizes of states are not normally distributed, so, too, with their lifespans. Unless our lives are cut short by violence, accidents or disease, we humans can count on living from infancy through maturity to senility. Because there is a life cycle for us as individual organisms, we tend to assume the same is true for polities.

Yet the lifespans of empires (the biggest human polities) range from a millennium (the Roman Empire) to just over a decade (Hitler’s Third Reich). As for cycles of history, these are artificial constructs, superimposed on the complexity and chaos of the past by authors desperate for more order and predictability than exist.

History, broadly conceived, is the interaction of natural and man-made complexity. It would be very remarkable if this process resulted in predictable cycles. Even a relatively simple man-made edifice such as a bridge can fail (to quote the recently deceased engineering professor Yacov Haimes), “from deterioration of the bridge deck, corrosion or fatigue of structural elements, or an external loading such as floodwater. None of these failure modes is independent of the others in probability or consequence.”

All complexity carries with it the potential for collapse — hence the Yale sociologist Charles Perrow’s idea of the “normal accident”: in systems that are both complex and tightly coupled, accidents are not anomalies but inevitable, “normal,” occurrences.

Historians are partly to blame for our inability to understand complexity. The major upheavals — wars, revolutions, plagues — that we love to study are low-frequency, high-impact events located in the tails of distributions that are anything but normal. Often, like Lorenz’s tornado, they can have quite small, proximate triggers.

Not long after some big phase transition, however, the historians arrive on the scene. Misunderstanding complexity, they proceed to explain the huge calamity in terms of long-run causes, often dating back decades. A world war breaks out in the summer of 1914, to the avowed amazement of most contemporaries. Before long, the historians have devised a storyline commensurate with the disaster, involving power-hungry Germans and the navy they began building in 1898, the waning of Ottoman power in the Balkans dating back to the 1870s, and a treaty governing the neutrality of Belgium that was signed in 1839. This is what Nassim Nicholas Taleb has rightly condemned as the “narrative fallacy” — the construction of psychologically satisfying stories on the principle of post hoc, ergo propter hoc.

Telling such stories is an age-old habit that is very hard to break. Recent versions of the retrospective fallacy trace the 9/11 terrorist attacks back to the 1966 execution of Sayyid Qutb, the Islamist writer who inspired the Muslim Brotherhood; or attribute the 2008 financial crisis to measures of financial deregulation dating back to the 1980s.

But the proximate triggers of a crisis often suffice to explain the sudden phase transition. As Taleb has argued, the global economy by 2007 had come to resemble an overoptimized electricity grid. The relatively small surge represented by defaults on subprime mortgages in the US sufficed to tip the entire world economy into the financial equivalent of a blackout. Blaming such a crash on financial deregulation under President Ronald Reagan is about as illuminating as blaming World War I on the naval plans of Admiral Alfred von Tirpitz.

A key facet of complexity is the role played by networks, arguably the most important feature of both natural and man-made complexity. The natural world is to a bewildering extent made up of “optimized, space-filling, branching networks,” in the words of the physicist Geoffrey West — another Santa Fe Institute sage.

In prehistory, Homo sapiens evolved as a cooperative ape, with a unique ability to network — to communicate and to act collectively — that sets us apart from all other animals. In the words of the evolutionary anthropologist Joseph Henrich, we are not simply bigger-brained, less hairy chimpanzees; the secret of our success as a species “resides … in the collective brains of our communities.”

Social networks are the structures that human beings naturally form, beginning with knowledge itself and the various kinds of representation we use to communicate it, as well as the family trees to which we all necessarily belong. They come in all shapes and sizes, from exclusive secret societies to open-source mass movements. Some have a spontaneous, self-organizing character; others are more systematic and structured. All that has happened — beginning with the invention of written language — is that successive information and communication technologies have facilitated our innate, ancient urge to network.

The key point is that social networks today are much larger and faster than at any time in history. That is why the complex system we know as humanity is more vulnerable than ever to various forms of contagion. In the words of the sociologist Duncan Watts, the key to assessing the likelihood of a contagion is “to focus not on the stimulus itself but on the structure of the network the stimulus hits.”

This helps explain why, for every idea that goes viral, there are countless others that fizzle out in obscurity because they began with the wrong node, cluster or network. The same often goes for infectious microbes, only a very few of which succeed in generating pandemics.
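Watts’s point can be illustrated with a toy cascade model. Everything below is invented for illustration — a hub-and-spoke network with a thin peripheral chain, and a fixed transmission probability — but it shows how the same stimulus reaches vastly different audiences depending on where it lands:

```python
import random

random.seed(1)

# A toy "independent cascade": each infected node passes the contagion
# to each neighbor with probability p. The graph is a hub (node 0)
# joined to 30 spokes, plus a chain of 10 peripheral nodes hanging
# off one spoke. All numbers are illustrative.
graph = {i: set() for i in range(41)}

def link(a, b):
    graph[a].add(b)
    graph[b].add(a)

for spoke in range(1, 31):
    link(0, spoke)
for i in range(30, 40):
    link(i, i + 1)

def cascade(seed, p=0.5, trials=2000):
    """Average number of nodes eventually reached from a given seed."""
    total = 0
    for _ in range(trials):
        infected, frontier = {seed}, [seed]
        while frontier:
            node = frontier.pop()
            for nbr in graph[node]:
                if nbr not in infected and random.random() < p:
                    infected.add(nbr)
                    frontier.append(nbr)
        total += len(infected)
    return total / trials

print(f"seeded at the hub:       {cascade(0):.1f} nodes on average")
print(f"seeded at the periphery: {cascade(40):.1f} nodes on average")
```

The identical contagion, with the identical transmissibility, reaches roughly an order of magnitude more nodes from the hub than from the end of the chain — the structure of the network, not the stimulus, does the work.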

Large social networks are themselves complex systems, with their own kinds of phase transition. A seemingly random network can evolve with astounding speed into a hierarchy. The number of steps between the revolutionary crowd and the totalitarian state has more than once proved to be surprisingly small. The seemingly rigid structures of a hierarchical order can disintegrate with equal rapidity.

As I argued in “Doom: The Politics of Catastrophe” (out now in paperback with a new afterword), a disaster such as a pandemic is not a single, discrete event. It invariably leads to other forms of disaster — economic, social, political — as well as to other forms of contagion (such as viral conspiracy theories). There can be, and often are, cascades or chain reactions of disaster. The more networked the world becomes, the more we see this.

I ended the book by predicting both economic and geopolitical crises in the wake of the worst phase of the pandemic. Here, too, I drew inspiration from Santa Fe, because Edward D. Lee’s concept of a “conflict avalanche” — which he and his co-authors derived from their research on modern conflicts in Africa — seemed to me to have a global applicability.

Inflation, as the writer Matthew C. Klein has argued, may be about to fade as prices of everything from graphics cards to stainless-steel skillets and ocean freight come down and inventories continue to rise, even if there remain significant supply constraints in energy, automobiles and air travel.

Larry Summers has recently restated his belief in “secular stagnation,” suggesting that the period of rising interest rates may prove short-lived as demographic and technological forces reassert themselves.

Yet research by both the Institute for Economics and Peace and the International Monetary Fund points to rising levels of social unrest and violent demonstrations around the globe. The scenario of a global conflict avalanche can certainly not be discounted, especially when political assassination returns to Japan, for decades one of the world’s most politically stable countries. And nothing is more certain to keep inflation going around the world than an avalanche of war and political violence, disrupting the supply of money, goods and labor.

History shows this. But complexity explains it.