Human beings are rarely rational—so it’s time we all stopped pretending they are.
In 1998, as the Asian financial crisis was ravaging what had been some of the fastest-growing economies in the world, the New Yorker ran an article describing the international rescue efforts. It profiled the super-diplomat of the day, a big-idea man the Economist had recently likened to Henry Kissinger. The New Yorker went further, noting that when he arrived in Japan in June, this American official was treated “as if he were General [Douglas] MacArthur.” In retrospect, such reverence seems surprising, given that the man in question, Larry Summers, was a disheveled, somewhat awkward nerd then serving as the U.S. deputy treasury secretary. His extraordinary status owed, in part, to the fact that the United States was then (and still is) the world’s sole superpower and the fact that Summers was (and still is) extremely intelligent. But the biggest reason for Summers’s welcome was the widespread perception that he possessed a special knowledge that would save Asia from collapse. Summers was an economist.
During the Cold War, the tensions that defined the world were ideological and geopolitical. As a result, the superstar experts of that era were those with special expertise in those areas. And policymakers who could combine an understanding of both, such as Kissinger, George Kennan, and Zbigniew Brzezinski, ascended to the top of the heap, winning the admiration of both politicians and the public. Once the Cold War ended, however, geopolitical and ideological issues faded in significance, overshadowed by the rapidly expanding global market as formerly socialist countries joined the Western free trade system. All of a sudden, the most valuable intellectual training and practical experience became economics, which was seen as the secret sauce that could make and unmake nations. In 1999, after the Asian crisis abated, Time magazine ran a cover story with a photograph of Summers, U.S. Treasury Secretary Robert Rubin, and U.S. Federal Reserve Chairman Alan Greenspan and the headline “The Committee to Save the World.”
In the three decades since the end of the Cold War, economics has enjoyed a kind of intellectual hegemony. It has become first among equals in the social sciences and has dominated most policy agendas as well. Economists have been much sought after by businesses, governments, and society at large, their insights seen as useful in every sphere of life. Popularized economics and economic-type thinking have produced an entire genre of best-selling books. At the root of all this influence is the notion that economics provides the most powerful lens through which to understand the modern world.
That hegemony is now over. Things started to change during the 2008 global financial crisis, which had a far greater impact on the discipline of economics than is commonly understood. As Paul Krugman noted in a September 2009 essay in the New York Times Magazine, “Few economists saw our current crisis coming, but this predictive failure was the least of the field’s problems. More important was the profession’s blindness to the very possibility of catastrophic failures in a market economy.” The left-wing Krugman was not the only one to make this observation. In October 2008, Greenspan, a lifelong libertarian, admitted that “the whole intellectual edifice … collapsed in the summer of last year.”
For Krugman, the reason was clear: Economists had mistaken “beauty, clad in impressive-looking mathematics, for truth.” In other words, they’d fallen in love with the supposed rigor that derives from the assumption that markets function perfectly. But the world had turned out to be more complex and unpredictable than the equations.
The crisis of 2008 may have been the wake-up call, but it was only the latest warning sign. Modern-day economics had been built on certain assumptions: that countries, companies, and people seek to maximize their income above all else, that human beings are rational actors, and that the system works efficiently. But over the last few decades, compelling new work by scholars such as Daniel Kahneman, Richard Thaler, and Robert Shiller has begun to show that human beings are not predictably rational; in fact, they’re predictably irrational. This “behavioral revolution” landed a debilitating blow to mainstream economics by arguing that what was perhaps the centerpiece assumption of modern economic theory was not only wrong but, even worse, unhelpful.
In the social sciences, it is generally understood that theoretical assumptions never mirror reality—they’re abstractions designed to simplify—but do provide a powerful way to understand and predict. What the behavioral economists showed is that the assumption of rationality actually produces misunderstandings and bad predictions. It is worth noting that one of the very few economists who predicted both the dot-com bubble that caused the crash of 2000 and the housing bubble that caused the crash of 2008 was Shiller, who won the Nobel Prize in 2013 for his work in behavioral economics.
Recent events have hammered still more nails into the coffin of traditional economics. If the great divide of 20th-century politics was over free markets, the key splits that have emerged in the past few years involve immigration, race, religion, gender, and a whole set of related cultural and identity issues. Where in the past one could predict a voter’s choice based on his or her economic standing, today voters are driven more by concerns about social status or cultural coherence than by economic self-interest.
If economics has failed to accurately capture the motives of the modern individual, what about modern countries? These days, the quest to maximize profit does not seem like a helpful way to understand why states act the way they do. Many European countries, for example, have higher labor productivity than the United States. Yet citizens there choose to work fewer hours and take longer vacations, decreasing their output—because, they might argue, they prioritize contentment or happiness over economic output. Bhutan has explicitly decided to pursue “gross national happiness” rather than gross domestic product. Many countries have replaced purely GDP-oriented goals with strategies that also stress environmental sustainability. China still puts economic growth at the center of its planning, but even it has other, equal priorities, such as preserving the Communist Party’s monopoly on power—and it uses non-free market mechanisms to do so. Meanwhile, populists everywhere now place greater value on preserving jobs than on increasing efficiency.
Let me be clear: Economics remains a vital discipline, one of the most powerful ways we have to understand the world. But in the heady days of post-Cold War globalization, when the world seemed to be dominated by markets and trade and wealth creation, it became the dominant discipline, the key to understanding modern life. That economics has since slipped from that pedestal is simply a testament to the fact that the world is messy. The social sciences differ from the hard sciences because “the subjects of our study think,” said Herbert Simon, one of the few scholars who excelled in both. As we try to understand the world of the next three decades, we will desperately need economics but also political science, sociology, psychology, and perhaps even literature and philosophy. Students of each should retain some element of humility. As Immanuel Kant said, “Out of the crooked timber of humanity, no straight thing was ever made.”
This article originally appeared in the Winter 2019 issue of Foreign Policy magazine.