It is remarkable how the economic debate that has dominated political life over the past decade in Britain and much of Europe—how austere should we be?—is completely irrelevant to our current crisis, as if that argument sits in a parallel economic universe that we no longer inhabit.
Instead of arguing about how swiftly governments should balance the books and lower their debt levels, leading economists, typically better known for their rhetorical sobriety, are suddenly making the case for much more aggressive public spending. People such as Mario Draghi, a former president of the European Central Bank; Olivier Blanchard, formerly chief economist at the International Monetary Fund; and Kenneth Rogoff, who has often argued that too much debt leads to far lower growth, have all pushed for what they have described as warlike spending. The U.K.’s Office for Budget Responsibility, traditionally a guardian of fiscal prudence, now advises the British government to spend “what you need to spend to deal with this. In some ways it’s like a wartime situation.”
If this is right, what lessons might we learn from “wartime economics”—beyond a cautionary reminder that the standard economic tools available to us (cutting interest rates, carefully increasing government spending) are no match for the magnitude of the moment?
First, we should acknowledge the peculiar paradox at the core of this crisis: Although we are confronting an economic calamity, no actual economic weakness is to blame. The economy has not been leveled by bombs—we have simply turned it off. This creates a unique challenge: How do you support people during this moment of suspended animation, and ensure that, when we switch the economy “back on,” it can propel itself into action, unscarred by the pause? Some of the traditional lessons of war thus don’t apply. Today, for example, a crucial economic challenge is that consumer demand has been decimated by the virus—people are stuck at home across Britain, Europe, the United States, and elsewhere, unable to buy coffees or croissants, eat at their local restaurant, or purchase many new products. In contrast, what troubled the British economist John Maynard Keynes at the start of the Second World War was the possibility of too much demand. That, combined with a shortage of supplies caused by the war effort, would, he feared, lead to explosive inflation. His solution, a compulsory saving scheme, is precisely the opposite of what we need.
One lesson that we have learned, more from necessity than historical reflection, is the need for effective “big government,” for competent top-down planners to take the place of chaotic bottom-up markets. We have seen the state step forward in many countries to offer huge amounts of support for workers—in the U.K., for example, by underwriting almost the entire private sector through wage guarantees. And we have seen the state take a role in mobilizing and redirecting resources—medical staff, volunteers, equipment, and much else—toward the “war effort.”
Yet, in practice, the most important lessons we can learn from wartime economics are likely not those that teach us what to do now but those that give us a glimpse of the challenges and debates we will face in the future, once the war is over.
To begin with, few countries have been able to respond to this crisis alone. Almost none has had the domestic capacity to produce sufficient tests, masks, medication, or ventilators when the moment required. The U.S., for instance, relies on China for 90 percent of its antibiotics. Just as Keynes found himself, in the interwar period, reflecting on the merits of economic self-sufficiency, we will likely soon find ourselves ensnared in debates about “strategic protectionism,” about the merit of intentionally building and shielding nationally important parts of the economy, even if doing so runs counter to the basic economic principle of comparative advantage.
If the U.K. had done this, we might have avoided the ongoing scrambles to secure sufficient ventilators, tests, and protective equipment; we could have built up the domestic capability to produce far more ourselves. This is, after all, as one writer put it, no “black swan” event—pandemics sit right at the top of Britain’s National Risk Register, an assessment of the most significant emergencies the country might face in the next five years. On testing, for instance, the U.K.’s health secretary, Matt Hancock, noted last week that “we didn’t go into this crisis with a huge diagnostics industry.” In preparing for the next pandemic, we may want to change that. Likewise, we must now turn to the other issues on that list. The risk of cyberattacks, for instance, has increased, according to the register. Does the U.K. yet have the domestic capabilities in artificial intelligence and related technologies to fend for itself in the event of a future global attack?
We should see a transformation in the way we view certain types of work, as well. The Second World War catalyzed a shift in women’s treatment in the labor market, and this crisis may similarly force us to address another flaw in the working world: the gap between the great social value of so many jobs and the comparatively small market value (in the form of a salary) that they receive. In Britain, for instance, labeling doctors, nurses, care workers, social workers, teachers, criminal lawyers, and others as “key workers” betrays a twofold irony: Though these roles are key (and have been for some time), that status is in many cases not reflected in their pay; and some of them are precisely the sorts of so-called low-skilled workers that post-Brexit immigration controls would keep out. In some countries, narrowing this gap will be easier than in others: In the U.K., for instance, the state is the main employer for these particular jobs. It could swiftly narrow the difference between their social value and their market value, should it want to.
A third lesson is to monitor market competition, or the lack of it. A common complaint in the first half of the 20th century was that war conditions were particularly favorable to monopolization: Certain companies were either favored by the state or well placed to take advantage of the war effort, while many smaller firms struggled and were eventually bought out by larger ones with deeper pockets. We can see similar trends unfolding in this crisis: Companies such as Amazon, Netflix, Facebook, and Zoom find themselves providing the goods and services that the moment requires, while others, typically smaller businesses without the financial wherewithal to stay afloat, are in trouble through no fault of their own. During the First and Second World Wars, excess-profits taxes, levied on profits above prewar levels, were used to ensure that no company benefited disproportionately at a time of great national suffering. We should not only consider using these taxes again; in time, we should also survey the condition of different sectors, checking that they are still characterized by healthy competition rather than dominated by a lucky few who managed to survive.
The final lesson from wartime economics—and the most significant—relates to the future of the state. Having stepped forward to deliver so much during the crisis, the state seems unlikely simply to retreat to its former shape and size once this is all over; it has not done so in the past. Many of Europe’s strongest welfare states found their initial form in the postwar moments of the first half of the 20th century. The same, I imagine, will happen in our time. Citizens will ask why our preexisting economic crises—of inequality, poverty, and homelessness—did not, as the COVID-19 crisis has, demand a suspension of the economic reasoning that has kept most states in Europe so austere for the past decade. And, much as wartime economics did not cease when the world wars ended, these demands will ensure that the consequences of this “war” are felt long after it concludes.