In the 1960s, French Finance Minister Valéry Giscard d’Estaing complained that the dominance of the U.S. dollar gave the United States an “exorbitant privilege” to borrow cheaply from the rest of the world and live beyond its means. U.S. allies and adversaries alike have often echoed the gripe since. But the exorbitant privilege also entails exorbitant burdens that weigh on U.S. trade competitiveness and employment and that are likely to grow heavier and more destabilizing as the United States’ share of the global economy shrinks. The benefits of dollar primacy accrue mainly to financial institutions and big businesses, but the costs are generally borne by workers. For this reason, continued dollar hegemony threatens to deepen inequality as well as political polarization in the United States.
Dollar hegemony isn’t foreordained. For years, analysts have warned that China and other powers might decide to abandon the dollar and diversify their currency reserves for economic or strategic reasons. To date, there is little reason to think that global demand for dollars is drying up. But there is another way the United States could lose its status as issuer of the world’s dominant reserve currency: it could voluntarily abandon dollar hegemony because the domestic economic and political costs have grown too high.
The United States has already abandoned multilateral and security commitments during the administration of President Donald Trump—prompting international relations scholars to debate whether the country is abandoning hegemony in a broader strategic sense. The United States could abandon its commitment to dollar hegemony in a similar way: even if much of the rest of the world wants the United States to maintain the dollar's role as a reserve currency—just as much of the world wants the United States to continue to provide security—Washington could decide that it can no longer afford to do so. It is an idea that has received surprisingly little discussion in policy circles, but it could benefit the United States and, ultimately, the rest of the world.
THE PRICE OF DOLLAR DOMINANCE
The dollar's dominance stems from the demand for it around the world. Foreign capital flows into the United States because it is a safe place to put money and because there are few alternatives. These capital inflows far exceed those needed to finance trade, and they cause the United States to run a large current account deficit. In other words, the United States is not so much living beyond its means as accommodating the world's excess capital.
Dollar hegemony also has domestic distributional consequences—that is, it creates winners and losers within the United States. The main winners are the banks that act as the intermediaries and recipients of the capital inflows and that exercise excessive influence over U.S. economic policy. The losers are the manufacturers and the workers they employ. Demand for the dollar pushes up its value, which makes U.S. exports more expensive and curtails demand for them abroad, thus leading to earnings and job losses in manufacturing.
The costs have been borne disproportionately by swing states in regions such as the Rust Belt—a consequence that in turn has deepened socioeconomic divisions and fueled political polarization. Manufacturing jobs that were once central to the economies of these regions have been offshored, leaving poverty and resentment in their wake. It is little surprise that many of the hardest-hit states voted for Trump in 2016.
The domestic costs of accommodating large capital flows are likely to increase and become more destabilizing for the United States in the future. As China and other emerging economies continue to grow and the United States' slice of the global economy continues to shrink, capital inflows to the United States will grow relative to the size of the U.S. economy. This will amplify the distributional consequences of dollar hegemony, further benefiting U.S. financial intermediaries at the expense of the country's industrial base. It will likely also make U.S. politics even more fraught.
Given these mounting economic and political pressures, it will become increasingly difficult for the United States to create more balanced and equitable growth while remaining the destination of choice for the world’s excess capital, with the overvalued currency and deindustrialization this implies. At some point, the United States may have little alternative but to limit capital imports in the interests of the broader economy—even if doing so means voluntarily giving up the dollar’s role as the world’s dominant reserve currency.
THE BRITISH PRECEDENT
The United States would not be the first country to abdicate monetary hegemony. From the mid-nineteenth century until World War I, the United Kingdom was the world’s dominant creditor, and the pound sterling was the dominant means of financing international trade. During this period, the value of money was based on its redeemability for gold under the so-called gold standard. The United Kingdom held the largest gold reserves in the world, and other countries held their reserves in gold or in pounds.
In the first half of the twentieth century, the British economy declined, and its exports became less competitive. But because the United Kingdom adhered to the gold standard, running a trade deficit meant transferring gold abroad, which reduced the amount of money in circulation and forced down domestic prices. During World War I, the United Kingdom, along with several other countries, suspended the gold standard. But by the end of the war, it was a debtor nation, and the United States, which had accumulated huge gold reserves, had replaced it as the world's principal creditor.
The United Kingdom returned to the gold standard in 1925, but it did so at the prewar exchange rate, which meant that the pound sterling was highly overvalued, and with much-depleted gold reserves. British exports continued to suffer, and the country’s remaining gold holdings dwindled, forcing it to cut wages and prices. The country’s industrial competitiveness declined, and unemployment soared, causing social unrest. In 1931, the United Kingdom abandoned the gold standard for good—which in effect meant abandoning sterling hegemony.
In 1902, Joseph Chamberlain, then secretary of state for the colonies, famously described the United Kingdom as a "weary titan." Today, the term aptly fits a United States that sees its economic might waning relative to that of other powers, particularly China. International relations theorists and foreign policy analysts debate the degree and extent of the U.S. decline and even the outlook for a "post-American" world.
Some argue that under Trump, the United States has deliberately abandoned the project of "liberal hegemony"—for example, by creating uncertainty about U.S. security commitments. Others describe the U.S. retreat from hegemony as part of a longer-term structural retrenchment. Either scenario makes it wholly conceivable that the United States will follow the British precedent and voluntarily relinquish monetary hegemony. Whether and how this might happen has been surprisingly little discussed.
THE CASE FOR TAXING SPECULATIVE CAPITAL
At the moment, the dollar looks more dominant than ever. Even as the U.S. economy has plunged into recession and shed millions of jobs, the demand for dollars has increased—just as it did after the 2008 financial crisis. Foreigners sold large numbers of U.S. Treasury bonds in March, but they exchanged them for U.S. dollars. The Federal Reserve injected trillions of dollars into the global economy in order to prevent international financial markets from seizing up, expanding the system of swap lines with other central banks that it used in 2008. Even as the Trump administration’s mishandling of the pandemic reinforced the view that the United States is a declining power, the actions of the Federal Reserve and investors around the world have underscored the centrality of the dollar in the global economy.
Yet this should not reassure the United States. The influx of capital will continue to harm U.S. manufacturers, and the pandemic-induced downturn will only compound the pain felt by workers. In order to alleviate the mounting economic and political pressures in regions such as the Rust Belt, the United States should consider taking steps to limit capital imports. One option would be to supply fewer dollars to the global economy, pushing up the value of the currency to a point where foreigners would balk at buying it. Doing so would make U.S. trade less competitive, however, and weigh down already excessively low inflation.
Alternatively, the United States could call the bluff of those powers, including China and the European Union, that have called for a diminished global role for the dollar. There is no obvious successor to the United States as the purveyor of the world’s dominant reserve currency. To allow capital to flow freely in and out of China, for instance, would require a fundamental—and politically difficult—restructuring of that country’s economy. Nor can the eurozone take over so long as it depends on export-led growth and the corresponding export of capital. But the absence of a clear successor shouldn’t necessarily stop the United States from abandoning dollar hegemony.
The United States could impose a levy or tax that penalizes short-term, speculative foreign investments but exempts longer-term ones. Such a policy would get at the origin of trade imbalances by reducing capital inflows; trade barriers, by contrast, hit the symptoms rather than the cause. It would also mitigate the current backlash against free trade and reduce the economically unproductive profits of financial institutions.
In an optimistic scenario, the world’s three economic hubs—China, the United States, and the European Union—would agree to construct a currency basket along the lines of the International Monetary Fund’s special drawing rights and either empower the IMF to regulate it or create a new international monetary institution to do so. The pessimistic but probably more likely outcome is that tensions—especially between China and the United States—would make cooperation impossible and increase the likelihood of conflict between them around economic issues.
Even if it is impossible to find a cooperative solution, it may make sense for the United States to unilaterally abandon dollar hegemony. Doing so would force China and the eurozone to deploy their excess savings at home, which would require them to make major adjustments to their economic models so that they produce more balanced and equitable growth. It would also limit the excessive profits of U.S. financial intermediaries and benefit American workers by bringing down the value of the dollar and making U.S. exports more competitive. In short, abandoning dollar hegemony could open the way for a more stable and equitable U.S. economy and global economy.
That era is drawing to a close. In many countries, interest rates are so low, even negative, that central banks can’t lower them further. Tepid economic growth and low inflation mean they can’t raise rates, either.
Since World War II, every recovery was ushered in with lower rates as the Fed moved to stimulate growth. Every recession was preceded by higher interest rates as the Fed sought to contain inflation.
But with interest rates now stuck around zero, central banks are left without their principal lever over the business cycle. The eurozone economy is stalling, but the European Central Bank, having cut rates below zero, can't or won't do more. Since 2008, Japan has had three recessions, with the Bank of Japan, its rates already set around zero, largely confined to the sidelines.
The U.S. may not be far behind. “We are one recession away from joining Europe and Japan in the monetary black hole of zero rates and no prospect of escape,” said Harvard University economist Larry Summers. The Fed typically cuts short-term interest rates by 5 percentage points in a recession, he said, yet that is impossible now with rates below 2%.
Workers, companies, investors and politicians may need to prepare for a world where the business cycle rises and falls largely without the influence of central banks.
“The business cycle we’re used to is a bad guide to business cycles going forward,” said Ray Dalio, founder of Bridgewater Associates LP, the world’s biggest hedge fund.
In November, Fed chairman Jerome Powell warned Congress that “the new normal now is lower interest rates, lower inflation, probably lower growth…all over the world.” As a result, he said, the Fed is studying ways to alter its strategy and develop tools that can work when interest rates approach zero.
Fed chairman Jerome Powell on Capitol Hill in November. (Photo: Sam Corum/EPA/Shutterstock)
Central banks are calling on elected officials to employ taxes, spending and deficits to combat recessions. “It’s high time I think for fiscal policy to take charge,” Mario Draghi said in September, shortly before stepping down as ECB president.
There are considerable doubts that any new tools can restore the influence of central banks, or that countries can overcome obstacles to more robust fiscal policy, particularly political opposition and steep debt.
Business cycles in the future may resemble those of the 19th century, when monetary policy didn’t exist. From 1854 to 1913, the U.S. had 15 recessions, according to the National Bureau of Economic Research, the academic research group that dates business cycles. Many were severe. One slump lasted from 1873 to 1879, and some historians argue it lingered until 1896.
Fed’s Fading Influence
U.S. recessions were more frequent before the Federal Reserve took control over interest rates, using them as a lever to slow inflation or boost the economy. Low rates have weakened the central bank by giving it little room to reduce rates further.
The Fed’s sway over the economy has also been weakened by a decline in durable manufacturing and construction, which are sensitive to rates, and the growth in services, which aren’t.
Sources: National Bureau of Economic Research (recessions); Sidney Homer and Richard Sylla (interest rates 1854-1933); Federal Reserve (interest rates 1934-present); U.S. Commerce Department (value-added shares of GDP)
The causes of business cycles were diverse, Wesley Clair Mitchell, an NBER founder, wrote in 1927. They included "the weather, the uncertainty which beclouds all plans that stretch into the future, the emotional aberrations to which business decisions are subject, the innovations characteristic of modern society, the 'progressive' character of our age, the magnitude of savings, the construction of industrial equipment, 'generalized overproduction,' the operations of banks, the flow of money incomes, and the conduct of business for profits."
He didn’t mention monetary or fiscal policy because, for all practical purposes, they didn’t exist. Until 1913, the U.S. hadn’t had a central bank, except for two brief periods. As for fiscal policy, U.S. federal spending and taxation were too small to matter.
When central banks were established, they didn't engage in monetary policy, which means adjusting interest rates to counter recession or rein in inflation. Many countries were on the gold standard, which, by tying the supply of currency to the stock of gold, prevented sustained inflation.
The Fed was established in 1913 to act as lender of last resort, supplying funds to commercial banks that were short of cash, not to manage inflation or unemployment. Not until the Great Depression did that change.
In 1933, Franklin D. Roosevelt took the U.S. off the gold standard, giving the Fed much more discretion over interest rates and the money supply. Two years later, Congress centralized Fed decision-making in Washington, better equipping it to manage the broader economy.
Modern times
Macroeconomics, the study of the economy as a whole instead of individuals and firms, was born from the work of British economist John Maynard Keynes. He showed how individuals and firms, acting rationally, could together spend too little to keep everyone employed.
In those circumstances, monetary or fiscal policy could generate more demand for a nation’s goods and services, Mr. Keynes argued. Just as a dam regulates the flow of a river to counter flooding and drought, monetary and fiscal-policy makers must try to regulate the flow of aggregate demand to counter inflation and recession.
The Employment Act of 1946 committed the U.S. to the idea of using fiscal and monetary policy to maintain full employment and low rates of inflation.
The next quarter-century followed a textbook script. In postwar America, rapid economic growth and falling unemployment yielded rising inflation. The Fed responded by raising interest rates, reducing investment in buildings, equipment and houses.
The economy would slide into recession, and inflation would fall.
The Fed then lowered interest rates, investment would recover, and growth would resume.
The textbook model began to fray at the end of the 1960s. Economists thought low interest rates and budget deficits could permanently reduce unemployment in exchange for only a modest uptick in inflation. Instead, inflation accelerated, and the Fed induced several deep and painful recessions to get it back down.
By the late 1990s, new challenges emerged. One was at first a good thing. Inflation became both low and unusually stable, barely fluctuating in response to economic growth and unemployment.
The second change was less beneficial. Regular prices were more stable, but asset prices became less so. The recessions of 2001 and 2008 weren’t caused by the Fed raising rates. They resulted from a boom and bust in asset prices, first in technology stocks, then in house prices and mortgage debt.
After the last bust, the Fed kept interest rates near zero from 2008 until 2015. The central bank also purchased government bonds with newly created money—a new monetary tool dubbed quantitative easing—to push down long-term interest rates.
Despite such aggressive stimulus, economic growth has been slow. Unemployment has fallen to a 50-year low, but inflation has persistently run below the 2% target the Fed set. A similar situation prevails abroad.
In Japan, Britain and Germany, unemployment is down to historic lows. But despite short- and long-term interest rates near and sometimes below zero, growth has been muted. Since 2009, inflation has averaged 0.3% in Japan and 1.3% in the eurozone.
The textbook model of monetary policy is barely operating, and economists have spent the last decade puzzling why.
One explanation focuses on investment, the main driver of long-term economic growth. Investment is financed out of saving. When investment is high relative to saving, that pushes interest rates up because more people and businesses want to borrow. If saving is high relative to investment, that pushes rates down. That means structurally low investment coupled with high saving by businesses and aging households can explain both slow growth and low interest rates.
Richard Clarida, the Fed’s vice chairman, cited another reason during a speech in November. Investors in the past, he said, demanded an interest rate premium for the risk that inflation would turn out higher than they expected. Investors are now so confident central banks will keep inflation low that they don’t need that premium. Thus, central banks’ success at eradicating fear of inflation is partly responsible for the low rates that currently limit their power.
While the Fed's grip on growth and inflation may be slipping, it can still sway markets. Indeed, Mr. Dalio said, the central bank's principal lever for sustaining demand has been its ability to drive up asset prices as well as the debt used to finance assets, called leverage. Since the 2008 crisis, low rates and quantitative easing have elevated prices of stocks, private equity, corporate debt and real estate in many cities. As prices rise, their returns, such as a bond or dividend yield, decline.
That dynamic, he said, has reached its limit. Once returns have fallen close to the return on cash or its equivalent, such as Treasury bills, “there is no incentive to lend, or invest in these assets.” At that point, the Fed is no longer able to stimulate spending.
Less than zero
A central bank can always raise rates enough to slow growth in pursuit of lower inflation, but it can't always lower them enough to ensure faster growth and higher inflation.
The European Central Bank has tried—cutting interest rates to below zero, in effect charging savers. Its key rate went to minus 0.5% from minus 0.4% in September. At that meeting and since, resistance has grown inside the ECB to even more negative rates for fear that would reduce bank lending or have other side effects.
In December, Sweden’s central bank, which implemented negative rates in 2015, ended the experiment and returned its key policy rate to zero. Fed officials have all but ruled out ever implementing negative rates.
In a new research paper, Mr. Summers, who served as President Clinton’s Treasury secretary and President Obama’s top economic adviser, and Anna Stansbury, a Ph.D. student in economics at Harvard, say very low or negative rates are “at best only weakly effective…and at worst counterproductive.”
They cited several reasons why. Some households earn interest from bonds, money-market funds and bank deposits. If rates go negative, that source of purchasing power shrinks. Some people nearing retirement may save more to make up for the erosion of their principal by very low or negative rates.
Moreover, the economy has changed in ways that weaken its response to interest-rate cuts, they wrote. The economy’s two most interest-sensitive sectors, durable goods manufacturing, such as autos, and construction, fell to 10% of national output in 2018 from 20% in 1967, in part because America’s aging population spends less on houses and cars. Over the same period, financial and professional services, education and health care, all far less interest sensitive, grew to 47% from 26%.
They concluded the response of employment to interest rates has fallen by a third, meaning it is harder for the Fed to generate a boom.
The U.S. isn’t likely to plunge into another financial crisis like 2008, Mr. Dalio said, as long as interest rates remain near zero. Such low rates allow households and companies to easily refinance their debts.
More likely, he said, are shallow recessions and sluggish growth, similar to what Japan has experienced—what he called a “big sag.”
Former Fed chairman Ben Bernanke this month estimated that through quantitative easing and “forward guidance,” committing to keep interest rates low until certain conditions are met, the Fed could deliver the equivalent of 3 percentage points of rate cuts, enough, in addition to two to three points of regular rate cuts, to counteract most recessions.
Mr. Clarida warned, however, that quantitative easing may suffer from diminishing returns in the next recession. Moreover, the next recession is likely to be global, he said this month, and if all major countries weaken at the same time, it will push rates everywhere toward zero. That would make it harder for the Fed or any other central bank to support its own economy than if only one country were in trouble.
Fiscal fix
With central banks so constrained, economists say fiscal policy must become the primary remedy to recessions.
History shows that aggressive fiscal policy can raise growth, inflation and interest rates. The U.S. borrowed heavily in World War II. With help from the Fed, which bought some of the debt and kept rates low, the economy vaulted out of the Great Depression. Once wartime controls on prices and interest rates were lifted, both rose.
Today, mainstream academic economists are again recommending higher inflation and deficits to escape the low-growth, low-rate trap.
Advocates of what is called modern monetary theory say the Fed should create unlimited money to finance government deficits until full employment is reached. Some economists call for dialing up “automatic stabilizers,” the boost that federal spending gets during downturns, via payments to individuals and state governments as well as infrastructure investment.
Yet fiscal policy is decided not by economists but by elected officials who are more likely to be motivated by political priorities that conflict with the economy’s needs. In 2011, when unemployment was 9%, a Republican-controlled Congress forced Mr. Obama to agree to deficit cuts. In 2018, when unemployment had fallen to 4%, President Trump and the GOP-controlled Congress slashed taxes and boosted spending, sharply raising the budget deficit. Mr. Trump has pressured Mr. Powell to cut rates more and resume quantitative easing, which the Fed chairman has resisted.
Fiscal policy in the eurozone is hampered by rules that limit the debt and deficits of its member countries. It is also hamstrung by divergent interests: Germany, the country that can most easily borrow, needs it least. In recent years it has refused to open the taps to help out its neighbors.
Still, Mr. Dalio predicted that a weakened Fed will eventually join hands with the federal government to stimulate demand by directly financing deficits.
Once central banks have agreed to finance whatever deficits politicians wish to run, however, they may have trouble saying no when the need has passed.
The experience abroad and in the U.S.'s past suggests that once politicians are in charge of monetary policy, inflation often follows. In the 1960s and 1970s, Presidents Johnson and Nixon pressured the Fed against raising rates, setting the stage for the surge in inflation in the 1970s. Such a scenario seems remote today, but it may not always be.
Almost exactly one year has passed since Donald Trump declared, “I am a Tariff Man.” Uncharacteristically, he was telling the truth.
At this point I’ve lost count of how many times markets have rallied in the belief that Trump was winding down his trade war, only to face announcements that a much-anticipated deal wasn’t happening or that tariffs were being slapped on a new set of products or countries. Over the past week it happened again: Markets bet on an outbreak of trade peace between the U.S. and China, only to get body slammed by Trump’s declaration that there might be no deal before the election and by his new tariffs on Brazil and Argentina.
So Trump really is a Tariff Man. But why? After all, the results of his trade war have been consistently bad, both economically and politically.
I’ll offer an answer shortly. First, however, let’s talk about what the Trump trade war has actually accomplished.
A peculiar aspect of the Trump economy is that while overall growth has been solid, the areas of weakness have come precisely in those things Trump tried to stimulate.
Remember, Trump’s only major legislative accomplishment was a huge tax cut for corporations that was supposed to lead to a surge in investment. Instead, corporations pocketed the money, and business investment has been falling.
At the same time, his trade war was supposed to shrink the trade deficit and revive U.S. manufacturing. But the trade deficit has widened, and manufacturing output is shrinking.
The truth is that even economists who opposed Trump’s tax cuts and tariffs are surprised by how badly they’re working out. The most commonly given explanation for these bad results is that Trumpian tariff policy is creating a lot of uncertainty, which is giving businesses a strong incentive to postpone any plans they might have for building new factories and adding jobs.
It’s important to realize that Trumpian protectionism wasn’t a response to a groundswell of public opinion. As best as I can tell from the endless series of interviews with white guys in diners — who are, we all know, the only Americans who matter — these voters are driven more by animosity toward immigrants and the sense that snooty liberals look down on them than by trade policy.
And public opinion seems to have become far less protectionist even as Trump has raised tariffs, with the percentage of Americans saying that free trade agreements are a good thing as high as it’s ever been.
So Trump’s trade war is losing, not gaining, support. And one recent analysis finds that it was a factor hurting Republicans in the 2018 midterm elections, accounting for a significant number of lost congressional seats.
Nevertheless, Trump persists. Why?
One answer is that Trump has long had a fixation on the idea that tariffs are the answer to America’s problems, and he’s not the kind of man who reconsiders his prejudices in the light of evidence. But there’s also something else: U.S. trade law offers Trump more freedom of action — more ability to do whatever he wants — than any other policy area.
The basic story is that long ago — in fact, in the aftermath of the disastrous Smoot-Hawley tariff of 1930 — Congress deliberately limited its own role in trade policy. Instead, it gave the president the power to negotiate trade deals with other countries, which would then face up-or-down votes without amendments.
It was always clear, however, that this system needed some flexibility to respond to events. So the executive branch was given the power to impose temporary tariffs under certain conditions: import surges, threats to national security, unfair practices by foreign governments. The idea was that nonpartisan experts would determine whether and when these conditions existed, and the president would then decide whether to act.
This system worked well for many years. It turned out, however, to be extremely vulnerable to someone like Trump, for whom everything is partisan and expertise is a four-letter word. Trump’s tariff justifications have often been self-evidently absurd — seriously, who imagines that imports of Canadian steel threaten U.S. national security? But there’s no obvious way to stop him from imposing tariffs whenever he feels like it.
And there’s also no obvious way to stop his officials from granting individual businesses tariff exemptions, supposedly based on economic criteria but in fact as a reward for political support. Tariff policy isn’t the only arena in which Trump can practice crony capitalism — federal contracting is looking increasingly scandalous — but tariffs are especially ripe for exploitation.
So that’s why Trump is a Tariff Man: Tariffs let him exercise unconstrained power, rewarding his friends and punishing his enemies. Anyone imagining that he’s going to change his ways and start behaving responsibly is living in a fantasy world.