KIRIL SOKOLOFF: Well, coming back to David Hume’s famous paper, which you’ve referred to a number of times. His conclusion was that governments should essentially run surpluses because there’s always a crisis or problem coming. If you look at corporations coming into this with the worst balance sheets in history, with profits having been flat since 2012, and 40% of Americans barely able to get $500 in an emergency, it seems to me that one of the implications as we come out of this is going to be that we need to be better prepared for the future. Do you think that’s the way we’re going to go? That we will say, look, we got a real wakeup call here, we need to have money for a rainy day?
LACY HUNT: I think that’s the case for the private sector, but I don’t feel that that’s the case for the government sector. Remember, net national saving combines the private, government, and net foreign sectors. Private sector saving was fine: 8.5%, not bad. The problem was that we had 6.5% government dissaving, and so we only ended up with 2%. The real problem is that you have to have positive net saving across the private, government, and net foreign sectors, and the government’s actions, even though they may be well-intended, and may be very popular–
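Hunt’s arithmetic here is the standard sectoral saving identity: the sectors’ saving rates simply sum to net national saving. A minimal sketch of that sum using his figures (the foreign-sector term defaults to zero below purely for illustration; it is not a figure from the interview):

```python
# Net national saving is the sum of private, government, and net
# foreign sector saving, each expressed as a percent of national income.
def net_national_saving(private_pct, government_pct, foreign_pct=0.0):
    return private_pct + government_pct + foreign_pct

# Hunt's figures: 8.5% private saving and 6.5% government DISsaving
# (entered as a negative number). Foreign sector omitted for simplicity.
print(net_national_saving(8.5, -6.5))  # 2.0
```

The point of the identity is that a large government deficit must be offset by private or foreign saving before any net saving remains to finance investment.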
All of these measures that were taken to deal with the pandemic were very popular. And they were essential. They were humane; they had to be done. But it would have been far better if we’d been following Hume’s advice, which is that you run surpluses so that when you have an emergency, you have the resources to meet it. And by the way, Adam Smith in the Wealth of Nations followed Hume and made the same recommendation. Of course, no one remembers that anymore. No one reads Hume and Smith.
KIRIL SOKOLOFF: To get to positive net national saving, presumably that’s– I’m not saying it’s a governmental objective, but it should be a national objective in some form, based on your point that in order to have money to invest, you’ve got to be able to save out of income. If government is going to keep on running deficits at, pick a number, 15-plus percent of GDP, then in order to have a positive net saving rate, that implies a very significant private saving rate.
LACY HUNT: Unachievable. And let’s say that we bring back a lot of our operations, in other words repatriate them. If that happens, then the current account deficit will start to shrink, but the current account is always the mirror of the capital account. When you have a current account deficit, you have a capital account surplus, and the capital account surplus is net foreign saving. This is one of the problems. If you repatriate businesses and you shrink the current account deficit, then you’re going to shrink net foreign saving. It’s quite possible that we will have two adverse saving movements: one from the increased level of government dissaving, and also less positive foreign saving. We import a lot of saving from the rest of the world.
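The mirror-image relationship Hunt describes (current account deficit equals capital account surplus, which is net foreign saving) can be sketched the same way. The deficit figures below are illustrative, not from the interview:

```python
# By accounting identity, the capital account surplus equals the
# current account deficit, and that surplus is the saving the country
# imports from abroad. Figures here are illustrative only.
def net_foreign_saving(current_account_pct):
    # A current account deficit (a negative number) implies an equal
    # and opposite inflow of foreign saving.
    return -current_account_pct

before_repatriation = net_foreign_saving(-3.0)  # 3%-of-GDP CA deficit
after_repatriation = net_foreign_saving(-1.0)   # deficit shrinks
print(before_repatriation, after_repatriation)  # 3.0 1.0
```

Shrinking the current account deficit thus shrinks the imported saving available to offset government dissaving, which is exactly the squeeze Hunt warns about.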
KIRIL SOKOLOFF: Given the output gap, and not having the savings to invest, is that better than it might otherwise have been? Because we’ve got five, six, seven years to fill that output gap. Or is the transformative event the easiest way out, if that’s the right word? Is that dependent on having the capital and savings to invest? Does that mean that in order to have that transformative new economy, we’re going to have to have positive net national saving? Am I taking that–
LACY HUNT: The thing about those transformative events is that they often create a surge in income, and thus in saving, such that they finance themselves. That’s why I use the term transformative. It cannot be evolutionary. Let’s think about the situation that we were in in the late 1920s and early 1930s. We take on a great deal of debt in the ’20s and ’30s. We struggle throughout the 1930s. When Germany invades Poland, we still have an unemployment rate of 17% or 18%. We’ve come off the peak levels, but we still have a very high unemployment rate and we have a substantial number of people underemployed.
In other words, we really made very little progress toward turning the economy around. We stabilized things, and we ended the worst aspects of the Great Depression. Then World War II comes along. A lot of folks, including the great J.M. Keynes, believed that it was the deficit spending of World War II that solved the problems of the Great Depression. That’s not my reading. It is true that on a national income accounts basis, we ran deficits of 14%, 15%, which by my calculation is what we’re now running again on a national product accounts basis.
However, we had two other events that occurred simultaneously. Number one,
- we had a surge in our exports, and
- we had mandatory rationing.
If you wanted to buy 10 pounds of sugar, you couldn’t. You could buy one maybe, you had a ration, and so people were paid to produce exports and to produce military goods. The private saving rate went up to 25% of net national income. We were able to cover the federal budget dissaving and we paid off the debts of the 1920s and 1930s. When World War II ended, Keynes suggested if we didn’t continue running the budget deficits of World War II, we would go back into the Great Depression, but that didn’t happen.
The budget was basically balanced all the way through to the early 1960s. We ran small deficits and small surpluses, but on balance it was close to balanced. We opened up our export markets, we opened up markets to the US, we financed the reconstruction of Europe and Japan, and we had a tremendous resurgence. If you look at McKinsey’s 24 cases of overindebted economies and how they got out of it by austerity, one of their cases is the US during World War II. They labeled it a fortuitous circumstance. We didn’t go into World War II to get rid of the debt problem of the 1920s and ’30s.
It was something that happened as a result of the policy mix, but the folks in America were okay with the austerity because it was a great national endeavor. I don’t think that they would be willing to do that today. Everyone is relying on more government activity to solve the problem, not realizing that that’s the source of our deteriorating rate of economic performance.
KIRIL SOKOLOFF: Well, on that note, it’s been a really instructive discussion.
Leaders in the public and private sector in advanced economies, typically highly credentialed, have with very few exceptions shown abject incompetence in dealing with coronavirus as a pathogen and as a wrecker of economies. The US and UK have made particularly sorry showings, but they are not alone.
It’s become fashionable to blame the failure to have enough medical stockpiles and hospital beds and engage in aggressive enough testing and containment measures on capitalism. But as I will describe shortly, even though I am no fan of Anglosphere capitalism, I believe this focus misses the deeper roots of these failures.
After all, the country lauded for its response, South Korea, is capitalist. Similarly, reader vlade points out that the Czech Republic has had only 2 coronavirus deaths per million versus 263 for Italy. Among other things, the Czech Republic closed its borders in mid-March and made masks mandatory. Newscasters and public officials wear them to underscore that no one is exempt.
Even though there are plenty of examples of capitalism gone toxic, such as hospitals and Big Pharma sticking doggedly to their price-gouging ways, or rampant production disruptions due to overly tightly tuned supply chains, that isn’t an adequate explanation. Government dereliction of duty also abounds. In 2006, California’s Governor Arnold Schwarzenegger reacted to the avian flu by creating MASH on steroids. From the LA Times:
They were ready to roll whenever disaster struck California: three 200-bed mobile hospitals that could be deployed to the scene of a crisis on flatbed trucks and provide advanced medical care to the injured and sick within 72 hours.
Each hospital would be the size of a football field, with a surgery ward, intensive care unit and X-ray equipment. Medical response teams would also have access to a massive stockpile of emergency supplies: 50 million N95 respirators, 2,400 portable ventilators and kits to set up 21,000 additional patient beds wherever they were needed…
“In light of the pandemic flu risk, it is absolutely a critical investment,” he [Governor Schwarzenegger] told a news conference. “I’m not willing to gamble with the people’s safety.”
They were dismantled in 2011 by Governor Jerry Brown as part of post-crisis belt tightening.
The US for decades has as a matter of policy tried to reduce the number of hospital beds, which among other things has led to the shuttering of hospitals, particularly in rural areas. Hero of the day, New York’s Governor Andrew Cuomo pursued this agenda with vigor, as did his predecessor George Pataki.
And even though Trump has made bad decision after bad decision, from eliminating the CDC’s pandemic unit to denying the severity of the crisis and refusing to use government powers to turbo-charge state and local medical responses, people better qualified than he is have also performed disastrously. America’s failure to test early and enough can be laid squarely at the feet of the CDC. As New York Magazine pointed out on March 12:
In a functional system, much of the preparation and messaging would have been undertaken by the CDC. In this case, it chose not to simply adopt the World Health Organization’s COVID-19 test kits — stockpiling them in the millions in the months we had between the first arrival of the coronavirus in China and its widespread appearance here — but to try to develop its own test. Why? It isn’t clear. But they bungled that project, too, failing to produce a reliable test and delaying the start of any comprehensive testing program by a few critical weeks.
The testing shortage is catastrophic: It means that no one knows how bad the outbreak already is, and that we couldn’t take effectively aggressive measures even if we wanted to. There are so few tests available, or so little capacity to run them, that they are being rationed for only the most obvious candidates, which practically defeats the purpose. It is not those who are very sick or who have traveled to existing hot spots abroad who are most critical to identify, but those less obvious, gray-area cases — people who may be carrying the disease around without much reason to expect they’re infecting others…Even those who are getting tested have to wait at least several days for results; in Senegal, where the per capita income is less than $3,000, they are getting results in four hours. Yesterday, apparently, the CDC conducted zero tests…
[O]ur distressingly inept response kept bringing to mind an essay by Umair Haque, first published in 2018 and prompted primarily by the opioid crisis, about the U.S. as the world’s first rich failed state.
And the Trump Administration has such difficulty shooting straight that it can’t even manage its priority of preserving the balance sheets of the well off. Its small business bailouts, which are as much about saving those enterprises as preserving their employment, are off to a shaky start. How many small and medium sized ventures can and will maintain payrolls out of available cash when they aren’t sure when and if Federal rescue money will hit their bank accounts?
How did the US, and quite a few other advanced economies, get into such a sorry state that we lack the operational capacity to engage in effective emergency responses? Look at what the US was able to do in the stone ages of the Great Depression. As Marshall Auerback wrote of the New Deal programs:
The government hired about 60 per cent of the unemployed in public works and conservation projects that
- planted a billion trees,
- saved the whooping crane,
- modernized rural America, and
- built such diverse projects as the Cathedral of Learning in Pittsburgh,
- the Montana state capitol,
- much of the Chicago lakefront,
- New York’s Lincoln Tunnel and Triborough Bridge complex,
- the Tennessee Valley Authority and
- the aircraft carriers Enterprise and Yorktown. It also
- built or renovated 2,500 hospitals,
- 45,000 schools,
- 13,000 parks and playgrounds,
- 7,800 bridges,
- 700,000 miles of roads, and
- a thousand airfields. And it
- employed 50,000 teachers,
- rebuilt the country’s entire rural school system, and
- hired 3,000 writers,
- sculptors and painters,
- including Willem de Kooning and Jackson Pollock.
What are the deeper causes of our contemporary generalized inability to respond to large-scale threats? My top picks are a lack of respect for risk and the rise of symbol manipulation as the dominant means of managing in the private sector and government.
Risk? What Risk?
Thomas Hobbes argued that life apart from society would be “solitary, poor, nasty, brutish and short.” Outside poor countries and communities, advances in science and industrialization have largely proven him right.
It was not long ago, in historical terms, that even aristocrats would lose children to accidents and disease. Only four of Winston Churchill’s five offspring lived to be adults. Comparatively few women now die in childbirth.
But it isn’t just that better hygiene, antibiotics, and vaccines have helped reduce the scourges of youth. They have also reduced the consequences of bad fortune. Fewer soldiers are killed in wars. More are patched up, so fewer come back in coffins and more with prosthetics or PTSD. And those prosthetics, which enable the injured to regain some of their former function, also perversely shield ordinary citizens from the spectacle of lost limbs.1
Similarly, when someone is hit by a car or has a heart attack, as traumatic as the spectacle might be to onlookers, typically an ambulance arrives quickly and the victim is whisked away. Onlookers can tell themselves he’s in good hands and hope for the best.
With the decline in manufacturing, fewer people see or hear of industrial accidents, like the time a salesman at a paper mill where my father worked stuck his hand in a digester and had his arm ripped off. And many of the victims of hazardous work environments suffer from ongoing exposures, such as to toxic chemicals or repetitive stress injuries, so the danger isn’t evident until it is too late.
Most also are oddly disconnected from the risks they routinely take, like riding in a car (I for one am pretty tense and vigilant when I drive on freeways, despite liking to speed as much as most Americans). Perhaps it is due in part to the illusion of being in control while driving.
Similarly, until the coronavirus crisis, even with America’s frayed social safety nets, most people, particularly the comfortably middle class and affluent, took comfort in appearances of normalcy and abundance. Stores are stocked with food. Unlike the oil crises of the 1970s, there’s no worry about getting petrol at the pump. Malls may be emptying out and urban retail vacancies might be increasing, but that’s supposedly due to the march of Amazon, and not anything amiss with the economy. After all, unemployment is at record lows, right?
Those who do go to college in America get a plush experience. No thin mattresses or only adequately kept-up dorms, as in my day. The notion that kids, even of a certain class, have to rough it a bit, earn their way up and become established in their careers and financially, seems to have eroded. Quite a few go from pampered internships to fast-track jobs. In the remote era of my youth, even in the prestigious firms, new hires were subjected to at least a couple of years of grunt work.
So the class of people with steady jobs (which these days are well-placed members of the professional managerial class, certain trades and those who chose low-risk employment with strong civil service protections) have also become somewhat to very removed from the risks endured when most people were subsistence farmers or small town merchants who served them.
Consider this disconnect, based on an Axios-Ipsos survey:
The coronavirus is spreading a dangerous strain of inequality. Better-off Americans are still getting paid and are free to work from home, while the poor are either forced to risk going out to work or lose their jobs.
Generally speaking, the people who are positioned to be least affected by coronavirus are the most rattled. That is due to the gap between expectations and the new reality. Poor people have Bad Shit Happen on a regular basis. Wealthy people expect to be able to insulate themselves from most of it and then have it appear in predictable forms, like cheating spouses and costly divorces, bad investments (still supposedly manageable if you are diversified!), renegade children, and common ailments, like heart attacks and cancer, where the rich better the odds by advantaged access to care.
The super rich are now bunkered, belatedly realizing they can’t set up ICUs at home, and hiring guards to protect themselves from marauding hordes, yet uncertain that their mercenaries won’t turn on them.
The bigger point is that we’ve had a Minsky-like process operating on a society-wide basis: as daily risks have declined, most people have blinded themselves to what risk amounts to and where it might surface in particularly nasty forms. And the more affluent and educated classes, who disproportionately constitute our decision-makers, have generally been the most removed.
The proximity to risk goes a long way to explaining who has responded better. As many have pointed out, the countries that had meaningful experience with SARS2 had a much better idea of what they were up against with the coronavirus and took aggressive measures faster.
But how do you explain South Korea, which had only three cases of SARS and no deaths? It doesn’t appear to have had enough experience with SARS to have learned from it.
A related factor may be that developing economies have fresh memories of what life was like before they became affluent. I can’t speak for South Korea, but when I worked with the Japanese, people still remembered the “starving times” right after World War II. Japan was still a poor country in the 1960s.3 South Korea rose as an economic power after Japan. The Asian Tigers were also knocked back on their heels with the 1997 emerging markets crisis. And of course Seoul is in easy nuke range of North Korea. It’s the only country I ever visited, including Israel, where I went through a metal detector to enter and saw lots of soldiers carrying machine guns in the airport. So they likely have a keen appreciation of how bad bad can be.
The Rise and Rise of the Symbol Economy
Let me start with an observation by Peter Drucker that I read back in the 1980s, but will then redefine his take on “symbol economy,” because I believe the phenomenon has become much more pervasive than he envisioned.
A good recap comes in Fragile Finance: Debt, Speculation and Crisis in the Age of Global Credit by A. Nesvetailova:
The most significant transformation for Drucker was the changed relationship between the symbolic economy of capital movements, exchange rates, and credit flows, and the real economy of the flow of goods and services:
…in the world economy of today, the ‘real economy’ of goods and services and the ‘symbol economy’ of money, credit, and capital are no longer bound tightly to each other; they are indeed, moving further and further apart (1986: 783)
The rise of the financial sphere as the flywheel of the world economy, Drucker noted, is both the most visible and the least understood change of modern capitalism.
What Drucker may not have sufficiently appreciated was that money and capital flows are speculative and have become more so over time. In their study of 800 years of financial crises, Carmen Reinhart and Ken Rogoff found that high levels of international capital flows were strongly correlated with more frequent and more severe financial crises. Claudio Borio and Piti Disyatat of the Bank for International Settlements found that on the eve of the 2008 crisis, international capital flows were 61 times as large as trade flows, meaning they were only trivially settling real economy transactions.
Now those factoids alone may seem to offer significant support to Drucker’s thesis. But I believe he conceived of it too narrowly. I believe that modeling techniques, above all, spreadsheet-based models, have removed decision-makers from the reality of their decisions. If they can make it work on paper, they believe it will work that way.
When I went to business school and started on Wall Street, financiers and business analysts did their analysis by hand, copying information from documents and performing computations with calculators. It was painful to generate financial forecasts, since one error meant that everything to the right was incorrect and had to be redone.
The effect was that when managers investigated major capital investments and acquisitions, they thought hard about the scenarios they wanted to consider since they could look at only a few. And if a model turned out an unfavorable-looking result, that would be hard to rationalize away, since a lot of energy had been devoted to setting it up.
By contrast, when PCs and VisiCalc hit the scene, it suddenly became easy to run lots of forecasts. No one had any big investment in any outcome. And spending so much time playing with financial models led most participants to see the model as real, when it was a menu, not a meal.
When readers speak with well-deserved contempt of MBA managers, the too-common belief that it is possible to run an operation, any operation, by the numbers appears to be a root cause. For over five years, we’ve been running articles from the Health Care Renewal blog decrying the rise of “generic managers” in hospital systems (who are typically also spectacularly overpaid), who proceed to grossly mismanage their operations yet still rake in the big bucks.
The UK version of this pathology is more extreme, because it marries managerial overconfidence with a predisposition among British elites to regard people who work hard as ones who “must not be sharp.” But the broad outlines apply here. From Clive, on a Brexit post, when Brexit was the poster child of UK elite incompetence:
What’s struck me most about the UK government’s approach to the practical day-to-day aspects of Brexit is that it is exemplifying a typically British form of managerialism which bedevils both public sector and private sector organisations. It manifests itself in all manner of guises but the main characteristic is that some “leader” issues impractical, unworkable, unachievable or contradictory instructions (or a “strategy”) to the lower ranks. These lower ranks have been encouraged to adopt the demeanour of yes-men (or yes-women). So you’re not allowed to question the merits of the ask. Everyone keeps quiet and takes the paycheck while waiting for the roof to fall in on them. It’s not like you’re on the breadline, so getting another year or so in isn’t a bad survival attitude. If you make a fuss now, you’ll likely be replaced by someone who, in the leadership’s eyes is a lot more can-do (but is in fact just either more naive or a better huckster).
Best illustrated perhaps by an example — I was asked a day or two ago to resolve an issue I’d reported using “imaginative” solutions. Now, I’ve got a vivid imagination, but even that would not be able to comply with two mutually contradictory aims at the same time (“don’t incur any costs for doing some work” and “do the work” — where because we’ve outsourced the supply of the services in question, we now get real, unhideable invoices which must be paid).
To the big cheeses, the problem is with the underlings not being sufficiently clever or inventive. The real problem is the dynamic they’ve created and their inability to perceive the changes (in the same way as swinging a wrecking ball is a “change”) they’ve wrought on an organisation.
May, Davies, Fox, the whole lousy lot of ’em are like the pilot in the Airplane movie — they’re pulling on the levers of power only to find they’re not actually connected to anything. Wait until they pull a little harder and the whole bloody thing comes off in their hands.
Americans typically do this sort of thing with a better look: the expectations are usually less obviously implausible, particularly if they might be presented to the wider world. One of the cancers of our society is the belief that any problem can be solved with better PR, another manifestation of symbol economy thinking.
I could elaborate further on how these attitudes have become common, such as the ability of companies to hide bad operating results and then come clean every so often as if it were an extraordinary event, short job tenures promoting “IBG/YBG” opportunism, and the use of lawyers as liability shields (for the execs, not the company, natch).
But it’s not hard to see how it was easy to rationalize away the risks of decisions like globalization. Why say no to what amounted to a transfer from direct factory labor to managers and execs? Offshoring and outsourcing were what sophisticated companies did. Wall Street liked them. They gave senior employees an excuse to fly abroad on the company dime. So what if the economic case was marginal? So what if the downside could be really bad? What Keynes said about banker herd mentality applies:
A sound banker, alas! is not one who foresees danger and avoids it, but one who, when he is ruined, is ruined in a conventional and orthodox way along with his fellows, so that no one can really blame him.
It’s not hard to see how a widespread societal disconnect of decision-makers from risk, particularly health-related risks, compounded with management by the numbers as opposed to kicking the tires, would combine to produce a lax attitude toward operations in general.
I believe a third likely factor is poor governance practices, which have gotten generally worse as organizations have grown in scale and scope. But there is more country-specific nuance here, and I can discuss only a few cases well, so adding this to my theory will have to wait for another day. Still, it isn’t hard to think of examples in America. For instance, 40 years ago, there were more midsized companies, with headquarters in secondary cities like Dayton, Ohio. Executives living in and caring about their reputations in their communities served as a check on behavior.
Before you depict me as exaggerating about the change in posture toward risk, I recall reading policy articles in the 1960s where officials wrung their hands about US dependence on strategic materials found only in unstable parts of Africa. That US would never have had China make its soldiers’ uniforms and boots, or serve as the source for 80%+ of the active ingredients in its drugs. And America was most decidedly capitalist in the 1960s. So we need to look at how things have changed to explain changes in postures toward risk and notions of what competence amounts to.
1 One of my early memories was seeing a one-legged man using a crutch, with the trouser of his missing leg pinned up. I pointed to him and said something to my parents and was firmly told never to do anything like that again.
2 The US did not learn much from its 33 cases. But the lack of fatalities may have contributed.
3 Japan has had a pretty lame coronavirus response, but that is the result of Japan’s strong and idiosyncratic culture. While Japanese are capable of taking action individually when they are isolated, in group settings no one wants to act, or even worse take responsibility, unless there is an accepted or established protocol.
The prelude to all of that was the 1930s, when the nation’s intellectuals first grappled with the meaning and significance of Russia’s revolution. And it was in this decade that Ayn Rand came to political consciousness, reworking her opposition to Soviet Communism into a powerful defense of the individual…
…The Great Depression had cast its dark shadow over the American dream.
…In this moment, Soviet Russia stood out to the nation’s thinking class as a sign of hope. Communism, it was believed, had helped Russia avoid the worst ravages of the crash.
…Rand had taken for granted there would be “pinks” in America, but she hadn’t known they would matter, certainly not in New York City, one of the literary capitals of the world.
…a drama that would shape American thought and politics for the rest of the century: a bitter love triangle between Communists, ex-Communists and anti-Communists.
…another Soviet inheritance: agitprop novels, dedicated to showcasing heroic individualists and entrepreneurs.