Leaders in the public and private sector in advanced economies, typically highly credentialed, have with very few exceptions shown abject incompetence in dealing with coronavirus as a pathogen and as a wrecker of economies. The US and UK have made particularly sorry showings, but they are not alone.
It’s become fashionable to blame the failure to have enough medical stockpiles and hospital beds and engage in aggressive enough testing and containment measures on capitalism. But as I will describe shortly, even though I am no fan of Anglosphere capitalism, I believe this focus misses the deeper roots of these failures.
After all, the country lauded for its response, South Korea, is capitalist. Similarly, reader vlade points out that the Czech Republic has had only 2 coronavirus deaths per million versus 263 for Italy. Among other things, the Czech Republic closed its borders in mid-March and made masks mandatory. Newscasters and public officials wear them to underscore that no one is exempt.
Even though there are plenty of examples of capitalism gone toxic, such as hospitals and Big Pharma sticking doggedly to their price-gouging ways or rampant production disruptions due to overly tightly-tuned supply chains, that isn’t an adequate explanation. Examples of government dereliction of duty also abound. In 2006, California’s Governor Arnold Schwarzenegger reacted to the avian flu by creating MASH on steroids. From the LA Times:
They were ready to roll whenever disaster struck California: three 200-bed mobile hospitals that could be deployed to the scene of a crisis on flatbed trucks and provide advanced medical care to the injured and sick within 72 hours.
Each hospital would be the size of a football field, with a surgery ward, intensive care unit and X-ray equipment. Medical response teams would also have access to a massive stockpile of emergency supplies: 50 million N95 respirators, 2,400 portable ventilators and kits to set up 21,000 additional patient beds wherever they were needed…
“In light of the pandemic flu risk, it is absolutely a critical investment,” he [Governor Schwarzenegger] told a news conference. “I’m not willing to gamble with the people’s safety.”
They were dismantled in 2011 by Governor Jerry Brown as part of post-crisis belt tightening.
The US for decades has as a matter of policy tried to reduce the number of hospital beds, which among other things has led to the shuttering of hospitals, particularly in rural areas. Hero of the day, New York’s Governor Andrew Cuomo, pursued this agenda with vigor, as did his predecessor George Pataki.
And even though Trump has made bad decision after bad decision, from eliminating the CDC’s pandemic unit to denying the severity of the crisis and refusing to use government powers to turbo-charge state and local medical responses, people better qualified than he is have also performed disastrously. America’s failure to test early and enough can be laid squarely at the feet of the CDC. As New York Magazine pointed out on March 12:
In a functional system, much of the preparation and messaging would have been undertaken by the CDC. In this case, it chose not to simply adopt the World Health Organization’s COVID-19 test kits — stockpiling them in the millions in the months we had between the first arrival of the coronavirus in China and its widespread appearance here — but to try to develop its own test. Why? It isn’t clear. But they bungled that project, too, failing to produce a reliable test and delaying the start of any comprehensive testing program by a few critical weeks.
The testing shortage is catastrophic: It means that no one knows how bad the outbreak already is, and that we couldn’t take effectively aggressive measures even if we wanted to. There are so few tests available, or so little capacity to run them, that they are being rationed for only the most obvious candidates, which practically defeats the purpose. It is not those who are very sick or who have traveled to existing hot spots abroad who are most critical to identify, but those less obvious, gray-area cases — people who may be carrying the disease around without much reason to expect they’re infecting others…Even those who are getting tested have to wait at least several days for results; in Senegal, where the per capita income is less than $3,000, they are getting results in four hours. Yesterday, apparently, the CDC conducted zero tests…
[O]ur distressingly inept response kept bringing to mind an essay by Umair Haque, first published in 2018 and prompted primarily by the opioid crisis, about the U.S. as the world’s first rich failed state.
And the Trump Administration has such difficulty shooting straight that it can’t even manage its priority of preserving the balance sheets of the well off. Its small business bailouts, which are as much about saving those enterprises as preserving their employment, are off to a shaky start. How many small and medium sized ventures can and will maintain payrolls out of available cash when they aren’t sure when and if Federal rescue money will hit their bank accounts?
How did the US, and quite a few other advanced economies, get into such a sorry state that we lack the operational capacity to engage in effective emergency responses? Look at what the US was able to do in the stone ages of the Great Depression. As Marshall Auerback wrote of the New Deal programs:
The government hired about 60 per cent of the unemployed in public works and conservation projects that:

- planted a billion trees,
- saved the whooping crane,
- modernized rural America, and
- built such diverse projects as the Cathedral of Learning in Pittsburgh, the Montana state capitol, much of the Chicago lakefront, New York’s Lincoln Tunnel and Triborough Bridge complex, the Tennessee Valley Authority, and the aircraft carriers Enterprise and Yorktown.

It also:

- built or renovated 2,500 hospitals, 45,000 schools, 13,000 parks and playgrounds, 7,800 bridges, 700,000 miles of roads, and a thousand airfields, and
- employed 50,000 teachers, rebuilt the country’s entire rural school system, and hired 3,000 writers, sculptors and painters, including Willem de Kooning and Jackson Pollock.
What are the deeper causes of our contemporary generalized inability to respond to large-scale threats? My top picks are a lack of respect for risk and the rise of symbol manipulation as the dominant means of managing in the private sector and government.
Risk? What Risk?
Thomas Hobbes argued that life apart from society would be “solitary, poor, nasty, brutish and short.” Outside poor countries and communities, advances in science and industrialization have largely proven him right.
It was not long ago, in historical terms, that even aristocrats would lose children to accidents and disease. Only four of Winston Churchill’s five offspring lived to be adults. Comparatively few women now die in childbirth.
But it isn’t just that better hygiene, antibiotics, and vaccines have helped reduce the scourges of youth. They have also reduced the consequences of bad fortune. Fewer soldiers are killed in wars. More are patched up, so fewer come back in coffins and more with prosthetics or PTSD. And those prosthetics, which enable the injured to regain some of their former function, also perversely shield ordinary citizens from the spectacle of lost limbs.1
Similarly, when someone is hit by a car or has a heart attack, as traumatic as the spectacle might be to onlookers, typically an ambulance arrives quickly and the victim is whisked away. Onlookers can tell themselves he’s in good hands and hope for the best.
With the decline in manufacturing, fewer people see or hear of industrial accidents, like the time a salesman in a paper mill in which my father worked stuck his hand in a digester and had his arm ripped off. And many of the victims of hazardous work environments suffer from ongoing exposures, such as to toxic chemicals or repetitive stress injuries, so the danger isn’t evident until it is too late.
Most also are oddly disconnected from the risks they routinely take, like riding in a car (I for one am pretty tense and vigilant when I drive on freeways, despite liking to speed as much as most Americans). Perhaps it is due in part to the illusion of being in control while driving.
Similarly, until the coronavirus crisis, even with America’s frayed social safety nets, most people, particularly the comfortably middle class and affluent, took comfort in appearances of normalcy and abundance. Stores are stocked with food. Unlike the oil crises of the 1970s, there’s no worry about getting petrol at the pump. Malls may be emptying out and urban retail vacancies might be increasing, but that’s supposedly due to the march of Amazon, and not anything amiss with the economy. After all, unemployment is at record lows, right?
Those who do go to college in America get a plush experience. No thin mattresses or only adequately kept-up dorms, as in my day. The notion that kids, even of a certain class, have to rough it a bit, earn their way up and become established in their careers and financially, seems to have eroded. Quite a few go from pampered internships to fast-track jobs. In the remote era of my youth, even in the prestigious firms, new hires were subjected to at least a couple of years of grunt work.
So the class of people with steady jobs (which these days are well-placed members of the professional managerial class, certain trades and those who chose low-risk employment with strong civil service protections) have also become somewhat to very removed from the risks endured when most people were subsistence farmers or small town merchants who served them.
Consider this disconnect, based on an Axios-Ipsos survey:
The coronavirus is spreading a dangerous strain of inequality. Better-off Americans are still getting paid and are free to work from home, while the poor are either forced to risk going out to work or lose their jobs.
Generally speaking, the people who are positioned to be least affected by coronavirus are the most rattled. That is due to the gap between expectations and the new reality. Poor people have Bad Shit Happen on a regular basis. Wealthy people expect to be able to insulate themselves from most of it and then have it appear in predictable forms, like cheating spouses and costly divorces, bad investments (still supposedly manageable if you are diversified!), renegade children, and common ailments, like heart attacks and cancer, where the rich better the odds by advantaged access to care.
The super rich are now bunkered, belatedly realizing they can’t set up ICUs at home, and hiring guards to protect themselves from marauding hordes, yet uncertain that their mercenaries won’t turn on them.
The bigger point is that we’ve had a Minsky-like process operating on a society-wide basis: as daily risks have declined, most people have blinded themselves to what risk amounts to and where it might surface in particularly nasty forms. And the more affluent and educated classes, who disproportionately constitute our decision-makers, have generally been the most removed.
The proximity to risk goes a long way to explaining who has responded better. As many have pointed out, the countries that had meaningful experience with SARS2 had a much better idea of what they were up against with the coronavirus and took aggressive measures faster.
But how do you explain South Korea, which had only three cases of SARS and no deaths? It doesn’t appear to have had enough experience with SARS to have learned from it.
A related factor may be that developing economies have fresh memories of what life was like before they became affluent. I can’t speak for South Korea, but when I worked with the Japanese, people still remembered the “starving times” right after World War II. Japan was still a poor country in the 1960s.3 South Korea rose as an economic power after Japan. The Asian Tigers were also knocked back on their heels with the 1997 emerging markets crisis. And of course Seoul is in easy nuke range of North Korea. It’s the only country I ever visited, including Israel, where I went through a metal detector to enter and saw lots of soldiers carrying machine guns in the airport. So they likely have a keen appreciation of how bad bad can be.
The Rise and Rise of the Symbol Economy
Let me start with an observation by Peter Drucker that I read back in the 1980s, but will then redefine his take on “symbol economy,” because I believe the phenomenon has become much more pervasive than he envisioned.
A good recap comes in Fragile Finance: Debt, Speculation and Crisis in the Age of Global Credit by A. Nesvetailova:
The most significant transformation for Drucker was the changed relationship between the symbolic economy of capital movements, exchange rates, and credit flows, and the real economy of the flow of goods and services:
…in the world economy of today, the ‘real economy’ of goods and services and the ‘symbol economy’ of money, credit, and capital are no longer bound tightly to each other; they are indeed, moving further and further apart (1986: 783)
The rise of the financial sphere as the flywheel of the world economy, Drucker noted, is both the most visible and the least understood change of modern capitalism.
What Drucker may not have sufficiently appreciated was that money and capital flows are speculative and have become more so over time. In their study of 800 years of financial crises, Carmen Reinhart and Ken Rogoff found that high levels of international capital flows were strongly correlated with more frequent and more severe financial crises. Claudio Borio and Piti Disyatat of the Bank for International Settlements found that on the eve of the 2008 crisis, international capital flows were 61 times as large as trade flows, meaning they were only trivially settling real economy transactions.
Now those factoids alone may seem to offer significant support to Drucker’s thesis. But I believe he conceived of it too narrowly. I believe that modeling techniques, above all, spreadsheet-based models, have removed decision-makers from the reality of their decisions. If they can make it work on paper, they believe it will work that way.
When I went to business school and started on Wall Street, financiers and business analysts did their analysis by hand, copying information from documents and performing computations with calculators. It was painful to generate financial forecasts, since one error meant that everything to the right was incorrect and had to be redone.
The effect was that when managers investigated major capital investments and acquisitions, they thought hard about the scenarios they wanted to consider since they could look at only a few. And if a model turned out an unfavorable-looking result, that would be hard to rationalize away, since a lot of energy had been devoted to setting it up.
By contrast, when PCs and Visicalc hit the scene, it suddenly became easy to run lots of forecasts. No one had any big investment in any outcome. And spending so much time playing with financial models led most participants to treat the model as real, when it was a menu, not a meal.
When readers speak with well-deserved contempt of MBA managers, the too-common belief that it is possible to run an operation, any operation, by numbers, appears to be a root cause. For over five years, we’ve been running articles from the Health Renewal Blog decrying the rise of “generic managers” in hospital systems (who are typically also spectacularly overpaid) who proceed to grossly mismanage their operations yet still rake in the big bucks.
The UK version of this pathology is more extreme, because it marries managerial overconfidence with a predisposition among British elites to regard people who work hard as “not sharp.” But the broad outlines apply here. From Clive, on a Brexit post, when Brexit was the poster child of UK elite incompetence:
What’s struck me most about the UK government’s approach to the practical day-to-day aspects of Brexit is that it is exemplifying a typically British form of managerialism which bedevils both public sector and private sector organisations. It manifests itself in all manner of guises but the main characteristic is that some “leader” issues impractical, unworkable, unachievable or contradictory instructions (or a “strategy”) to the lower ranks. These lower ranks have been encouraged to adopt the demeanour of yes-men (or yes-women). So you’re not allowed to question the merits of the ask. Everyone keeps quiet and takes the paycheck while waiting for the roof to fall in on them. It’s not like you’re on the breadline, so getting another year or so in isn’t a bad survival attitude. If you make a fuss now, you’ll likely be replaced by someone who, in the leadership’s eyes, is a lot more can-do (but is in fact just either more naive or a better huckster).
Best illustrated perhaps by an example — I was asked a day or two ago to resolve an issue I’d reported using “imaginative” solutions. Now, I’ve got a vivid imagination, but even that would not be able to comply with two mutually contradictory aims at the same time (“don’t incur any costs for doing some work” and “do the work” — where because we’ve outsourced the supply of the services in question, we now get real, unhideable invoices which must be paid).
To the big cheeses, the problem is with the underlings not being sufficiently clever or inventive. The real problem is the dynamic they’ve created and their inability to perceive the changes (in the same way as swinging a wrecking ball is a “change”) they’ve wrought on an organisation.
May, Davies, Fox, the whole lousy lot of ’em are like the pilot in the Airplane movie — they’re pulling on the levers of power only to find they’re not actually connected to anything. Wait until they pull a little harder and the whole bloody thing comes off in their hands.
Americans typically do this sort of thing with a better look: the expectations are usually less obviously implausible, particularly if they might be presented to the wider world. One of the cancers of our society is the belief that any problem can be solved with better PR, another manifestation of symbol economy thinking.
I could elaborate further on how these attitudes have become common, such as the ability of companies to hide bad operating results and then come clean every so often as if it were an extraordinary event, short job tenures promoting “IBG/YBG” opportunism, and the use of lawyers as liability shields (for the execs, not the company, natch).
But it’s not hard to see how it was easy to rationalize away the risks of decisions like globalization. Why say no to what amounted to a transfer from direct factory labor to managers and execs? Offshoring and outsourcing were what sophisticated companies did. Wall Street liked them. They gave senior employees an excuse to fly abroad on the company dime. So what if the economic case was marginal? So what if the downside could be really bad? What Keynes said about banker herd mentality applies:
A sound banker, alas! is not one who foresees danger and avoids it, but one who, when he is ruined, is ruined in a conventional and orthodox way along with his fellows, so that no one can really blame him.
It’s not hard to see how a widespread societal disconnect of decision-makers from risk, particularly health-related risks, compounded with management by numbers as opposed to kicking the tires, would combine to produce a lax attitude toward operations in general.
I believe a third likely factor is poor governance practices, and those have gotten generally worse as organizations have grown in scale and scope. But there is more country-specific nuance here, and I can discuss only a few well, so adding this to my theory will have to hold for another day. But it isn’t hard to think of some in America. For instance, 40 years ago, there were more midsized companies, with headquarters in secondary cities like Dayton, Ohio. Executives living in and caring about their reputation in their communities served as a check on behavior.
Before you depict me as exaggerating about the change in posture toward risks, I recall reading policy articles in the 1960s where officials wrung their hands about US dependence on strategic materials found only in unstable parts of Africa. The US of that era would never have had China make its soldiers’ uniforms and boots, or serve as the source for 80%+ of the active ingredients in its drugs. And America was most decidedly capitalist in the 1960s. So we need to look at how things have changed to explain changes in postures towards risk and notions of what competence amounts to.
1 One of my early memories was seeing a one-legged man using a crutch, with the trouser of his missing leg pinned up. I pointed to him and said something to my parents and was firmly told never to do anything like that again.
2 The US did not learn much from its 33 cases. But the lack of fatalities may have contributed.
3 Japan has had a pretty lame coronavirus response, but that is the result of Japan’s strong and idiosyncratic culture. While Japanese are capable of taking action individually when they are isolated, in group settings, no one wants to act or, even worse, take responsibility unless there is an accepted or established protocol.
Climate crises, like financial crises, will be damaging, unpredictable and almost impossible to avoid
Last year Australia’s central bank hoped that several interest-rate cuts would mark a turning point for its slowing economy. That was before the worst bushfires in Australia’s history hit tourism, consumer confidence and growth forecasts for this year. There is now a good chance the bank will cut interest rates again soon.
Welcome to a world in which climate change’s economic impact is no longer distant and imperceptible.
- Puerto Rico never fully recovered from Hurricane Maria in 2017.
- Extreme drought in California and poorly maintained utility power lines led to severe wildfires in 2018, the utility’s bankruptcy and blackouts last year.
Climate change can’t be directly blamed for any single extreme weather event, including Hurricane Maria, California’s wildfires or Australia’s bushfires. But it makes such events more likely. “They are starting to be more than tail events, they’re starting to affect economic outcomes,” Robert Kaplan, president of the Federal Reserve Bank of Dallas, told an economic conference earlier this month.
Climate crises in the next 30 years may resemble financial crises in recent decades:
- potentially quite destructive,
- largely unpredictable, and
- given the powerful underlying causes, almost impossible to avoid.
Climate has muscled to the top of business worries.
Every year, the World Economic Forum asks business, political, academic and nongovernmental leaders to rank the most probable and consequential risks, from cyberattacks to fiscal crises. This year, ahead of its annual meeting next week in Davos, Switzerland, climate-related risks took the five top spots in terms of probability, the first time a single issue had done so in the survey’s 14-year history.

[Chart: “The New Normal.” As global temperatures rise, extreme temperatures and environmental disasters become more common. Panels show summer temperatures for local regions in the northern hemisphere (standard deviation from 115-year average, 1961-1980 vs. 2011-2015) and worldwide extreme weather events, 1980-2015. Sources: McKinsey Global Institute (temperature distribution), JPMorgan (extreme weather events).]
Of course, economies have always been vulnerable to natural disasters. Before the modern industrial era, crop failures were a leading cause of recession. The monsoon season remains a key economic variable in India, and the Tohoku earthquake and tsunami in 2011 tipped Japan into recession.
And while estimates of climate’s economic impact are suffused with uncertainty, they don’t suggest any major economy will be pushed into recession, much less depression.
Studies reviewed by David Mackie of JPMorgan Chase suggest climate change could reduce global gross domestic product by 1% to 7% by 2100, assuming “business as usual” (i.e., absent policies to mitigate emissions of carbon dioxide). Given that the impact is spread out over 80 years, in which per capita incomes probably rise 300% to 400%, even larger climate change impacts would appear small, he said.
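The arithmetic behind that framing is easy to check. A minimal sketch, using the figures quoted above (300% to 400% income growth means incomes end at 4x to 5x today's level; climate change shaves 1% to 7% off GDP); the specific pairings of the numbers are our illustrative assumption, not Mackie's model:

```python
# Rough check of the JPMorgan framing above. The figures come from the
# article; combining them this way is our illustrative assumption.
growth_multiples = (4.0, 5.0)   # incomes up 300%-400% by 2100
climate_hits = (0.01, 0.07)     # GDP reduced 1%-7% by 2100

for g in growth_multiples:
    for hit in climate_hits:
        print(f"{g:.0f}x growth, {hit:.0%} climate hit -> "
              f"{g * (1 - hit):.2f}x today's income")
```

Even the worst pairing (4x growth, 7% hit) leaves incomes at roughly 3.7x today's level, which is why the aggregate drag looks small against the growth trend even as localized damage mounts.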
Aggregate changes in GDP, though, can be misleading. As global temperatures climb, the probability of extreme temperatures and events and the associated economic consequences should rise more.
This relationship is driven home in a study released Thursday by the McKinsey Global Institute. It estimated that “unusually hot summers” affected 15% of the Northern Hemisphere’s land surface in 2015, up from 0.2% before 1980.
McKinsey estimated that climate change made the European heat wave that killed 1,500 in France in 2019 ten times more likely, and the forest fires that devastated northern Alberta in 2016 up to six times more likely.
Looking ahead, assuming business as usual, McKinsey projected the probability of a 10% drop in wheat, corn, soybean and rice yields in any given year will rise from 6% now to 18% in 2050. Such a change wouldn’t cause food shortages but could cause prices to spike. The probability that a catastrophic cyclone disrupts semiconductor manufacturing in the western Pacific will double or quadruple by 2040. Such an event “could potentially lead to months of lost production for the directly affected companies,” McKinsey said. The probability of rain heavy enough to halt the mining in southeastern China of rare-earth elements, vital to many electronic devices, will rise from 2.5% now to 6% by 2050.
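Those probability shifts are easier to grasp as recurrence intervals: an annual probability p corresponds to an event expected roughly once every 1/p years. A minimal sketch using the McKinsey figures quoted above; the recurrence-interval framing is our addition, not McKinsey's:

```python
# Convert the annual probabilities quoted above into rough "once every N
# years" recurrence intervals (1/p). The probabilities are McKinsey's
# projections; the recurrence framing is our illustrative addition.
events = {
    ">=10% grain-yield drop (now)": 0.06,
    ">=10% grain-yield drop (2050)": 0.18,
    "rain halting SE China rare-earth mining (now)": 0.025,
    "rain halting SE China rare-earth mining (2050)": 0.06,
}

for name, p in events.items():
    print(f"{name}: about once every {1 / p:.0f} years")
```

On these numbers, a one-in-seventeen-year crop shock becomes closer to a one-in-six-year event by 2050.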
Such an exercise comes with plenty of caveats. The projections make no allowance for adaptation, though no doubt some outdoor activity will move indoors, some businesses will relocate from flood plains, and insurance will cushion the cost for many.
But adaptation goes only so far. Humans can’t survive prolonged high heat and humidity beyond certain thresholds. Those thresholds are rarely met now, but will be reached regularly in some regions by 2050.
Adaptation and insurance may be deemed too costly. “Underinsurance may grow worse as more extreme events unfold, because fewer people carry insurance for them,” McKinsey predicted.
Some on Wall Street are starting to treat climate change the way they regard financial crises. “Climate change is almost invariably the top issue that clients around the world raise with BlackRock,” Chairman and CEO Laurence Fink told chief executives this week in explaining why climate would be a key criterion in how BlackRock Inc. invests its $7 trillion of client money. For businesses, mandates—private or government-driven—pose a risk distinct from climate change itself. Car companies are now spending heavily to market electric vehicles with no assurance they will be profitable.
Some central bankers are also talking about climate risk the way they talk about financial crises. Christine Lagarde, the newly installed European Central Bank president, told European parliamentarians last fall, “At a minimum…[the ECB’s] macroeconomic models must incorporate the risk of climate change.”
Yet worrying about it isn’t the same as doing something about it. Unlike financial crises, neither Wall Street nor central bankers have the tools to alter the forces making climate crises more likely: rising carbon dioxide emissions and economic development in vulnerable regions. Only political leaders can—and it isn’t clear they will.
The Madrid climate summit in December “is the most recent example of countries failing to cooperate to create a global emissions trading regime,” Mr. Mackie said. “Most likely, business as usual will be the path that policy makers follow in the years ahead…[which] increases the likelihood that the costs of dealing with climate change will go up as action is delayed.”
Join Anand Giridharadas, author of Winners Take All, in conversation with Belfer Center Executive Director Aditi Kumar on the perils of philanthropy and policy in the hands of the global elite.
- Reputation Laundering
- Do enough to be considered OK to meet with a senator
- People are busy. They don’t have time to research connection between “Sackler Art Wing” and Oxycontin
- Only way Jeffrey Epstein could come back was through association with Harvard
- Gives Bill Gates more votes on Common Core than public
- Which University will be able to fund an Institute on Wealth Taxation versus Social Philanthropy?
32:59: …how many people are able to live a decent life? Housing cost in New York City is as cruel as Idaho. And so you look at something like the Fair Work Week movement, right, which is trying to go into communities and say a simple thing, but a thing that has a transformative effect on people’s lives: you can’t change people’s hours with, like, two days’ notice. Now, for many people in this room that may not be an issue that affects your career; you’re gonna be paid on salary. But if you’re paid by the hour, having your hours changed, moved around, cut… I was in a restaurant not long ago and saw this waitress crying because she was sent home early, three hours early, for like the third time that week, right, which means not getting paid, because there was not enough demand at the same shop. The company’s risk being put on her back. She’s gonna miss a bill. This is happening to millions of Americans now. The thing about a Fair Work Week law is you don’t need Mitch McConnell. Any city in America can pass this. Counties can pass it. States can pass it. Isn’t it interesting that we keep using a couple parts of government that are dysfunctional and absolve ourselves of the fact that in all these cities where liberals completely control everything, no one’s doing it, no one’s passing these Fair Work Week…
Oct.31 — Nassim Nicholas Taleb, scientific advisor at Universa Investments, discusses the factors causing global fragility, hidden liabilities in global markets, and what he sees as safe trades in the current market. He speaks with Bloomberg’s Erik Schatzker on “Bloomberg Markets.”
Washington (CNN): A vehicle accident that killed one cadet from the US Military Academy and injured 21 others Thursday is refocusing attention on a startling statistic: more American service members are dying during training exercises than in combat operations.

Between 2006 and 2018, 31.9% of active-duty military deaths were the result of accidents, according to a congressional report updated last month. By comparison, 16.3% of service members who died during that time were killed in action. And a large majority of those accidents occurred in circumstances unrelated to combat deployments.

“Since 2006 … a total of 16,652 active-duty personnel and mobilized reservists have died while serving in the US armed forces. Seventy-three percent of these casualties occurred under circumstances unrelated to war,” the report states.

It is a trend that has only seemed to pick up momentum of late, as noncombat deaths have exceeded the number of military members killed in action every year since 2015. In 2017, nearly four times as many service members died in training accidents as were killed in combat, according to a House Armed Services Committee report related to the National Defense Authorization Act for fiscal year 2019 — a key point highlighted by many lawmakers and military officials who argued for additional defense spending to help offset readiness issues that have compounded for years.

“In all, 21 servicemembers died in combat that year while 80 died as a result of noncombat training-related accidents,” the report said.

The military endured a high rate of training-related deaths again in 2018, and a spate of deadly noncombat military aircraft crashes prompted then-House Armed Services Chairman Mac Thornberry, R-Texas, to say the “readiness of the military is at a crisis point.”

“This crisis is not limited to military aviation,” Thornberry wrote in a 2018 report. “This past summer, the Navy lost 17 Sailors in separate collisions involving the USS McCain and the USS Fitzgerald. Navy investigators later found that both accidents were related to ongoing Navy readiness problems.”

One of the most recent accidents occurred in March, when two Marine pilots died in a helicopter crash near Yuma, Arizona, during a routine training exercise. While details on the cause of Thursday’s deadly accident remain unclear, the incident serves as yet another reminder of broader safety issues related to military training.

Lt. Gen. Darryl A. Williams, the superintendent of the US Military Academy at West Point, was unable to immediately provide details about the cause of the rollover vehicle accident, citing the ongoing investigation. He said the name of the deceased cadet would not be released until the family was notified.

“Today was a tragic day for the West Point community and our United States Army,” Williams said.
Barring Representatives Ilhan Omar and Rashida Tlaib shows weakness and intolerance, not strength.
There are not many traditions of decorum that President Trump has not trampled on since entering the White House. But to put at risk, so cynically, America’s special relationship with Israel solely to titillate the bigots in his base, to lean so crassly on a foreign leader to punish his own political adversaries, to demonstrate so foul a lack of respect for the most elemental democratic principles, is new territory even for him.
Though facing a difficult election next month for which he sorely needs the support of his fractured right-wing base, Prime Minister Benjamin Netanyahu was said to be leaning toward allowing Representatives Ilhan Omar of Minnesota and Rashida Tlaib of Michigan to travel through Israel “out of respect for the U.S. Congress and the great alliance between Israel and America,” as his ambassador to Washington, Ron Dermer, wisely said last month. But, on Thursday, Mr. Netanyahu cravenly bowed before the pressure from Mr. Trump.
“It would show great weakness if Israel allowed Rep. Omar and Rep. Tlaib to visit,” Mr. Trump tweeted on Thursday morning. “They hate Israel & all Jewish people, & there is nothing that can be said or done to change their minds.”

Sad, to borrow one of Mr. Trump’s favorite words. How sad that two leaders — each desperate to look tough to his own base — are risking a bipartisan relationship built between these two nations over generations. Only weak leaders would risk so much for a reward so negligible. To what end?
- To win a few political points against two of the newest members of Congress?
- To capture a few news cycles?
- To dial up the outrage machine just one more notch?
Confident leaders would never have risked so much for so little.
Though many American presidents have sought to influence Israeli decisions throughout the history of the Arab-Israeli conflict, they usually did so diplomatically — and to advance America’s interests. Mr. Trump, by contrast, leaned on Mr. Netanyahu as he would on one of his own appointees, in broad view, and in direct violation of what the president of the United States should be doing when democratically elected lawmakers are threatened with a blockade by an allied leader.
There can be, and has been, considerable debate over what the two congresswomen, the first two Muslim women elected to Congress and both sharp critics of the Israeli government, have said and done. They have supported the controversial Boycott, Divestment and Sanctions (B.D.S.) movement aimed at pressuring Israel into ending its occupation of the West Bank, a movement that some Jews have deemed to be anti-Semitic.
Yet, from the outset, Mr. Trump has pounced on the religion and background of the two congresswomen to fan racial divisions. Ms. Omar and Ms. Tlaib were two of the four congresswomen of color, along with Alexandria Ocasio-Cortez of New York and Ayanna Pressley of Massachusetts, who Mr. Trump said should “go back” to the countries they came from, giving rise to chants of “send her back” at a subsequent Trump political rally.
The visit Ms. Omar and Ms. Tlaib were contemplating was not to Israel proper, but to the West Bank, where they were to visit Hebron, Ramallah and Bethlehem, as well as Israeli-occupied East Jerusalem, on a trip co-sponsored by a Palestinian organization, Miftah, that promotes “global awareness and knowledge of Palestinian realities.” A visit was planned to the Al Aqsa Mosque, on what Israelis call the Temple Mount, an especially volatile site in the Israeli-Palestinian conflict. There is little question that their visit would have focused on Palestinian grievances over the Israeli occupation.
All that was clearly troublesome for Mr. Netanyahu, especially the support of the congresswomen for the B.D.S. movement. A relatively recent law allows the Israeli government to deny entry to supporters of the movement; it was this law that the government used to deny entry to the representatives.
In April the United States barred Omar Barghouti, one of the co-founders of the B.D.S. movement, from entering the country when he was scheduled to deliver a series of talks and attend his daughter’s wedding. Other American public figures have been detained by Israeli authorities, ostensibly because of their political views, including:
- the IfNotNow founder, Simone Zimmerman, who was held at the border;
- a B.D.S. advocate, Ariel Gold, who was denied entry to the country; and
- the journalist Peter Beinart, who was held at the airport.

Mr. Netanyahu later called Mr. Beinart’s detention a “mistake.”
Yet contrary to Mr. Trump’s tweet, it is blocking entry by two American legislators who are critics of Israel that shows great weakness, especially after Israel hosted visits by delegations of 31 Republican and 41 Democratic lawmakers this month.
It has long been Israel’s mantra that critics of its policies should come see for themselves, and the country is certainly strong enough to handle any criticism from two members of Congress. Mr. Trump has done Israel no favor.
For decades, the freedom of monetary policymakers to make difficult decisions without having to worry about political blowback has proven indispensable to macroeconomic stability. But now, central bankers must ease monetary policies in response to populist mistakes for which they themselves will be blamed.

CHICAGO – Central-bank independence is back in the news. In the United States, President Donald Trump has been berating the Federal Reserve for keeping interest rates too high, and has reportedly explored the possibility of forcing out Fed Chair Jerome Powell. In Turkey, President Recep Tayyip Erdoğan has fired the central-bank governor. The new governor is now pursuing sharp rate cuts. And these are hardly the only examples of populist governments setting their sights on central banks in recent months.
In theory, central-bank independence means that monetary policymakers have the freedom to make unpopular but necessary decisions, particularly when it comes to combating inflation and financial excesses, because they do not have to stand for election. When faced with such decisions, elected officials will always be tempted to adopt a softer response, regardless of the longer-term costs. To avoid this, they have handed over the task of intervening directly in monetary and financial matters to central bankers, who have the discretion to meet goals set by the political establishment however they choose.
This arrangement gives investors more confidence in a country’s monetary and financial stability, and they will reward it (and its political establishment) by accepting lower interest rates for its debt. In theory, the country thus will live happily ever after, with low inflation and financial-sector stability.
Having proved effective in many countries starting in the 1980s, central-bank independence became a mantra for policymakers in the 1990s. Central bankers were held in high esteem, and their utterances, though often elliptical or even incomprehensible, were treated with deep reverence. Fearing a recurrence of the high inflation of the early 1980s, politicians gave monetary policymakers wide leeway, and scarcely ever talked about their actions publicly.
But now, three developments seem to have shattered this entente in developed countries. The first development was the 2008 global financial crisis, which suggested that central banks had been asleep at the wheel. Although central bankers managed to create an even more powerful aura around themselves by marshaling a forceful response to the crisis, politicians have since come to resent sharing the stage with these unelected saviors.
Second, since the crisis, central banks have repeatedly fallen short of their inflation targets. While this may suggest that they could have done more to boost growth, in reality they don’t have the means to pursue much additional monetary easing, even using unconventional tools. Any hint of further easing seems to encourage financial risk-taking more than real investment. Central bankers have thus become hostages of the aura they helped to conjure. When the public believes that monetary policymakers have superpowers, politicians will ask why those powers aren’t being used to fulfill their mandates.
Third, in recent years many central banks changed their communication approach, shifting from Delphic utterances to a policy of full transparency. But since the crisis, many of their public forecasts of growth and inflation have missed the mark. That these might have been the best estimates at the time convinces no one. That they were wrong is all that matters. This has left them triply damned in the eyes of politicians:
- they failed to prevent the financial crisis and paid no price;
- they are failing now to meet their mandate; and
- they seem to know no more than the rest of us about the economy.
It is no surprise that populist leaders would be among the most incensed at central banks. Populists believe they have a mandate from “the people” to wrest control of institutions from the “elites,” and there is nothing more elite than pointy-headed PhD economists speaking in jargon and meeting periodically behind closed doors in places like Basel, Switzerland. For a populist leader who fears that a recession might derail his agenda and tarnish his own image of infallibility, the central bank is the perfect scapegoat.
Markets seem curiously benign in the face of these attacks. In the past, they would have reacted by pushing up interest rates. But investors seem to have concluded that the deflationary consequences of the policy uncertainty created by the unorthodox and unpredictable actions of populist administrations far outweigh any damage done to central bank independence. So they want central banks to respond as the populist leader desires, not to support their “awesome” policies, but to offset their adverse consequences.
A central bank’s mandate requires it to ease monetary policy when growth is flagging, even when the government’s own policies are the problem. Though the central bank is still autonomous, it effectively becomes a dependent follower. In such cases, it may even encourage the government to undertake riskier policies on the assumption that the central bank will bail out the economy as needed. Worse, populist leaders may mistakenly believe the central bank can do more to rescue the economy from their policy mistakes than it actually can deliver. Such misunderstandings could be deeply problematic for the economy.
Furthermore, central bankers are not immune to public attack. They know that an adverse image hurts central bank credibility as well as its ability to recruit and act in the future. Knowing that they are being set up to take the fall in case the economy falters, it would be only human for central bankers to buy extra insurance against that eventuality. In the past, the cost would have been higher inflation over the medium term; today, it is more likely that the cost will be more future financial instability. This possibility, of course, will tend to depress market interest rates further rather than elevating them.
What can central bankers do? Above all, they need to explain their role to the public and why it is about more than simply moving interest rates up or down on a whim. Powell has been transparent in his press conferences and speeches, as well as honest about central bankers’ own uncertainties regarding the economy. Shattering the mystique surrounding central banking could open it to attack in the short run, but will pay off in the long run. The sooner the public understands that central bankers are ordinary people doing a difficult job with limited tools under trying circumstances, the less it will expect monetary policy magically to correct elected politicians’ errors. Under current conditions, that may be the best form of independence central bankers can hope for.