The First Iraq War Was Also Sold to the Public Based on a Pack of Lies

Polls suggest that Americans tend to differentiate between our “good war” in Iraq — “Operation Desert Storm,” launched by George HW Bush in January 1991 — and the “mistake” his son made in 2003.

Across the ideological spectrum, there’s broad agreement that the first Gulf War was “worth fighting.” The opposite is true of the 2003 invasion, and a big reason for those divergent views was captured in a 2013 CNN poll that found that “a majority of Americans (54%) say that prior to the start of the war the administration of George W. Bush deliberately misled the U.S. public about whether Baghdad had weapons of mass destruction.”

But as the usual suspects come out of the woodwork to urge the US to once again commit troops to Iraq, it’s important to recall that the first Gulf War was sold to the public on a pack of lies that were just as egregious as those told by the second Bush administration 12 years later.

The Lie of an Expansionist Iraq

Most countries condemned Iraq’s 1990 invasion of Kuwait. But the truth — that it was the culmination of a series of tangled economic and historical conflicts between two Arab oil states — wasn’t likely to sell the US public on the idea of sending our troops halfway around the world to do something about it.

So we were given a variation of the “domino theory.” Saddam Hussein, we were told, had designs on the entire Middle East. If he wasn’t halted in Kuwait, his troops would just keep going into other countries.

As Scott Peterson reported for The Christian Science Monitor in 2002, a key part of the first Bush administration’s case “was that an Iraqi juggernaut was also threatening to roll into Saudi Arabia. Citing top-secret satellite images, Pentagon officials estimated in mid-September [of 1990] that up to 250,000 Iraqi troops and 1,500 tanks stood on the border, threatening the key US oil supplier.”

A quarter of a million troops with heavy armor amassed on the Saudi border certainly seemed like a clear sign of hostile intent. In announcing that he had deployed troops to the Gulf in August 1990, George HW Bush said, “I took this action to assist the Saudi Arabian Government in the defense of its homeland.” He asked the American people for their “support in a decision I’ve made to stand up for what’s right and condemn what’s wrong, all in the cause of peace.”

But one reporter — Jean Heller of the St. Petersburg Times — wasn’t satisfied taking the administration’s claims at face value. She obtained two commercial satellite images of the area, taken at the very time American intelligence supposedly had found Saddam’s huge and menacing army, and found nothing there but empty desert.

She contacted the office of then-Secretary of Defense Dick Cheney “for evidence refuting the Times photos or analysis, offering to hold the story if proven wrong.” But “the official response” was: “Trust us.”

Heller later told the Monitor’s Scott Peterson that the Iraqi buildup on the border between Kuwait and Saudi Arabia “was the whole justification for Bush sending troops in there, and it just didn’t exist.”

Dead Babies, Courtesy of a New York PR Firm

Military occupations are always brutal, and Iraq’s six-month occupation of Kuwait was no exception. But because Americans didn’t have an abundance of affection for Kuwait, a case had to be built that the Iraqi army was guilty of nothing less than Nazi-level atrocities.

That’s where a hearing held by the Congressional Human Rights Caucus in October 1990 played a major role in making the case for war.

A young woman who gave only her first name, Nayirah, testified that she had been a volunteer at Kuwait’s al-Adan hospital, where she had seen Iraqi troops rip scores of babies out of incubators, leaving them “to die on the cold floor.” Between tears, she described the incident as “horrifying.”

Her account was a bombshell. Portions of her testimony were aired that evening on ABC’s “Nightline” and NBC’s “Nightly News.” Seven US senators cited her testimony in speeches urging Americans to support the war, and George HW Bush repeated the story on 10 separate occasions in the weeks that followed.

In 2002, Tom Regan wrote about his own family’s response to the story for The Christian Science Monitor:

I can still recall my brother Sean’s face. It was bright red. Furious. Not one given to fits of temper, Sean was in an uproar. He was a father, and he had just heard that Iraqi soldiers had taken scores of babies out of incubators in Kuwait City and left them to die. The Iraqis had shipped the incubators back to Baghdad. A pacifist by nature, my brother was not in a peaceful mood that day. “We’ve got to go and get Saddam Hussein. Now,” he said passionately.

Subsequent investigations by Amnesty International, a division of Human Rights Watch and independent journalists would show that the story was entirely bogus — a crucial piece of war propaganda the American media swallowed hook, line and sinker. Iraqi troops had looted Kuwaiti hospitals, but the gruesome image of babies dying on the floor was a fabrication.

In 1992, John MacArthur revealed in The New York Times that Nayirah was in fact the daughter of Saud Nasir al-Sabah, Kuwait’s ambassador to the US. Her testimony had been organized by a group called Citizens for a Free Kuwait, which was a front for the Kuwaiti government.

Tom Regan reported that Citizens for a Free Kuwait hired Hill & Knowlton, a New York-based PR firm that had previously spun for the tobacco industry and a number of governments with ugly human rights records. The company was paid “$10.7 million to devise a campaign to win American support for the war.” It was a natural fit, wrote Regan. “Craig Fuller, the firm’s president and COO, had been then-President George Bush’s chief of staff when the senior Bush had served as vice president under Ronald Reagan.”

According to Robin Andersen’s A Century of Media, a Century of War, Hill & Knowlton had spent $1 million on focus groups to determine how to get the American public behind the war, and found that focusing on “atrocities” was the most effective way to rally support for rescuing Kuwait.

Arthur Rowse reported for the Columbia Journalism Review that Hill & Knowlton sent out a video news release featuring Nayirah’s gripping testimony to 700 American television stations.

As Tom Regan noted, without the atrocities, the idea of committing American blood and treasure to save Kuwait just “wasn’t an easy sell.”

Only a few weeks before the invasion, Amnesty International accused the Kuwaiti government of jailing dozens of dissidents and torturing them without trial. In an effort to spruce up Kuwait’s image, the company organized a Kuwait Information Day on 20 college campuses and a national day of prayer for Kuwait, distributed thousands of “Free Kuwait” bumper stickers, and undertook other similar traditional PR ventures. But none of it was working very well. American public support remained lukewarm for the first two months.

That would change as stories about Saddam’s baby-killing troops were splashed across front pages across the country.

Saddam Was Irrational

Saddam Hussein’s 1990 invasion of Kuwait was just as illegal as the US invasion that would ultimately oust him 13 years later — it was neither an act of self-defense nor authorized by the UN Security Council.

But it can be argued that Iraq had significantly more justification for its attack.

Kuwait had been a close ally of Iraq, and a top financier of the Iraqi invasion of Iran in 1980, which, as The New York Times reported, occurred after “Iran’s revolutionary government tried to assassinate Iraqi officials, conducted repeated border raids and tried to topple Mr. Hussein by fomenting unrest within Iraq.”

Saddam Hussein felt that Kuwait should forgive part of his regime’s war debt because he had halted the “expansionist plans of Iranian interests” not only on behalf of his own country, but in defense of the other Gulf Arab states as well.

After an oil glut knocked out about two-thirds of the value of a barrel of crude oil between 1980 and 1986, Iraq appealed to OPEC to limit crude oil production in order to raise prices — with oil as low as $10 per barrel, the government was struggling to pay its debts. But Kuwait not only resisted those efforts — asking OPEC to increase its quota by 50 percent instead — for much of the 1980s it had also maintained its own production well above OPEC’s mandatory quota. According to a study by energy economist Mamdouh Salameh, “between 1985 and 1989, Iraq lost US$14 billion a year due to Kuwait’s oil price strategy,” and “Kuwait’s refusal to decrease its oil production was viewed by Iraq as an act of aggression against it.”

There were additional disputes between the two countries centering on Kuwait’s exploitation of the Rumaila oil fields, which straddled the border between the two countries. Kuwait was accused of using a technique known as “slant-drilling” to siphon off oil from the Iraqi side.

None of this justifies Iraq’s invasion of Kuwait. But a longstanding and complex dispute between two undemocratic petrostates wasn’t likely to inspire Americans to accept the loss of their sons and daughters in a distant fight.

So instead, George HW Bush told the public that Iraq’s invasion was “without provocation or warning,” and that “there is no justification whatsoever for this outrageous and brutal act of aggression.” He added: “Given the Iraqi government’s history of aggression against its own citizens as well as its neighbors, to assume Iraq will not attack again would be unwise and unrealistic.”

Ultimately, these longstanding disputes between Iraq and Kuwait got considerably less attention in the American media than did tales of Kuwaiti babies being ripped out of incubators by Saddam’s stormtroopers.

Saddam Was “Unstoppable”

A crucial diplomatic error on the part of the first Bush administration left Saddam Hussein with the impression that the US government had little interest in Iraq’s conflict with Kuwait. But that didn’t fit into the narrative that the Iraqi dictator was an irrational maniac bent on regional domination. So there was a concerted effort to deny that the US government had ever had a chance to deter his aggression through diplomatic means — and even to paint those who said otherwise as conspiracy theorists.

As John Mearsheimer from the University of Chicago and Harvard’s Stephen Walt wrote in 2003, “Saddam reportedly decided on war sometime in July 1990, but before sending his army into Kuwait, he approached the United States to find out how it would react.”

In a now famous interview with the Iraqi leader, U.S. Ambassador April Glaspie told Saddam, “[W]e have no opinion on the Arab-Arab conflicts, like your border disagreement with Kuwait.” The U.S. State Department had earlier told Saddam that Washington had “no special defense or security commitments to Kuwait.” The United States may not have intended to give Iraq a green light, but that is effectively what it did.

Exactly what was said during the meeting has been a source of some controversy. Accounts differ. According to a transcript released by the Iraqi government, Glaspie told Hussein, “I admire your extraordinary efforts to rebuild your country.”

I know you need funds. We understand that and our opinion is that you should have the opportunity to rebuild your country. But we have no opinion on the Arab-Arab conflicts, like your border disagreement with Kuwait.

I was in the American Embassy in Kuwait during the late 60’s. The instruction we had during this period was that we should express no opinion on this issue and that the issue is not associated with America. James Baker has directed our official spokesmen to emphasize this instruction.

Leslie Gelb of The New York Times reported that Glaspie told the Senate Foreign Relations Committee that the transcript was inaccurate “and insisted she had been tough.” But that account was contradicted when diplomatic cables between Baghdad and Washington were released. As Gelb described it, “The State Department instructed Ms. Glaspie to give the Iraqis a conciliatory message punctuated with a few indirect but significant warnings,” but “Ms. Glaspie apparently omitted the warnings and simply slobbered all over Saddam in their meeting on July 25, while the Iraqi dictator threatened Kuwait anew.”

There is no dispute about one crucially important point: Saddam Hussein consulted with the US before invading, and our ambassador chose not to draw a line in the sand, or even hint that the invasion might be grounds for the US to go to war.

The most generous interpretation is that each side badly misjudged the other. Hussein ordered the attack on Kuwait confident that the US would only issue verbal condemnations. As for Glaspie, she later told The New York Times, “Obviously, I didn’t think — and nobody else did — that the Iraqis were going to take all of Kuwait.”

Fool Me Once…

The first Gulf War was sold on a mountain of war propaganda. It took a campaign worthy of George Orwell to convince Americans that our erstwhile ally Saddam Hussein — whom the US had aided in his war with Iran as late as 1988 — had become an irrational monster by 1990.

Twelve years later, the second invasion of Iraq was premised on Hussein’s supposed cooperation with al Qaeda, vials of anthrax, yellowcake uranium from Niger and claims that Iraq had missiles poised to strike British territory in as little as 45 minutes.

Now, eleven years later, as Bill Moyers put it last week, “the very same armchair warriors in Washington who from the safety of their Beltway bunkers called for invading Baghdad, are demanding once again that America plunge into the sectarian wars of the Middle East.” It’s vital that we keep our history in Iraq in mind, and apply some healthy skepticism to the claims they offer us this time around.

Joshua Holland was a senior digital producer for BillMoyers.com and now writes for The Nation. He’s the author of The Fifteen Biggest Lies About the Economy (and Everything Else the Right Doesn’t Want You to Know about Taxes, Jobs and Corporate America) (Wiley: 2010), and host of Politics and Reality Radio. Follow him on Twitter: @JoshuaHol.

Coronavirus: A Theory of Incompetence

Leaders in the public and private sector in advanced economies, typically highly credentialed, have with very few exceptions shown abject incompetence in dealing with coronavirus as a pathogen and as a wrecker of economies. The US and UK have made particularly sorry showings, but they are not alone.

It’s become fashionable to blame the failure to have enough medical stockpiles and hospital beds and engage in aggressive enough testing and containment measures on capitalism. But as I will describe shortly, even though I am no fan of Anglosphere capitalism, I believe this focus misses the deeper roots of these failures.

After all, the country lauded for its response, South Korea, is capitalist. Similarly, reader vlade points out that the Czech Republic has had only 2 coronavirus deaths per million versus 263 for Italy. Among other things, the Czech Republic closed its borders in mid-March and made masks mandatory. Newscasters and public officials wear them to underscore that no one is exempt.

Even though there are plenty of examples of capitalism gone toxic, such as hospitals and Big Pharma sticking doggedly to their price-gouging ways or rampant production disruptions due to overly tightly-tuned supply chains, that isn’t an adequate explanation. Examples of government dereliction of duty also abound. In 2006, California’s Governor Arnold Schwarzenegger reacted to the avian flu by creating MASH on steroids. From the LA Times:

They were ready to roll whenever disaster struck California: three 200-bed mobile hospitals that could be deployed to the scene of a crisis on flatbed trucks and provide advanced medical care to the injured and sick within 72 hours.

Each hospital would be the size of a football field, with a surgery ward, intensive care unit and X-ray equipment. Medical response teams would also have access to a massive stockpile of emergency supplies: 50 million N95 respirators, 2,400 portable ventilators and kits to set up 21,000 additional patient beds wherever they were needed…

“In light of the pandemic flu risk, it is absolutely a critical investment,” he [Governor Schwarzenegger] told a news conference. “I’m not willing to gamble with the people’s safety.”

They were dismantled in 2011 by Governor Jerry Brown as part of post-crisis belt tightening.

For decades, the US has as a matter of policy tried to reduce the number of hospital beds, which among other things has led to the shuttering of hospitals, particularly in rural areas. Hero of the day, New York’s Governor Andrew Cuomo pursued this agenda with vigor, as did his predecessor George Pataki.

And even though Trump has made bad decision after bad decision, from eliminating the CDC’s pandemic unit to denying the severity of the crisis and refusing to use government powers to turbo-charge state and local medical responses, people better qualified than he is have also performed disastrously. America’s failure to test early and enough can be laid squarely at the feet of the CDC. As New York Magazine pointed out on March 12:

In a functional system, much of the preparation and messaging would have been undertaken by the CDC. In this case, it chose not to simply adopt the World Health Organization’s COVID-19 test kits — stockpiling them in the millions in the months we had between the first arrival of the coronavirus in China and its widespread appearance here — but to try to develop its own test. Why? It isn’t clear. But they bungled that project, too, failing to produce a reliable test and delaying the start of any comprehensive testing program by a few critical weeks.

The testing shortage is catastrophic: It means that no one knows how bad the outbreak already is, and that we couldn’t take effectively aggressive measures even if we wanted to. There are so few tests available, or so little capacity to run them, that they are being rationed for only the most obvious candidates, which practically defeats the purpose. It is not those who are very sick or who have traveled to existing hot spots abroad who are most critical to identify, but those less obvious, gray-area cases — people who may be carrying the disease around without much reason to expect they’re infecting others…Even those who are getting tested have to wait at least several days for results; in Senegal, where the per capita income is less than $3,000, they are getting results in four hours. Yesterday, apparently, the CDC conducted zero tests…

[O]ur distressingly inept response kept bringing to mind an essay by Umair Haque, first published in 2018 and prompted primarily by the opioid crisis, about the U.S. as the world’s first rich failed state.

And the Trump Administration has such difficulty shooting straight that it can’t even manage its priority of preserving the balance sheets of the well off. Its small business bailouts, which are as much about saving those enterprises as preserving their employment, are off to a shaky start. How many small and medium sized ventures can and will maintain payrolls out of available cash when they aren’t sure when and if Federal rescue money will hit their bank accounts?

How did the US, and quite a few other advanced economies, get into such a sorry state that we lack the operational capacity to engage in effective emergency responses? Look at what the US was able to do in the stone ages of the Great Depression. As Marshall Auerback wrote of the New Deal programs:

The government hired about 60 per cent of the unemployed in public works and conservation projects that

  • planted a billion trees,
  • saved the whooping crane,
  • modernized rural America, and
  • built such diverse projects as the Cathedral of Learning in Pittsburgh,
  • the Montana state capitol,
  • much of the Chicago lakefront,
  • New York’s Lincoln Tunnel and Triborough Bridge complex,
  • the Tennessee Valley Authority and
  • the aircraft carriers Enterprise and Yorktown. It also
  • built or renovated 2,500 hospitals,
  • 45,000 schools,
  • 13,000 parks and playgrounds,
  • 7,800 bridges,
  • 700,000 miles of roads, and
  • a thousand airfields. And it
  • employed 50,000 teachers,
  • rebuilt the country’s entire rural school system, and
  • hired 3,000 writers,
    • musicians,
    • sculptors and painters,
    • including Willem de Kooning and Jackson Pollock.

What are the deeper causes of our contemporary generalized inability to respond to large-scale threats? My top picks are a lack of respect for risk and the rise of symbol manipulation as the dominant means of managing in the private sector and government.

Risk? What Risk?

Thomas Hobbes argued that life apart from society would be “solitary, poor, nasty, brutish and short.” Outside poor countries and communities, advances in science and industrialization have largely proven him right.

It was not long ago, in historical terms, that even aristocrats would lose children to accidents and disease. Only four of Winston Churchill’s five children lived to be adults. Comparatively few women now die in childbirth.

But it isn’t just that better hygiene, antibiotics, and vaccines have helped reduce the scourges of youth. They have also reduced the consequences of bad fortune. Fewer soldiers are killed in wars. More are patched up, so fewer come back in coffins and more with prosthetics or PTSD. And those prosthetics, which enable the injured to regain some of their former function, also perversely shield ordinary citizens from the spectacle of lost limbs.1

Similarly, when someone is hit by a car or has a heart attack, as traumatic as the spectacle might be to onlookers, typically an ambulance arrives quickly and the victim is whisked away. Onlookers can tell themselves he’s in good hands and hope for the best.

With the decline in manufacturing, fewer people see or hear of industrial accidents, like the time a salesman in a paper mill in which my father worked stuck his hand in a digester and had his arm ripped off. And many of the victims of hazardous work environments suffer from ongoing exposures, such as to toxic chemicals or repetitive stress injuries, so the danger isn’t evident until it is too late.

Most also are oddly disconnected from the risks they routinely take, like riding in a car (I for one am pretty tense and vigilant when I drive on freeways, despite liking to speed as much as most Americans). Perhaps it is due in part to the illusion of being in control while driving.

Similarly, until the coronavirus crisis, even with America’s frayed social safety nets, most people, particularly the comfortably middle class and affluent, took comfort in appearances of normalcy and abundance. Stores are stocked with food. Unlike the oil crisis of the 1970s, there’s no worry about getting petrol at the pump. Malls may be emptying out and urban retail vacancies might be increasing, but that’s supposedly due to the march of Amazon, and not anything amiss with the economy. After all, unemployment is at record lows, right?

Those who do go to college in America get a plush experience. No thin mattresses or only adequately kept-up dorms, as in my day. The notion that kids, even of a certain class, have to rough it a bit, earn their way up and become established in their careers and financially, seems to have eroded. Quite a few go from pampered internships to fast-track jobs. In the remote era of my youth, even in the prestigious firms, new hires were subjected to at least a couple of years of grunt work.

So the class of people with steady jobs (these days, well-placed members of the professional managerial class, certain trades and those who chose low-risk employment with strong civil service protections) has also become somewhat to very removed from the risks endured when most people were subsistence farmers, or the small-town merchants who served them.

Consider this disconnect, based on an Axios-Ipsos survey:

The coronavirus is spreading a dangerous strain of inequality. Better-off Americans are still getting paid and are free to work from home, while the poor are either forced to risk going out to work or lose their jobs.

Generally speaking, the people who are positioned to be least affected by coronavirus are the most rattled. That is due to the gap between expectations and the new reality. Poor people have Bad Shit Happen on a regular basis. Wealthy people expect to be able to insulate themselves from most of it and then have it appear in predictable forms, like cheating spouses and costly divorces, bad investments (still supposedly manageable if you are diversified!), renegade children, and common ailments, like heart attacks and cancer, where the rich better the odds by advantaged access to care.

The super rich are now bunkered, belatedly realizing they can’t set up ICUs at home, and hiring guards to protect themselves from marauding hordes, yet uncertain that their mercenaries won’t turn on them.

The bigger point is that we’ve had a Minsky-like process operating on a society-wide basis: as daily risks have declined, most people have blinded themselves to what risk amounts to and where it might surface in particularly nasty forms. And the more affluent and educated classes, who disproportionately constitute our decision-makers, have generally been the most removed.

The proximity to risk goes a long way to explaining who has responded better. As many have pointed out, the countries that had meaningful experience with SARS2 had a much better idea of what they were up against with the coronavirus and took aggressive measures faster.

But how do you explain South Korea, which had only three cases of SARS and no deaths? It doesn’t appear to have had enough experience with SARS to have learned from it.

A related factor may be that developing economies have fresh memories of what life was like before they became affluent. I can’t speak for South Korea, but when I worked with the Japanese, people still remembered the “starving times” right after World War II. Japan was still a poor country in the 1960s.3 South Korea rose as an economic power after Japan. The Asian Tigers were also knocked back on their heels with the 1997 emerging markets crisis. And of course Seoul is in easy nuke range of North Korea. It’s the only country I ever visited, including Israel, where I went through a metal detector to enter and saw lots of soldiers carrying machine guns in the airport. So they likely have a keen appreciation of how bad bad can be.

The Rise and Rise of the Symbol Economy

Let me start with an observation by Peter Drucker that I read back in the 1980s, but will then redefine his take on “symbol economy,” because I believe the phenomenon has become much more pervasive than he envisioned.

A good recap comes in Fragile Finance: Debt, Speculation and Crisis in the Age of Global Credit by A. Nesvetailova:

The most significant transformation for Drucker was the changed relationship between the symbolic economy of capital movements, exchange rates, and credit flows, and the real economy of the flow of goods and services:

…in the world economy of today, the ‘real economy’ of goods and services and the ‘symbol economy’ of money, credit, and capital are no longer bound tightly to each other; they are indeed, moving further and further apart (1986: 783)

The rise of the financial sphere as the flywheel of the world economy, Drucker noted, is both the most visible and the least understood change of modern capitalism.

What Drucker may not have sufficiently appreciated was that money and capital flows are speculative and became more so over time. In their study of 800 years of financial crises, Carmen Reinhart and Ken Rogoff found that high levels of international capital flows were strongly correlated with more frequent and more severe financial crises. Claudio Borio and Piti Disyatat of the Bank for International Settlements found that on the eve of the 2008 crisis, international capital flows were 61 times as large as trade flows, meaning they were only trivially settling real economy transactions.

Now those factoids alone may seem to offer significant support to Drucker’s thesis. But I believe he conceived of it too narrowly. I believe that modeling techniques, above all, spreadsheet-based models, have removed decision-makers from the reality of their decisions. If they can make it work on paper, they believe it will work that way.

When I went to business school and started on Wall Street, financiers and business analysts did their analysis by hand, copying information from documents and performing computations with calculators. It was painful to generate financial forecasts, since one error meant that everything to the right was incorrect and had to be redone.

The effect was that when managers investigated major capital investments and acquisitions, they thought hard about the scenarios they wanted to consider since they could look at only a few. And if a model turned out an unfavorable-looking result, that would be hard to rationalize away, since a lot of energy had been devoted to setting it up.

By contrast, when PCs and VisiCalc hit the scene, it suddenly became easy to run lots of forecasts. No one had any big investment in any outcome. And spending so much time playing with financial models led most participants in a decision to see the model as real, when it was a menu, not a meal.

When readers speak with well-deserved contempt of MBA managers, the too-common belief that it is possible to run an operation, any operation, by the numbers appears to be a root cause. For over five years, we’ve been running articles from the Health Care Renewal blog decrying the rise of “generic managers” in hospital systems (who are typically also spectacularly overpaid) who proceed to grossly mismanage their operations yet still rake in the big bucks.

The UK version of this pathology is more extreme, because it marries managerial overconfidence with a predisposition among British elites to look at people who work hard as “must not be sharp.” But the broad outlines apply here. From Clive, on a Brexit post, when Brexit was the poster child of UK elite incompetence:

What’s struck me most about the UK government’s approach to the practical day-to-day aspects of Brexit is that it is exemplifying a typically British form of managerialism which bedevilles both public sector and private sector organisations. It manifests itself in all manner of guises but the main characteristic is that some “leader” issues impractical, unworkable, unachievable or contradictory instructions (or a “strategy”) to the lower ranks. These lower ranks have been encouraged to adopt the demeanour of yes-men (or yes-women). So you’re not allowed to question the merits of the ask. Everyone keeps quiet and takes the paycheck while waiting for the roof to fall in on them. It’s not like you’re on the breadline, so getting another year or so in isn’t a bad survival attitude. If you make a fuss now, you’ll likely be replaced by someone who, in the leadership’s eyes is a lot more can-do (but is in fact just either more naive or a better huckster).

Best illustrated perhaps by an example — I was asked a day or two ago to resolve an issue I’d reported using “imaginative” solutions. Now, I’ve got a vivid imagination, but even that would not be able to comply with two mutually contradictory aims at the same time (“don’t incur any costs for doing some work” and “do the work” — where because we’ve outsourced the supply of the services in question, we now get real, unhideable invoices which must be paid).

To the big cheeses, the problem is with the underlings not being sufficiently clever or inventive. The real problem is the dynamic they’ve created and their inability to perceive the changes (in the same way as swinging a wrecking ball is a “change”) they’ve wrought on an organisation.

May, Davies, Fox, the whole lousy lot of ’em are like the pilot in the Airplane movie — they’re pulling on the levers of power only to find they’re not actually connected to anything. Wait until they pull a little harder and the whole bloody thing comes off in their hands.

Americans typically do this sort of thing with a better look: the expectations are usually less obviously implausible, particularly if they might be presented to the wider world. One of the cancers of our society is the belief that any problem can be solved with better PR, another manifestation of symbol economy thinking.

I could elaborate further on how these attitudes have become common, such as the ability of companies to hide bad operating results and then come clean every so often as if it were an extraordinary event, short job tenures promoting “IBG/YBG” opportunism, and the use of lawyers as liability shields (for the execs, not the company, natch).

But it’s not hard to see how it was easy to rationalize away the risks of decisions like globalization. Why say no to what amounted to a transfer from direct factory labor to managers and execs? Offshoring and outsourcing were what sophisticated companies did. Wall Street liked them. They gave senior employees an excuse to fly abroad on the company dime. So what if the economic case was marginal? So what if the downside could be really bad? What Keynes said about banker herd mentality applies:

A sound banker, alas! is not one who foresees danger and avoids it, but one who, when he is ruined, is ruined in a conventional and orthodox way along with his fellows, so that no one can really blame him.

It’s not hard to see how a widespread societal disconnect of decision-makers from risk, particularly health-related risks, compounded with management by numbers as opposed to kicking the tires, would combine to produce a lax attitude toward operations in general.

I believe a third likely factor is poor governance practices, and those have gotten generally worse as organizations have grown in scale and scope. But there is more country-specific nuance here, and I can discuss only a few well, so adding this to my theory will have to hold for another day. But it isn’t hard to think of some in America. For instance, 40 years ago, there were more midsized companies, with headquarters in secondary cities like Dayton, Ohio. Executives living in and caring about their reputation in their communities served as a check on behavior.

Before you depict me as exaggerating about the change in posture toward risks, I recall reading policy articles from the 1960s in which officials wrung their hands about US dependence on strategic materials found only in unstable parts of Africa. That US would never have had China make its soldiers’ uniforms and boots, or serve as the source for 80+ percent of the active ingredients in its drugs. And America was most decidedly capitalist in the 1960s. So we need to look at how things have changed to explain changes in postures towards risk and notions of what competence amounts to.

_____
1 One of my early memories was seeing a one-legged man using a crutch, with the trouser of his missing leg pinned up. I pointed to him and said something to my parents and was firmly told never to do anything like that again.

2 The US did not learn much from its 33 cases. But the lack of fatalities may have contributed.

3 Japan has had a pretty lame coronavirus response, but that is the result of Japan’s strong and idiosyncratic culture. While Japanese are capable of taking action individually when they are isolated, in group settings no one wants to act, or even worse take responsibility, unless there is an accepted or established protocol.

Trump Pulls Back From Declaring a National Emergency to Fund a Wall

WASHINGTON — President Trump has stepped back from declaring a national emergency to pay for a border wall, under pressure from congressional Republicans, his own lawyers and advisers, who say using it as a way out of the government shutdown does not justify the precedent it would set and the legal questions it could raise.

“If today the national emergency is border security, tomorrow the national emergency might be climate change,” Senator Marco Rubio of Florida, one of the idea’s critics, said this week. Another Republican, Senator Mitt Romney of Utah, told an interviewer that declaring a national emergency should be reserved for “the most extreme circumstances.”

… “What we’re not looking to do right now is national emergency,” he told reporters gathered in the Cabinet Room as the shutdown approached its fourth week. Minutes later he contradicted himself, saying that he would declare a state of emergency if he had to.

… Instead, Mr. Trump would use his authority to transfer funds to the wall that were appropriated by Congress for other purposes. Toward that end, the Army Corps of Engineers has been directed to study whether it can divert about $13.9 billion in emergency aid set aside for Puerto Rico, Florida, Texas and California. And with the money secured, the president could drop his opposition to the appropriations bills whose passage would end the shutdown.

… Former White House aides, who noted that Mr. Trump did not focus on the wall during the first two years of his presidency, said the optics of fighting for the wall were more important to the president than erecting it.

… But opposition has come from many Republican quarters. Some conservatives see it as an unacceptable extension of executive power. Kellyanne Conway, a White House aide, has said it would essentially give Congress a pass. Representative Mike Simpson, Republican of Idaho and a member of the House Appropriations Committee, said it was not clear to him that an emergency declaration would even lead to the prompt reopening of the government.

*Bullshit Jobs: A Theory*

Overall he presents the five types of bullshit jobs as flunkies, goons, duct tapers, box tickers, and taskmasters, but he spends too much time trying to lower the status of these jobs and not enough time investigating what happens when those jobs go away.

… A simple experiment would vastly improve this book and make for a marvelous case study chapter: let him spend a year managing a mid-size organization, say 60-80 employees, but one which does not have an adequately staffed HR department, or perhaps does not have an HR department at all. Then let him report back to us.

At that point we’ll see who really has the bullshit job.