The Doctor Versus the Denier

Anthony Fauci’s at the pool, but Donald Trump’s in deep.

Never mind Johnny Depp and Amber Heard.

You want to see a real can’t-look-away train wreck of a relationship? Look to the nation’s capital, where a messy falling out is chronicled everywhere from the tabloids to a glossy fashion magazine, replete with a photo shoot by a swimming pool.

The saga has enough betrayal, backstabbing, recrimination, indignation and ostracization to impress Edith Wharton.

The press breathlessly covers how much time has passed since the pair last spoke, whether they’re headed for splitsville, and if they can ever agree on what’s best for the children.

It was always bound to be tempestuous because they are the ultimate odd couple, the doctor and the president.

  • One is a champion of truth and facts. The other is a master of deceit and denial.
  • One is highly disciplined, working 18-hour days. The other can’t be bothered to do his homework and golfs instead.
  • One is driven by science and the public good. The other is a public menace, driven by greed and ego.
  • One is a Washington institution. The other was sent here to destroy Washington institutions.
  • One is incorruptible. The other corrupts.
  • One is apolitical. The other politicizes everything he touches — toilets, windows, beans and, most fatally, masks.

After a fractious week, when the former reality-show star in the White House retweeted a former game-show host saying that we shouldn’t trust doctors about Covid-19, Donald Trump and Anthony Fauci are gritting their teeth.

What’s so scary is that the bumpy course of their relationship has life-or-death consequences for Americans.

Who could even dream up a scenario where a president and a White House drop oppo research on the esteemed scientist charged with keeping us safe in a worsening pandemic?

The administration acted as if Peter Navarro, Trump’s wacko-bird trade adviser, had gone rogue when he assailed Dr. Fauci as Dr. Wrong in a USA Today op-ed. But does anyone believe that? And if he had, would he still have his job?

No doubt it was a case of Trump murmuring: Will no one rid me of this meddlesome infectious disease specialist?

Republicans on Capitol Hill privately confessed they were baffled by the whole thing, saying they couldn’t understand why Trump would undermine Fauci, especially now with the virus resurgent. They think it’s not only hurting Trump’s re-election chances, but theirs, too.

As though it couldn’t get more absurd, Kellyanne Conway told Fox News on Friday that she thinks it would help Trump’s poll numbers for him to start giving public briefings on the virus again — even though that exercise went off the rails when the president began suggesting people inject themselves with bleach.

“How did we get to a situation in our country where the public health official most known for honesty and hard work is the most vilified for it?” marvels Michael Specter, a science writer for The New Yorker who began covering Fauci during the AIDS crisis. “And as Team Trump trashes him, the numbers keep horrifyingly proving him right.”

When Dr. Fauci began treating AIDS patients, nearly every one of them died. “It was the darkest time of my life,” he told Specter. In an open letter, Larry Kramer called Fauci a “murderer.”

Then, as Specter writes, he started listening to activists and made a rare admission: His approach wasn’t working. He threw caution to the wind and became a public-health activist. Thanks to rigorous research and a commitment to clinical studies, the death rate from AIDS has plummeted over the years.

Now Fauci struggles to drive the data bus as the White House throws nails under his tires. It seems emblematic of a deeper, existential problem: America has lost its can-do spirit. We were always Bugs Bunny, faster, smarter, more wily than everybody else. Now we’re Slugs Bunny.

Can our country be any more pathetic than this: The Georgia governor suing the Atlanta mayor and City Council to block their mandate for city residents to wear masks?

Trump promised the A team, but he has surrounded himself with losers and kiss-ups and second-raters. Just your basic Ayn Rand nightmare.

Certainly, Dr. Fauci has had to adjust some of his early positions as he learned about this confounding virus. (“When the facts change, I change my mind. What do you do, sir?” John Maynard Keynes wisely observed.)

“Medicine is not an exact art,” says Jerome Groopman, the best-selling author and professor at Harvard Medical School. “There’s lots of uncertainty, always evolving information, much room for doubt. The most dangerous people are the ones who speak with total authority and no room for error.”

Sound like someone you know?

“Medical schools,” Dr. Groopman continued, “have curricula now to teach students the imperative of admitting when something went wrong, taking responsibility, and committing to righting it.”

Some are saying the 79-year-old Dr. Fauci should say to hell with it and quit. But we need his voice of reason in this nuthouse of a White House.

Despite Dr. Fauci’s best efforts to stay apolitical, he has been sucked into the demented political kaleidoscope through which we view everything now. Consider the shoot by his pool, photographed by Frankie Alduino, for a digital cover story by Norah O’Donnell for InStyle magazine.

From the left, the picture represented an unflappable hero, exhausted and desperately in need of some R & R, chilling poolside, not letting the White House’s slime campaign get him down or silence him. And on the right, some saw a liberal media darling, high on his own supply in the midst of a deadly pandemic. “While America burns, Fauci does fashion mag photo shoots,” tweeted Sean Davis, co-founder of the right-wing website The Federalist.

It’s no coincidence that the QAnon-adjacent cultists on the right began circulating a new conspiracy theory in the fever swamps of Facebook that Dr. Fauci’s wife of three and a half decades, a bioethicist, is Ghislaine Maxwell’s sister. (Do I need to tell you she isn’t?)

Worryingly, new polls show that the smear from Trumpworld may be starting to stick; fewer Republicans trust the doctor now than in the spring.

Forget Mueller, Sessions, Comey, Canada, his niece, Mika Brzezinski. Of the many quarrels, scrapes and scraps Trump has instigated in his time in office, surely this will be remembered not only as the most needless and perverse, but as the most dangerous.

As Dr. Fauci told The Atlantic, it’s “a bit bizarre.”

More than a bit, actually.

What Social Distancing Looked Like in 1666

Humanity has been surviving plagues for thousands of years, and we have managed to learn a lot along the way.

A lot of English people believed 1666 would be the year of the apocalypse. You can’t really blame them. In late spring 1665, bubonic plague began to eat away at London’s population. By fall, roughly 7,000 people were dying every week in the city. The plague lasted through most of 1666, ultimately killing about 100,000 people in London alone — and possibly as many as three-quarters of a million in England as a whole.

Perhaps the greatest chronicler of the Great Plague was Samuel Pepys, a well-connected English administrator and politician who kept a detailed personal diary during London’s darkest years. He reported stumbling across corpses in the street, and anxiously reading the weekly death tolls posted in public squares.

In August of 1665, Pepys described walking to Greenwich, “in my way seeing a coffin with a dead body therein, dead of the plague, lying in [a field] belonging to Coome farme, which was carried out last night, and the parish have not appointed any body to bury it, but only set a watch there day and night, that nobody should go thither or come thence, which is a most cruel thing.” To ensure that no one — not even the family of the dead person — would go near the corpse or bury it, the parish had stationed a guard. “This disease making us more cruel to one another than if we are doggs.”

It felt like Armageddon. And yet it was also the beginning of a scientific renaissance in England, when doctors experimented with quarantines, sterilization and social distancing. For those of us living through these stay-at-home days of Covid-19, it’s useful to look back and see how much has changed — and how much hasn’t. Humanity has been guarding against plagues and surviving them for thousands of years, and we have managed to learn a lot along the way.

When a plague hit England during the summer of 1665, it was a time of tremendous political turmoil. The nation was deep into the Second Anglo-Dutch War, a nasty naval conflict that had torpedoed the British economy. But there were deeper sources of internal political conflict. Just five years earlier in 1660, King Charles II had wrested back control of the government from the Puritan members of Parliament led by Oliver Cromwell.

Though Cromwell had died in 1658, the king had him exhumed, his corpse put in chains and tried for treason. After the inevitable guilty verdict, the king’s henchmen mounted Cromwell’s severed head on a 20-foot spike over Westminster Hall, along with the heads of two co-conspirators. Cromwell’s rotting head stayed there, gazing at London, throughout the plague and for many years after.

War and social upheaval hastened the spread of the plague, which had broken out several years earlier in Holland. But when he wasn’t displaying the severed heads of his enemies, the king was invested in scientific progress. He sanctioned the founding of the Royal Society of London for Improving Natural Knowledge, a venerable scientific institution known today as the Royal Society.

It was most likely thanks to his interest in science that government representatives and doctors quickly turned to social distancing methods to contain the spread of bubonic plague. In 1666, Charles II issued a formal order halting all public gatherings, including funerals. Theaters in London had already been shut down, and licensing was curtailed for new pubs. Oxford and Cambridge closed.

Isaac Newton was one of the students sent home, and his family was among the wealthy who fled the cities so they could shelter in place at their country homes. He spent the plague year at his family estate, teasing out the foundational ideas for calculus.

Things were less cozy in London. Quarantining was invented during the first wave of bubonic plague in the 14th century, but it was deployed more systematically during the Great Plague. Public servants called searchers ferreted out new cases of plague, and quarantined sick people along with everyone who shared their homes. People called warders painted a red cross on the doors of quarantined homes, alongside a paper notice that read “LORD HAVE MERCY UPON US.” (Yes, the all-caps was mandatory.)

The government supplied food to the housebound. After 40 days, warders painted over the red crosses with white crosses, ordering residents to sterilize their homes with lime. Doctors believed that the bubonic plague was caused by “smells” in the air, so cleaning was always recommended. They had no idea that it was also a good way to get rid of the rats and fleas that actually spread the contagion.

Of course, not everyone was compliant. Legal documents at the U.K. National Archives show that in April 1665, Charles II ordered severe punishment for a group of people who took the cross and paper off their door “in a riotious manner,” so they could “goe abroad into the street promiscuously, with others.” It’s reminiscent of all those modern Americans who went to the beaches in Florida over spring break, despite what public health experts told them.

Pepys was a believer in science, and he tried to follow the most cutting-edge advice from his doctor friends. This included smoking tobacco as a precautionary measure, because smoke and fire would purify the “bad air.” In June of 1665, as the plague began, Pepys described seeing red crosses on doors for the first time. “It put me into an ill conception of myself and my smell,” he writes, “so that I was forced to buy some roll-tobacco to smell and chaw, which took away the apprehension.”

Quack medicine will always be with us. But there was some good advice, too. During the Great Plague, shopkeepers asked customers to drop their coins in dishes of vinegar to sterilize them, a 1600s version of hand sanitizer.

Just as some American politicians blame the Chinese for the coronavirus, there were 17th-century Brits who blamed the Dutch for spreading the plague. Others blamed Londoners. Pepys had relocated his family to a country home in Woolwich, and writes in his diary that the locals “are afeard of London, being doubtfull of anything that comes from thence, or that hath lately been there … I was forced to say that I lived wholly at Woolwich.”

By late 1666, the plague had begun its retreat from England, but one disaster led to another. In autumn, the Great Fire of London destroyed the city’s downtown in a weeklong conflagration. The damage was so extensive in part because city officials were slow to respond, having already spent over a year dealing with plague. The fire left 70,000 Londoners homeless, angry and threatening to riot.

While the mayor of London issued orders to evacuate the city, Pepys had more pedestrian concerns: He wrote about helping a friend dig a pit in his garden, where the two men buried “my Parmazan cheese, as well as my wine and some other things.” Even in the middle of a civilization-shaking event, people will still hoard odd things, like toilet paper — or cheese.

Despite the war, the plague and the fire, London survived. Urbanites rebuilt relatively quickly, using the same basic street layout. In 1667, Pepys was bustling around the healing city, putting his rooms back in order and turning his thoughts to new developments in politics.

Pepys survived. Scholars are still not sure whether he ever retrieved his cheese.

The surprising science of alpha males | Frans de Waal

In this fascinating look at the “alpha male,” primatologist Frans de Waal explores the privileges and costs of power while drawing surprising parallels between how humans and primates choose their leaders. His research reveals some of the unexpected capacities of alpha males — generosity, empathy, even peacekeeping — and sheds light on the power struggles of human politicians. “Someone who is big and strong and intimidates and insults everyone is not necessarily an alpha male,” de Waal says.

After Neoliberalism

For the past 40 years, the United States and other advanced economies have been pursuing a free-market agenda of low taxes, deregulation, and cuts to social programs. There can no longer be any doubt that this approach has failed spectacularly; the only question is what will – and should – come next.

The neoliberal experiment – lower taxes on the rich, deregulation of labor and product markets, financialization, and globalization – has been a spectacular failure. Growth is lower than it was in the quarter-century after World War II, and most of it has accrued to the very top of the income scale. After decades of stagnant or even falling incomes for those below them, neoliberalism must be pronounced dead and buried.
Vying to succeed it are at least three major political alternatives:
  1. far-right nationalism,
  2. center-left reformism, and the
  3. progressive left (with the center-right representing the neoliberal failure).

And yet, with the exception of the progressive left, these alternatives remain beholden to some form of the ideology that has (or should have) expired.

The center-left, for example, represents neoliberalism with a human face. Its goal is to bring the policies of former US President Bill Clinton and former British Prime Minister Tony Blair into the twenty-first century, making only slight revisions to the prevailing modes of financialization and globalization. Meanwhile, the nationalist right disowns globalization, blaming migrants and foreigners for all of today’s problems. Yet as Donald Trump’s presidency has shown, it is no less committed – at least in its American variant – to tax cuts for the rich, deregulation, and shrinking or eliminating social programs.

By contrast, the third camp advocates what I call progressive capitalism, which prescribes a radically different economic agenda, based on four priorities.

  1. The first priority is to restore the balance between markets, the state, and civil society. Slow economic growth, rising inequality, financial instability, and environmental degradation are problems born of the market, and thus cannot and will not be overcome by the market on its own. Governments have a duty to limit and shape markets through environmental, health, occupational-safety, and other types of regulation. It is also the government’s job to do what the market cannot or will not do, like actively investing in basic research, technology, education, and the health of its constituents.
  2. The second priority is to recognize that the “wealth of nations” is the result of scientific inquiry – learning about the world around us – and social organization that allows large groups of people to work together for the common good. Markets still have a crucial role to play in facilitating social cooperation, but they serve this purpose only if they are governed by the rule of law and subject to democratic checks. Otherwise, individuals can get rich by exploiting others, extracting wealth through rent-seeking rather than creating wealth through genuine ingenuity. Many of today’s wealthy took the exploitation route to get where they are. They have been well served by Trump’s policies, which have encouraged rent-seeking while destroying the underlying sources of wealth creation. Progressive capitalism seeks to do precisely the opposite.
  3. This brings us to the third priority: addressing the growing problem of concentrated market power. By exploiting information advantages, buying up potential competitors, and creating entry barriers, dominant firms are able to engage in large-scale rent-seeking to the detriment of everyone else. The rise in corporate market power, combined with the decline in workers’ bargaining power, goes a long way toward explaining why inequality is so high and growth so tepid. Unless government takes a more active role than neoliberalism prescribes, these problems will likely become much worse, owing to advances in robotization and artificial intelligence.
  4. The fourth key item on the progressive agenda is to sever the link between economic power and political influence. Economic power and political influence are mutually reinforcing and self-perpetuating, especially where, as in the US, wealthy individuals and corporations may spend without limit in elections. As the US moves ever closer to a fundamentally undemocratic system of “one dollar, one vote,” the system of checks and balances so necessary for democracy likely cannot hold: nothing will be able to constrain the power of the wealthy. This is not just a moral and political problem: economies with less inequality actually perform better. Progressive-capitalist reforms thus have to begin by curtailing the influence of money in politics and reducing wealth inequality.

There is no magic bullet that can reverse the damage done by decades of neoliberalism. But a comprehensive agenda along the lines sketched above absolutely can. Much will depend on whether reformers are as resolute in combating problems like excessive market power and inequality as the private sector is in creating them.

A comprehensive agenda must focus on education, research, and the other true sources of wealth. It must protect the environment and fight climate change with the same vigilance as the Green New Dealers in the US and Extinction Rebellion in the United Kingdom. And it must provide public programs to ensure that no citizen is denied the basic requisites of a decent life. These include economic security, access to work and a living wage, health care and adequate housing, a secure retirement, and a quality education for one’s children.

This agenda is eminently affordable; in fact, we cannot afford not to enact it. The alternatives offered by nationalists and neoliberals would guarantee more stagnation, inequality, environmental degradation, and political acrimony, potentially leading to outcomes we do not even want to imagine.

Progressive capitalism is not an oxymoron. Rather, it is the most viable and vibrant alternative to an ideology that has clearly failed. As such, it represents the best chance we have of escaping our current economic and political malaise.

NASA scientist Jen Heldmann describes how the Earth’s moon was formed

NASA scientist Jennifer Heldmann describes the most popular theory of how the solar system and Earth’s moon were formed. Below you can watch a short four-minute video of her explanation of the accretion theory, see a computer simulation of the hypothesis, or watch the full 45-minute video recorded during the “Ask a Scientist” event in San Francisco on Oct. 7, 2008.

How Did the Moon Form? We May Need a New Theory

There may be much more water on the moon than we thought. And that could change everything.

How did the moon become the moon? Where did it come from? How did it first form?

We don’t know in any fully satisfying way, but we do have a compelling theory in the form of the giant impact hypothesis. Per the theory, early in Earth’s history — when Earth was still, technically, proto-Earth — an object the size of Mars slammed into the planet. The collision generated a ring of debris — and the planetary detritus coalesced slowly (very slowly: over the course of millions of years) into the gray, glowing spheroid that has since been a source of fascination to scientists and poets alike.

Here’s the problem, though: A string of recent research has suggested that there’s more water on the moon than we previously thought. (Though liquid water can’t persist on the moon’s surface, scientists can search for “water” in the form of ice, and also in the form of hydrogen and oxygen atoms extant in the lunar landscape.) In 2008, Space Ref notes, an ion-microprobe analysis of Apollo lunar samples detected indigenous hydrogen in the moon rocks. And in 2009, NASA’s Lunar Crater Observation and Sensing Satellite slammed into a permanently shadowed lunar crater, ejecting a plume of material that ended up being rich in, yep, water ice. Hydroxyls — chemical functional groups consisting of one atom of hydrogen and one of oxygen — have also been detected in other volcanic rocks and in the lunar regolith, the layer of fine powder that coats the moon’s surface.
Most recently, a team of researchers analyzing samples of lunar anorthosite, some of the moon’s oldest rocks, found about six parts per million of water — quantities large enough to suggest that the water didn’t just come from elements in errant comets or asteroids. “The surprise discovery of this work is that in lunar rocks, even in nominally water-free minerals such as plagioclase feldspar, the water content can be detected,” said researcher Youxue Zhang. And given the age of the samples and the amount of hydroxyl discovered, the scientists say, the water-forming elements must have been on the moon since its formation. “Because these are some of the oldest rocks from the moon, the water is inferred to have been in the moon when it formed,” Zhang told TG Daily.

The team’s findings — “Water in lunar anorthosites and evidence for a wet early Moon” — were published this weekend in the journal Nature Geoscience.

None of that means the impact formation theory is invalid — that the moon theory is moot. It means, though, that the giant impact theory now has another layer of complexity, in the form of evidence that would seem to contradict it. We’ve long known of the presence of water — and of water-forming elements — on the moon; the question is how it got there. Previous research suggested that those elements might have been brought to the moon from outside sources like comets and meteorites after the moon’s crust formed and cooled. But evidence of water-forming elements in lunar-native rocks complicates that theory. “I still think the impact scenario is the best formation scenario for the moon,” study leader Hejiu Hui, an engineering researcher at the University of Notre Dame, noted, “but we need to reconcile the theory of hydrogen.”