As I pointed out in my latest column, the politics of inflation are dominated by concerns about gasoline and food prices — precisely the prices over which policymakers in general, and the president in particular, have the least influence. Economists, by contrast, usually focus on measures that try to get at underlying inflation, excluding highly volatile stuff like, well, energy and food.
Actually, the traditional definition of “core” inflation — the one usually used by the Federal Reserve — which excludes only energy and food, has been problematic in the post-pandemic era. Why? Because we’ve been seeing some wild fluctuations in other prices, like used cars. So there’s growing emphasis on other measures of core inflation, like the Dallas Fed’s “trimmed mean” measure, which excludes extreme price movements in either direction. You can see the difference in this figure, which shows three-month rates of change in the two measures since 2020 (month to month is too noisy, whereas annual changes lag too far behind events):
Traditional core inflation has been highly variable, the alternative measure less so. Both measures, however, have eased off lately. It looks as if underlying inflation is running at something like 3.5 to 4 percent.
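The trimmed-mean idea is mechanical enough to sketch in code. The following is a minimal, illustrative version — the Dallas Fed's actual measure trims PCE components asymmetrically using expenditure weights, which this sketch does not reproduce — but it shows the principle: sort component price changes, drop the extreme tails by expenditure weight, and take the weighted average of what remains. The annualization of a three-month change mentioned above is included as well. All the numbers at the bottom are hypothetical.

```python
# Illustrative trimmed-mean inflation measure. NOT the Dallas Fed's exact
# methodology (which trims asymmetrically across PCE components); this is
# a sketch of the principle only.

def trimmed_mean(changes, weights, trim=0.08):
    """Weighted mean of price changes after dropping the most extreme
    `trim` share of expenditure weight from each tail."""
    items = sorted(zip(changes, weights))   # sort components by price change
    total = sum(weights)
    lo, hi = trim * total, (1 - trim) * total
    kept_sum = kept_weight = cum = 0.0
    for change, w in items:
        start, end = cum, cum + w
        cum = end
        # Portion of this component's weight inside the kept band [lo, hi].
        overlap = max(0.0, min(end, hi) - max(start, lo))
        kept_sum += change * overlap
        kept_weight += overlap
    return kept_sum / kept_weight

def annualize(three_month_change):
    """Convert a three-month price change to an annualized rate."""
    return (1 + three_month_change) ** 4 - 1

# Hypothetical component data: one used-car-style spike (15%) gets trimmed.
changes = [0.02, 0.01, 0.012, 0.009, 0.15]
weights = [0.3, 0.25, 0.2, 0.15, 0.1]
core = trimmed_mean(changes, weights, trim=0.1)  # spike excluded from mean
```

The outlier barely moves the trimmed mean, which is the whole point: wild swings in a few categories, like used cars, don't whipsaw the measure the way they do traditional core inflation.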
Easing inflation is good. But we’re still well above 2 percent inflation, which the Fed and other central banks have traditionally seen as their target. And the Fed is set to continue tightening until that target is hit.
So why is 2 percent the target? I’m not going to crusade against the 2 percent solution. But anyone interested in economic policy should know that the history of how 2 percent came to define “price stability” is peculiar, and that the argument for keeping that target is grounded less in straightforward economics than in almost metaphysical concerns about credibility.
One way to see the peculiarity of 2 percent is to take a longer view of inflation, going back to 1984, the year of “morning in America.” At the time, the United States was experiencing rapid economic growth because the Fed, which had squeezed the economy extremely hard to end double-digit inflation, had relaxed monetary policy because, in its view, inflation had been vanquished. By 1984, and for the rest of the 1980s, the Fed felt comfortable about inflation because it was running at around only 4 percent:
The point, of course, is that during Ronald Reagan’s second term, America’s underlying inflation rate was roughly what it is now. Yet policymakers were strutting around boasting about their victory over inflation, and the public didn’t see inflation as a major concern:
So how did 4 percent inflation come to be considered excessive and 2 percent acquire sacred status? It’s a long story, in which New Zealand, of all places, played a crucial role.
But the short answer is that 2 percent seemed to offer an easy answer to a dispute between economists who wanted true price stability — zero inflation — and those, including a guy named Larry Summers, who thought we needed somewhat positive inflation to preserve the Fed’s ability to fight recessions. The stable-price crowd was willing to believe that 2 percent was actually zero, because conventional inflation measures understated the benefits of technological progress. The room-to-act crowd believed that 2 percent was high enough that the Fed would rarely end up cutting interest rates all the way to zero and finding that it wasn’t enough.
As it turned out, however, this latter judgment was all wrong. The Fed and other central banks have spent much of the past 15 years with interest rates as low as they can go, desperately seeking other tools to stimulate their economies:
As a result, a number of economists have suggested that the inflation target should be raised. For example, in 2010 Olivier Blanchard, then the chief economist of the International Monetary Fund, made the case for an inflation target as high as 4 percent. I made similar arguments to the European Central Bank a bit later.
None of these arguments got much real-world traction, however, perhaps because central bankers weren’t convinced that a higher inflation target would help them very much. But right now we face a different question: How much are we prepared to pay to get back to 2 percent?
Again, if the Fed were to apply the standards that prevailed in the 1980s, it would consider the current rate of inflation acceptable and declare victory. Instead, it’s putting a squeeze on credit markets and risking at least a mild recession to get us down to 2 percent from 4 percent. Why? It’s not because there’s a compelling economic case. As Blanchard and his co-authors asked back in 2010, “Are the net costs of inflation much higher at, say, 4 percent than at 2 percent?” There’s no real evidence to that effect.
As best I can tell, the main reason Fed officials are insistent on getting back to 2 percent is concern about credibility. They fear that if they ease off at, say, 3 percent inflation, markets and the public will wonder whether they will eventually accept 4 percent, then 5 percent and so on. One reassuring aspect of the current bout of rising prices is that longer-term inflation expectations have remained “anchored,” so that there are no signs of a 1970s-type wage-price spiral. Giving up on the 2 percent target might risk losing this anchor.
To be honest, if I were a decision maker at the Fed, I would probably have the same concerns. But it seems important to realize that if we are about to have a recession, which is certainly possible, it won’t be because hard economic considerations require that we squeeze inflation all the way back down to 2 percent. What we’re seeing instead is monetary policy driven by softer, vaguer concerns about credibility. We live in peculiar times.
I’ll leave the savvy political analysis to others. I don’t know why Senator Joe Manchin apparently decided to go back on an explicit promise he made to President Biden. Naïvely, I thought that even in this era of norm-breaking, honoring a deal you’ve just made would be one of the last norms to go, since a reputation for keeping your word once given is useful even to highly cynical politicians. I also don’t know what, if anything, can be saved from the Build Back Better framework.
What I do know is that there will be huge human and, yes, economic costs if Biden’s moderate but crucial spending plans fall by the wayside.
Failure to enact a decent social agenda would condemn millions of American children to poor health and low earnings in adulthood — because that’s what growing up in poverty does. It would condemn millions more to inadequate medical care and financial ruin if they got ill, because that’s what happens when people lack adequate health insurance. It would condemn hundreds of thousands, maybe more, to unnecessary illness and premature death from air pollution, even aside from the intensified risk of climate catastrophe.
I’m not speculating here. There’s overwhelming evidence that children in low-income families who receive financial aid grow up to be significantly healthier and more productive adults than those who don’t. Uninsured Americans often lack access to needed medical care and face unaffordable bills. And studies show that policies to mitigate climate change will also yield major health benefits from cleaner air over the next decade.
As an aside, it’s not clear how many Americans realize the extent to which we’re falling behind other nations in terms of meeting basic human needs. For example, I still keep running into people who believe that we have the world’s highest life expectancy, when the reality is that we can expect to live between three and five fewer years than citizens of most European countries.
There are also, by the way, large and growing gaps between U.S. states. In 1979 life expectancy in West Virginia was only about 14 months shorter than in New York; by 2016 the gap had widened to six years. And yes, Manchin’s home state would benefit immensely from the social spending its Democratic senator seems determined to block.
The weakness of the U.S. social safety net also has economic consequences. It’s true that we still have high gross domestic product per capita — but that’s largely because Americans work far more hours, and take far less vacation time, than their counterparts abroad. In other ways we lag. Even before the pandemic, Americans in their prime working years were less likely to be employed than citizens of Canada or many European countries, probably in part because we don’t help adults stay in the work force by providing child care and parental leave.
But can we afford to make our lives better? One answer is that other rich countries seem to manage it just fine. Another answer is that Manchin’s objections to the proposed legislation evaporate under scrutiny.
Manchin asserted that the Congressional Budget Office determined that the cost of the bill is “upwards of $4.5 trillion.” No, it didn’t. That was a Republican-demanded estimate of outlays — not the considerably smaller impact on the deficit — under the assumption that everything in the legislation would be made permanent, which isn’t what the bill says. And if Congress did vote to extend programs like the child tax credit, it would probably also vote for revenue offsets. The budget office analysis of the legislation as actually written — which found it roughly deficit-neutral — is a much better guide to its likely fiscal impact than this rigged hypothetical.
As for Manchin’s claim that we have a “staggering” national debt, maybe it’s worth noting that federal interest payments as a percentage of G.D.P. are only half what they were under Ronald Reagan, and that if you adjust for inflation — as you should — they’re basically zero.
What about inflation? The proposed spending in Build Back Better is spread over multiple years, so it wouldn’t do a lot to raise overall demand in the near term — the first-year addition to the deficit would be just 0.6 percent of G.D.P., which isn’t enough to make much difference to inflation in any model I know. Besides, the Federal Reserve has just made it clear that it’s ready to raise interest rates if inflation doesn’t subside, so government spending should matter even less.
As I said, I’m not going to try to analyze Manchin’s thought processes, and I’ll leave it to others to speculate about his personal motives. What I can say is that the letter he released to explain why he said what he said on Fox News doesn’t read like a carefully worked-out policy statement; it doesn’t even read like a coherent ideological manifesto. Indeed, it feels rushed — a grab bag of Republican talking points hastily trotted out in an attempt to justify his abrupt betrayal and to portray himself as a victim.
Sorry, but no. America — not a senator who’s taking heat for a broken promise — is the victim in this story.
Why is it so hard for him to just admit he was wrong?
Nobel Prize–winning economist Paul Krugman is one of the most influential individuals in his field, which means people listen when he talks about bitcoin. Unfortunately, most of what he has had to say about the cryptocurrency over the years has been misguided, uninformed, or just plain wrong.
It’s sometimes difficult for the average person to understand what economists and politicians are talking about when they debate policy, but the value proposition of bitcoin can be easily understood by anyone through its NgU technology (NgU stands for Number Go Up, a meme based on bitcoin’s deflationary monetary policy). While Krugman has stated that his 1998 prediction that “the Internet’s impact on the economy [would be] no greater than the fax machine’s” was supposed to be a fun and provocative thought experiment, it may be much more difficult to explain away his many confused and oftentimes arrogant takes on bitcoin over the past ten years.
Krugman first wrote about bitcoin in The New York Times back in September 2011. In this post, Krugman mainly compared bitcoin to gold in a rather negative light. “To the extent that the [bitcoin] experiment tells us anything about monetary regimes,” he wrote, “it reinforces the case against anything like a new gold standard—because it shows just how vulnerable such a standard would be to money-hoarding, deflation, and depression.”
In other words, Krugman made a moral case against the adoption of bitcoin as money. In Krugman’s telling, a bitcoin standard would make the world much worse off because bitcoin has a fixed supply and central bankers would not be able to increase the money supply to stimulate the economy during economic recessions.
Even if you accept the idea that the world would be much better off under a more inflationary monetary system where central bankers have the power to stabilize the economy (I don’t), individuals tend to respond to incentives related to the betterment of their own lives, not necessarily the greater good of society. If holding bitcoin theoretically makes the world as a whole a bit worse off but acts as a better form of savings for an individual, is the average person going to choose to put his savings in fiat currencies that lose value over time out of the kindness of his own heart, or will he choose to just hold bitcoin? It’s also important to remember that the entire point of bitcoin is to persist in the face of governments that try to force their citizens into only using the government-approved form of money.
In April 2013, Krugman invoked Adam Smith to make another moral case against bitcoin, this time claiming that the use of gold, silver, or bitcoin as money was a waste of resources. “Smith actually wrote eloquently about the fundamental foolishness of relying on gold and silver currency, which— as he pointed out—serve only a symbolic function, yet absorbed real resources in their production, and why it would be smart to replace them with paper currency,” Krugman wrote. “And now here we are in a world of high information technology—and people think it’s smart, nay cutting-edge, to create a sort of virtual currency whose creation requires wasting real resources in a way Adam Smith considered foolish and outmoded in 1776.”
This was an early version of the energy and climate change–based arguments being made against bitcoin today. This is a faulty argument, however, because it assumes there is no difference between bitcoin and traditional bank accounts. The entire point of bitcoin as an asset is that, unlike Venmo or traditional bank accounts, users can retain full control over their digital money and are not simply holding IOUs. Claiming that this is a waste of resources is a subjective argument. It is no different from saying automobiles or YouTube are wasteful due to the amount of energy that is used to power them. People use bitcoin because it provides value for them, so the resources expended to make bitcoin possible aren’t a waste.
Later in 2013, Krugman simply declared that “Bitcoin Is Evil” because, as science-fiction writer Charlie Stross put it, “BitCoin looks like it was designed as a weapon intended to damage central banking and money issuing banks, with a Libertarian political agenda in mind—to damage states ability to collect tax and monitor their citizens financial transactions.” That said, Krugman did at least go into the argument that bitcoin lacked any sort of fundamental price floor and contrasted that characterization with gold’s use in jewelry and the U.S. dollar’s use for paying taxes.
Krugman would go on to use bitcoin’s lack of a price floor mechanism as his key argument against the cryptocurrency for many years to come. For example, as he argued in a 2015 interview, bitcoin “is a technically sweet solution to a problem, but it’s not clear that problem is one that has much economic relevance. It’s certainly not a reason to hold that currency.…If you’re looking for the idea that a currency doesn’t really have to be something physical, it can be something that is virtual, that’s the system we already have.”
But this misses the point of bitcoin, which is actually nothing like the monetary system we currently have. For one, bitcoin’s long-term monetary policy was “set in stone” when the network launched in January 2009, and it is not subject to changes by a trusted third party such as a central bank. Additionally, bitcoin solves the problem of centralization that is found in the digital equivalents of both the gold and fiat-based currency systems. Bitcoin users are able to retain full ownership over their coins with no counterparty risk; a bitcoin is not an IOU. Further, due to the censorship-resistant nature of the bitcoin network, a new financial system can be built on top of the bitcoin blockchain through the use of smart contracts to enable a greater degree of user privacy for a wide variety of activities, operating in a manner that contrasts with the current surveillance state.
In addition to calling bitcoin evil, Krugman has also dismissed it as “libertarian derp” on multiple occasions. He even took pleasure in the crashing bitcoin price in early 2018. Notably, some of Krugman’s negative comments toward bitcoin popped up around the absolute bottoms of two consecutive cryptocurrency bear markets. In other words, it may be a good time to buy bitcoin whenever you see Krugman taking a victory lap.
Unfortunately for Krugman, the “libertarian derp” cryptocurrency hit a new all-time high once again in 2021, a decade after his initial criticisms of the crypto asset appeared in The New York Times. Instead of acknowledging the reasons for bitcoin’s staying power, however, it appears that Krugman will continue to claim there is no utility for this technology and keep dismissing bitcoin as a cult that can survive indefinitely.
Fortunately for bitcoin, it can rebut Krugman by simply continuing to exist and thrive in the marketplace.
Krugman, 1998: “The growth of the Internet will slow drastically, as the flaw in ‘Metcalfe’s law’ becomes apparent: most people have nothing to say to each other! By 2005, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.”
A number of readers have asked me to weigh in on Bitcoin and other cryptocurrencies, whose fluctuations have dominated a lot of market news. Would I please comment on what it’s all about, and what’s going on?
Well, I can tell you what it’s about. What’s going on is harder to explain.
The story so far: Bitcoin, the first and biggest cryptocurrency, was introduced in 2009. It uses an encryption key, similar to those used in hard-to-break codes — hence the “crypto” — to establish chains of ownership in tokens that entitle their current holders to … well, ownership of those tokens. And nowadays we use Bitcoin to buy houses and cars, pay our bills, make business investments, and more.
Oh, wait. We don’t do any of those things. Twelve years on, cryptocurrencies play almost no role in normal economic activity. Almost the only time we hear about them being used as a means of payment — as opposed to speculative trading — is in association with illegal activity, like money laundering or the Bitcoin ransom Colonial Pipeline paid to hackers who shut it down.
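The “chains of ownership” described above can be illustrated with a toy hash chain. This is a conceptual sketch only — real Bitcoin uses ECDSA signatures, Merkle trees, and proof-of-work, none of which appear here — but it captures the core trick: each transfer record commits to the one before it by embedding its hash, so rewriting old history invalidates everything that follows.

```python
import hashlib

# Toy hash chain of ownership records. A sketch of the concept only;
# actual Bitcoin transactions involve digital signatures and mining,
# which this deliberately omits. All names are hypothetical.

def record_hash(record):
    """SHA-256 digest of a record, as a hex string."""
    return hashlib.sha256(record.encode()).hexdigest()

def transfer(prev_hash, new_owner):
    """Create a transfer record that commits to the previous record."""
    return f"{prev_hash}->{new_owner}"

genesis = record_hash("token-0 minted to alice")
r1 = transfer(genesis, "bob")             # alice -> bob
r2 = transfer(record_hash(r1), "carol")   # bob -> carol

# Altering an earlier record changes its hash, so every later record's
# embedded hash stops matching — that mismatch is what makes the
# recorded history hard to rewrite.
```

Whether this mechanism solves a problem anyone outside speculation actually has is, of course, the question the rest of the column raises.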
Twelve years is an eon in information technology time. Venmo, which I can use to share restaurant bills, buy fresh fruit at sidewalk kiosks, and much more, was also introduced in 2009. Apple unveiled its first-generation iPad in 2010. Zoom came into use in 2012. By the time a technology gets as old as cryptocurrency, we expect it either to have become part of the fabric of everyday life or to have been given up as a nonstarter.
If normal, law-abiding people don’t use cryptocurrency, it’s not for lack of effort on the part of crypto boosters. Many highly paid person-hours have been spent trying to find the killer app, the thing that will finally get the masses using Bitcoin, Ethereum or some other brand daily.
But I’ve been in numerous meetings with enthusiasts for cryptocurrency and/or blockchain, the concept that underlies it. In such meetings I and others always ask, as politely as we can: “What problem does this technology solve? What does it do that other, much cheaper and easier-to-use technologies can’t do just as well or better?” I still haven’t heard a clear answer.
Yet investors continue to pay huge sums for digital tokens. The values of major cryptocurrencies fluctuate wildly — Bitcoin fell 30 percent Wednesday morning, then made up most of the losses that afternoon. Their collective value has, however, at times exceeded $2 trillion, more than half the value of all the intellectual property owned by U.S. business.
Why are people willing to pay large sums for assets that don’t seem to do anything? The answer, obviously, is that the prices of these assets keep going up, so that early investors made a lot of money, and their success keeps drawing in new investors.
This may sound to you like a speculative bubble, or maybe a Ponzi scheme — and speculative bubbles are, in effect, natural Ponzi schemes. But could a Ponzi scheme really go on for this long? Actually, yes: Bernie Madoff ran his scam for almost two decades, and might have gone even longer if the financial crisis hadn’t intervened.
Now, a long-running Ponzi scheme requires a narrative — and the narrative is where crypto really excels.
First, crypto boosters are very good at technobabble — using arcane terminology to convince themselves and others that they’re offering a revolutionary new technology, even though blockchain is actually pretty elderly by infotech standards and has yet to find any compelling uses.
Second, there’s a strong element of libertarian derp — assertions that fiat currencies, government-issued money without any tangible backing, will collapse any day now. True, Britain, whose currency was still standing last time I looked, went off the gold standard 90 years ago. But who’s counting?
Given all this, are cryptocurrencies headed for a crash sometime soon? Not necessarily. One fact that gives even crypto skeptics like me pause is the durability of gold as a highly valued asset. Gold, after all, suffers from pretty much the same problems as Bitcoin. People may think of it as money, but it lacks any attributes of a useful currency: You can’t actually use it to make transactions — try buying a new car with gold ingots — and its purchasing power has been extremely unstable.
So when John Maynard Keynes called the gold standard a “barbarous relic” way back in 1924, he wasn’t wrong. But the metal’s mystique, and its valuation, live on. It’s conceivable that one or two cryptocurrencies will somehow achieve similar longevity.
Or maybe not. For one thing, governments are well aware that cryptocurrencies are being used by bad actors, and may well crack down in a way they never did on gold trading. Also, the proliferation of cryptocurrencies may prevent any one of them from achieving the semi-sacred status gold holds in some people’s minds.
The good news is that none of this matters very much. Because Bitcoin and its relatives haven’t managed to achieve any meaningful economic role, what happens to their value is basically irrelevant to those of us not playing the crypto game.
G.O.P. cynics have been coddling crazies for a long time.
One striking aspect of the Capitol Hill putsch was that none of the rioters’ grievances had any basis in reality.
No, the election wasn’t stolen — there is no evidence of significant electoral fraud. No, Democrats aren’t part of a satanic pedophile conspiracy. No, they aren’t radical Marxists — even the party’s progressive wing would be considered only moderately left of center in any other Western democracy.
So all the rage is based on lies. But what’s almost as striking as the fantasies of the rioters is how few leading Republicans have been willing, despite the violence and desecration, to tell the MAGA mob that their conspiracy theories are false.
Bear in mind that Kevin McCarthy, the House minority leader, and two-thirds of his colleagues voted against accepting the Electoral College results even after the riot. (McCarthy then shamelessly decried “division,” saying that “we must call on our better angels.”)
Or consider the behavior of leading Republicans who aren’t usually considered extremists. On Sunday Senator Rob Portman declared that we need to “restore confidence in the integrity of our electoral system.” Portman isn’t stupid; he has to know that the only reason so many people doubt the election results is that members of his party deliberately fomented that doubt. But he’s still keeping up the pretense.
And the cynicism and cowardice of leading Republicans is, I would argue, the most important cause of the nightmare now enveloping our nation.
Of course we need to understand the motives of our homegrown enemies of democracy. In general, political scientists find — not surprisingly, given America’s history — that racial antagonism is the best predictor of willingness to countenance political violence. Anecdotally, personal frustrations — often involving social interactions, not “economic anxiety” — also seem to drive many extremists.
But neither racism nor widespread attraction to conspiracy theories is new in our political life. The worldview described in Richard Hofstadter’s classic 1964 essay “The Paranoid Style in American Politics” is barely distinguishable from QAnon beliefs today.
So there’s only so much to be gained from interviewing red-hatted guys in diners; there have always been people like that. If there are or seem to be more such people than in the past, it probably has less to do with intensified grievances than with outside encouragement.
For the big thing that has changed since Hofstadter wrote is that one of our major political parties has become willing to tolerate and, indeed, feed right-wing political paranoia.
This coddling of the crazies was, at first, almost entirely cynical. When the G.O.P. began moving right in the 1970s its true agenda was mainly economic — what its leaders wanted, above all, were business deregulation and tax cuts for the rich. But the party needed more than plutocracy to win elections, so it began courting working-class whites with what amounted to thinly disguised racist appeals.
Not incidentally, white supremacy has always been sustained in large part through voter suppression. So it shouldn’t be surprising to see right-wingers howling about a rigged election — after all, rigging elections is what their side is accustomed to doing. And it’s not clear to what extent they actually believe that this election was rigged, as opposed to being enraged that this time the usual vote-rigging didn’t work.
But it’s not just about race. Since Ronald Reagan, the G.O.P. has been closely tied to the hard-line Christian right. Anyone shocked by the prevalence of insane conspiracy theories in 2020 should look back to “The New World Order,” published by Reagan ally Pat Robertson in 1991, which saw America menaced by an international cabal of Jewish bankers, Freemasons and occultists. Or they should check out a 1994 video promoted by Jerry Falwell Sr. called “The Clinton Chronicles,” which portrayed Bill Clinton as a drug smuggler and serial killer.
So what has changed since then? For a long time Republican elites imagined that they could exploit racism and conspiracy theorizing while remaining focused on a plutocratic agenda. But with the rise first of the Tea Party, then of Donald Trump, the cynics found that the crazies were actually in control, and that they wanted to destroy democracy, not cut tax rates on capital gains.
And Republican elites have, with few exceptions, accepted their new subservient status.
You might have hoped that a significant number of sane Republican politicians would finally say that enough is enough, and break with their extremist allies. But Trump’s party didn’t balk at his corruption and abuse of power; it stood by him when he refused to accept electoral defeat; and some of its members are responding to a violent attack on Congress by complaining about their loss of Twitter followers.
And there’s no reason to believe that the atrocities yet to come — for there will be more atrocities — will make a difference. The G.O.P. has reached the culmination of its long journey away from democracy, and it’s hard to see how it can ever be redeemed.
The right has made irresponsible behavior a key principle.
America’s response to the coronavirus has been a lose-lose proposition.
The Trump administration and governors like Florida’s Ron DeSantis insisted that there was no trade-off between economic growth and controlling the disease, and they were right — but not in the way they expected.
Premature reopening led to a surge in infections: Adjusted for population, Americans are currently dying from Covid-19 at around 15 times the rate in the European Union or Canada. Yet the “rocket ship” recovery Donald Trump promised has crashed and burned: Job growth appears to have stalled or reversed, especially in states that were most aggressive about lifting social distancing mandates, and early indications are that the U.S. economy is lagging behind the economies of major European nations.
So we’re failing dismally on both the epidemiological and the economic fronts. But why?
On the face of it, the answer is that Trump and allies were so eager to see big jobs numbers that they ignored both infection risks and the way a resurgent pandemic would undermine the economy. As I and others have said, they failed the marshmallow test, sacrificing the future because they weren’t willing to show a little patience.
And there’s surely a lot to that explanation. But it isn’t the whole story.
For one thing, people truly focused on restarting the economy should have been big supporters of measures to limit infections without hurting business — above all, getting Americans to wear face masks. Instead, Trump ridiculed those in masks as “politically correct,” while Republican governors not only refused to mandate mask-wearing but also prevented mayors from imposing local mask rules.
Also, politicians eager to see the economy bounce back should have wanted to sustain consumer purchasing power until wages recovered. Instead, Senate Republicans ignored the looming July 31 expiration of special unemployment benefits, which means that tens of millions of workers are about to see a huge hit to their incomes, damaging the economy as a whole.
So what was going on? Were our leaders just stupid? Well, maybe. But there’s a deeper explanation of the profoundly self-destructive behavior of Trump and his allies: They were all members of America’s cult of selfishness.
You see, the modern U.S. right is committed to the proposition that greed is good, that we’re all better off when individuals engage in the untrammeled pursuit of self-interest. In their vision, unrestricted profit maximization by businesses and unregulated consumer choice are the recipe for a good society.
Support for this proposition is, if anything, more emotional than intellectual. I’ve long been struck by the intensity of right-wing anger against relatively trivial regulations, like bans on phosphates in detergent and efficiency standards for light bulbs. It’s the principle of the thing: Many on the right are enraged at any suggestion that their actions should take other people’s welfare into account.
This rage is sometimes portrayed as love of freedom. But people who insist on the right to pollute are notably unbothered by, say, federal agents tear-gassing peaceful protesters. What they call “freedom” is actually absence of responsibility.
Rational policy in a pandemic, however, is all about taking responsibility. The main reason you shouldn’t go to a bar and should wear a mask isn’t self-protection, although that’s part of it; the point is that congregating in noisy, crowded spaces or exhaling droplets into shared air puts others at risk. And that’s the kind of thing America’s right just hates, hates to hear.
Indeed, it sometimes seems as if right-wingers actually make a point of behaving irresponsibly. Remember how Senator Rand Paul, who was worried that he might have Covid-19 (he did), wandered around the Senate and even used the gym while waiting for his test results?
Anger at any suggestion of social responsibility also helps explain the looming fiscal catastrophe. It’s striking how emotional many Republicans get in their opposition to the temporary rise in unemployment benefits; for example, Senator Lindsey Graham declared that these benefits would be extended “over our dead bodies.” Why such hatred?
It’s not because the benefits are making workers unwilling to take jobs. There’s no evidence that this is happening — it’s just something Republicans want to believe. And in any case, economic arguments can’t explain the rage.
Again, it’s the principle. Aiding the unemployed, even if their joblessness isn’t their own fault, is a tacit admission that lucky Americans should help their less-fortunate fellow citizens. And that’s an admission the right doesn’t want to make.
Just to be clear, I’m not saying that Republicans are selfish. We’d be doing much better if that were all there were to it. The point, instead, is that they’ve sacralized selfishness, hurting their own political prospects by insisting on the right to act selfishly even when it hurts others.
What the coronavirus has revealed is the power of America’s cult of selfishness. And this cult is killing us.