Since the invention of writing, human innovation has transformed how we formulate new ideas, organize our societies, and communicate with one another. But in an age of rapid-fire social media and nonstop algorithm-generated outrage, technology is no longer helping to expand or enrich the public sphere.
BERKELEY – Since 1900, human technology and organization have been evolving at a blistering pace. A single year's worth of change today would have taken 50 years or more before 1500. War and politics used to be the meat of human history, with advances in technology and organization unfolding very slowly – if at all – in the background. Now, the inverse is true.
The impact of technological innovation on the marketplace of ideas has brought about some of the most consequential changes. The shift from the age of handwritten and hand-copied manuscripts to that of the Gutenberg press ushered in the Copernican Revolution (along with almost two centuries of genocidal religious war). Pamphlets and coffee houses broadened the public sphere and positioned public opinion as a powerful constraint on political rulers’ behavior.
As John Adams, the second president of the United States, later pointed out, the “[American] Revolution was effected before the war commenced … in the minds and hearts of the people.” The decisive intellectual battle, we now know, was won by the English-born printer Thomas Paine’s pamphlet Common Sense. Still, even during the revolutionary period, the pace of change was far slower than it is today. In the space of just two human lifetimes, we have gone from mass-market newspapers and press lords to radio and network television, and then on to the Internet and today’s social media-driven public sphere. And most of us will live long enough to witness whatever comes next.
There is now a near-consensus – at least among those who are not completely steeped in social-media propaganda – that the current public sphere does not serve us well. “Social media is broken,” the American author Annalee Newitz wrote in a recent commentary for the New York Times. “It has poisoned the way we communicate with each other and undermined the democratic process. Many of us just want to get away from it, but we can’t imagine a world without it.”
Western societies have experienced a similar sentiment before. In the 1930s, my great-uncles listened to their elders complain about how radio had allowed demagogues like Adolf Hitler, Charles Coughlin, and Franklin D. Roosevelt (that “communist”) to short-circuit the normal processes of public discourse. No longer were public debates kept sober and rational by traditional gatekeepers. In the new age of broadcast, unapproved memes could spread far and wide without interference. Politicians and ideologues who may not have had the public interest in mind could get right into people’s ears and hijack their brains.
Nowadays, the problem is not a single demagogue, but a public sphere beset by swarms of “influencers,” propagandists, and bots, all semi-coordinated by the dynamics of the medium itself. Once again, ideas of dubious quality and provenance are shaping people’s thoughts without having been subjected to adequate evaluation and analysis.
We should have seen this coming. A generation ago, when the “net” was limited to universities and research institutes, there was an annual “September” phenomenon. Each year, new arrivals to these institutions would be given an email account and/or user profile, whereupon they would rapidly find their online communities. They would begin to talk, and someone, inevitably, would get annoyed. For the next month, whatever informational or discursive use the net might have had would be sidelined by continuous vitriolic exchanges.
Then things would calm down. People would remember to put on their asbestos underwear before logging on; they learned not to take the newbies so seriously. Trolls would find themselves banned from the forums they loved to disrupt. And, in any case, most who experimented with the troll lifestyle realized that it had little to recommend it. For the next 11 months, the net would serve its purpose, significantly extending each user’s cultural, conversational, and intellectual range, and adding to the collective stock of human intelligence.
But as the Internet began to spread to each household and then to each smartphone, fears about the danger of an “eternal September” were confirmed. There is more money to be made by stoking outrage than by providing sound information and encouraging the social-learning process that once taught net newbies to calm down. And yet, today’s Internet does offer valuable information, so much so that few of us could imagine doing without it. To access that information, we have tacitly agreed to allow the architects at Facebook, Twitter, Google (especially YouTube), and elsewhere to shape the public sphere with their outrage- and clickbait-generating algorithms.
Meanwhile, others have found that there is a great deal of money and power to be gained by shaping public opinion online. If you want to get your views out there, it is easier to piggyback on the outrage machine than to develop a comprehensive rational argument – especially when those views are self-serving and deleterious to the public good.
For her part, Newitz ends her recent commentary on a hopeful note. “Public life has been irrevocably changed by social media; now it’s time for something else,” she writes. “We need to stop handing off responsibility for maintaining public space to corporations and algorithms – and give it back to human beings. We may need to slow down, but we’ve created democracies out of chaos before. We can do it again.”
Such hope may be necessary for journalists these days. Unfortunately, a rational evaluation of our situation suggests that it is unjustified. The eternal September of our discontent has arrived.
A thoroughly partisan hack, Moore has no consistent economic beliefs or theories
(Project Syndicate) – In December 2015, the right-wing commentator Stephen Moore, President Donald Trump’s pick to fill a vacancy on the Federal Reserve Board of Governors, savagely attacked then-Fed Chair Janet Yellen and her predecessor, Ben Bernanke, for maintaining loose monetary policies in the years following the Great Recession.
According to Moore, who is not a Ph.D. economist, investors had “become hyper-dependent” on the Fed’s “zero-interest-rate policy … just as an addict craves crack cocaine.” This “money creation,” he surmised, had yielded “nada” in terms of “helping juice the economy, creating jobs, or giving the American worker a pay raise.”
Worse, the United States had already “tried this before — twice — and both times the story ended badly with a pop of the bubble … in 1999-2000 and … in 2008-09.” The lesson, he concluded, is that, “Micromanaging the economy through the lever of money creation at the grand fiefdom within the Fed doesn’t work.”
Or does it? Moore himself is probably not the most reliable judge.
On Dec. 26, 2018, he savagely attacked Yellen’s successor, Jerome Powell, for raising interest rates to unwind the very approach that he had condemned three years earlier. “If you cut engine power too far on a jetliner,” he warned, “it will stall and drop out of the sky.”
Moore complained that, after having “risen by 382 points on hopes that the Fed would listen to Trump and stop cutting power,” the Dow Jones Industrial Average had “plunged by 895 points” on the news of another interest-rate hike. This, he concluded, was evidence that “the Fed’s monetary policy has come unhinged.”
Moore called on Powell to “do the honorable thing … and resign.” But, failing that, he hoped that Trump would simply fire the Fed chair. “The law says he can replace the Federal Reserve chairman for cause,” Moore observed in an interview that same week. “Well, the cause is that he’s wrecking our economy.”
Of course, a less-generous interpretation is that Moore has not changed his view of the economy, and was acting in bad faith during the years of the Obama administration. Or, less likely, he is acting in bad faith now, after having conducted himself in an honest manner up until 2016.
As it happens, none of these interpretations applies, because they are all predicated on the false assumption that Moore actually has an informed perspective on the economy. To my mind, he does not.
True, Moore has consistently advocated low government spending and opposed progressive taxation. He might even support more open immigration policies, as one would expect from a self-proclaimed free-market conservative. Then again, his views may have changed since he started advising Trump in 2016. After all, he already seems to have abandoned his previous commitment to free trade.
That comes as no surprise. Throughout his career as a partisan talking head, Moore’s economic analysis has never had any basis in empirical reality. To the contrary, he has repeatedly shown that he will say whatever needs to be said to please his political master.
Needless to say, Moore is wholly unfit to serve in the office to which he is being nominated. He has absolutely no business overseeing U.S. monetary policy. The same is true of any president who would appoint him and any senator who would vote to confirm him.
Another way to distill the failure is to say that in the last decade or so the center-left attempted a series of policy compromises — Obamacare instead of single-payer, cap-and-trade instead of a Green New Deal, modest upper-bracket tax increases instead of big attempts to soak the rich — and then discovered that the Republican Party was either still too far away ideologically or too much of an internally divided mess to make a lasting deal on any issue. And meanwhile the compromises were often unpopular with swing voters — as Obamacare was at first, as cap-and-trade or a carbon tax probably would be (just ask Emmanuel Macron) — so there was no obvious political advantage to making them pre-emptively.
Both of these accounts fit DeLong’s narrative; both make a case for letting the further-left parts of the Democratic coalition try leadership instead, and seeking compromises between socialists and liberals rather than pining for moderate-Republican partners who don’t appear to be in evidence.
But then consider a third distillation, a third narrative, in which the center-left’s signal political failure was that it never really sought to preserve a cultural centrism, which meant over time that its party’s approach to social issues has been dictated more and more completely by the left. In this story the political success of Bill Clinton reflected not only his compromises with Republicans on taxes and spending, his tacit nods to Reaganomics, but also his ability to infuse a centrist liberalism with reassuring nods to various kinds of moderate cultural conservatism — the school uniform and v-chip business and the rhetoric of “safe, legal and rare” on abortion, the easy Baptist religiosity, the tacitly center-right positions on immigration and crime and same-sex marriage.
And in this part of the Democratic coalition’s story, the center-left’s role has been extraordinarily passive, essentially following the cultural left a tiny bit more slowly rather than trying to devise a more moderate approach. You can find hints of what such a moderate approach might look like in intellectual projects like Jonathan Haidt’s Heterodox Academy, or in the probing, evenhanded culture-war reportage of the magazine writer Jesse Singal (whom I hesitate to even praise because it will do him no favors on the internet). But that cultural moderation has no substantial political form, no important champions within the Democratic Party. It has Joe Manchin and Tulsi Gabbard, maybe, but they are eccentric figures; elsewhere among the Democrats there is little interest in considering all the different ways that cultural extremism costs them votes.
Which means that if the center-left abdicates, DeLong-style, on economic policy, the Democratic Party as a whole will have moved to the left on every front, writing off not only the possibility of compromising with Republican politicians (which, for now, might be understandable) but also the possibility of winning over voters who would almost certainly be Democrats if the party still occupied the cultural terrain that it held in 2000 or even as late as 2008.
Sadly the rest of the DeLong thread didn’t take up that possibility. It degenerated, instead, into a howl against Republican fascism and a post-Protestant sermon about how liberal America can build the true and only heaven, the real shining city on the hill.
Which suggests that to reckon with the possibility that making liberalism a pseudo-church might be a problem, not an aspiration, we need a very different center-left from the one surrendering today.
Three of the last four US recessions stemmed from unforeseen shocks in financial markets. Most likely, the next downturn will be no different: the revelation of some underlying weakness will trigger a retrenchment of investment, and the government will fail to pursue counter-cyclical fiscal policy.

BERKELEY – Over the past 40 years, the US economy has experienced four recessions. Among the four, only the extended downturn of 1979-1982 had a conventional cause. The US Federal Reserve thought that inflation was too high, so it hit the economy on the head with the brick of interest-rate hikes. As a result, workers moderated their demands for wage increases, and firms cut back on planned price increases.

The other three recessions were each caused by derangements in financial markets:

- the savings-and-loan crisis of 1991-1992;
- the bursting of the dot-com bubble in 2000-2002; and
- the collapse of the subprime mortgage market in 2007, which triggered the global financial crisis the following year.
And one can infer from today’s macroeconomic big picture that the next recession most likely will not be due to a sudden shift by the Fed from a growth-nurturing to an inflation-fighting policy. Given that visible inflationary pressures probably will not build up by much over the next half-decade, it is more likely that something else will trigger the next downturn.
Specifically, the culprit will probably be a sudden, sharp “flight to safety” following the revelation of a fundamental weakness in financial markets. That, after all, is the pattern that has been generating downturns since at least 1825, when England’s canal-stock boom collapsed.
Needless to say, the particular nature and form of the next financial shock will be unanticipated. Investors, speculators, and financial institutions are generally hedged against the foreseeable shocks, but there will always be other contingencies that have been missed. For example, the death blow to the global economy in 2008-2009 came not from the collapse of the mid-2000s housing bubble, but from the concentration of ownership of mortgage-backed securities.
Likewise, the stubbornly long downturn of the early 1990s was not directly due to the deflation of the late-1980s commercial real-estate bubble. Rather, it was the result of failed regulatory oversight, which allowed insolvent savings and loan associations to continue speculating in financial markets. Similarly, it was not the deflation of the dot-com bubble, but rather the magnitude of overstated earnings in the tech and communications sector that triggered the recession in the early 2000s.
At any rate, today’s near-inverted yield curve, low nominal and real bond yields, and equity values all suggest that US financial markets have begun to price in the likelihood of a recession. Assuming that business investment committees are thinking like investors and speculators, all it will take now to bring on a recession is an event that triggers a retrenchment of investment spending.
If a recession comes anytime soon, the US government will not have the tools to fight it. The White House and Congress will once again prove inept at deploying fiscal policy as a counter-cyclical stabilizer; and the Fed will not have enough room to provide adequate stimulus through interest-rate cuts. As for more unconventional policies, the Fed most likely will not have the nerve, let alone the power, to pursue such measures.
As a result, for the first time in a decade, Americans and investors cannot rule out a downturn. At a minimum, they must prepare for the possibility of a deep and prolonged recession, which could arrive whenever the next financial shock comes.