For centuries, black music, forged in bondage, has been the sound of complete artistic freedom. No wonder everybody is always stealing it.
In 1830, Rice was a nobody actor in his early 20s, touring with a theater company in Cincinnati (or Louisville; historians don’t know for sure), when, the story goes, he saw a decrepit, possibly disfigured old black man singing while grooming a horse on the property of a white man whose last name was Crow. On went the light bulb. Rice took in the tune and the movements but failed, it seems, to take down the old man’s name. So in his song based on the horse groomer, he renamed him: “Weel about and turn about jus so/Ebery time I weel about, I jump Jim Crow.” And just like that, Rice had invented the fellow who would become the mascot for two centuries of legalized racism.
That night, Rice made himself up to look like the old black man — or something like him, because Rice’s get-up most likely concocted skin blacker than any actual black person’s and a gibberish dialect meant to imply black speech. Rice had turned the old man’s melody and hobbled movements into a song-and-dance routine that no white audience had ever experienced before. What they saw caused a permanent sensation. He reportedly won 20 encores.
Rice repeated the act again, night after night, for audiences so profoundly rocked that he was frequently mobbed during performances. Across the Ohio River, not an arduous distance from all that adulation, was Boone County, Ky., whose population would have been largely enslaved Africans. As they were being worked, sometimes to death, white people, desperate with anticipation, were paying to see them depicted at play.
Other performers came and conquered, particularly the Virginia Minstrels, who exploded in 1843, burned brightly then burned out after only months. In their wake, P.T. Barnum made a habit of booking other troupes for his American Museum; when he was short on performers, he blacked up himself. By the 1840s, minstrel acts were taking over concert halls, doing wildly clamored-for residencies in Boston, New York and Philadelphia.
A blackface minstrel would sing, dance, play music, give speeches and cut up for white audiences, almost exclusively in the North, at least initially. Blackface was used for mock operas and political monologues (they called them stump speeches), skits, gender parodies and dances. Before the minstrel show gave it a reliable home, blackface was the entertainment between acts of conventional plays. Its stars were the Elvis, the Beatles, the ’NSync of the 19th century. The performers were beloved and so, especially, were their songs.
During minstrelsy’s heyday, white songwriters like Stephen Foster wrote the tunes that minstrels sang, tunes we continue to sing. Edwin Pearce Christy’s group the Christy Minstrels formed a band — banjo, fiddle, bone castanets, tambourine — that would lay the groundwork for American popular music, from bluegrass to Motown. Some of these instruments had come from Africa; on a plantation, the banjo’s body would have been a desiccated gourd. In “Doo-Dah!” his book on Foster’s work and life, Ken Emerson writes that the fiddle and banjo were paired for the melody, while the bones “chattered” and the tambourine “thumped and jingled a beat that is still heard ’round the world.”
But the sounds made with these instruments could be only imagined as black, because the first wave of minstrels were Northerners who’d never been meaningfully South. They played Irish melodies and used Western choral harmonies, not the proto-gospel call-and-response music that would make life on a plantation that much more bearable. Black artists were on the scene, like the pioneer bandleader Frank Johnson and the borderline-mythical Old Corn Meal, who started as a street vendor and wound up the first black man to perform, as himself, on a white New Orleans stage. His stuff was copied by George Nichols, who took up blackface after a start in plain-old clowning. Yet as often as not, blackface minstrelsy tethered black people and black life to white musical structures, like the polka, which was having a moment in 1848. The mixing was already well underway: Europe plus slavery plus the circus, times harmony, comedy and drama, equals Americana.
And the muses for so many of the songs were enslaved Americans, people the songwriters had never met, whose enslavement they rarely opposed and instead sentimentalized. Foster’s minstrel-show staple “Old Uncle Ned,” for instance, warmly if disrespectfully eulogizes the enslaved the way you might a salaried worker or an uncle:
Den lay down de shubble and de hoe,
Hang up de fiddle and de bow:
No more hard work for poor Old Ned —
He’s gone whar de good Niggas go,
No more hard work for poor Old Ned —
He’s gone whar de good Niggas go.
Such an affectionate showcase for poor old (enslaved, soon-to-be-dead) Uncle Ned was as essential as “air,” in the white critic Bayard Taylor’s 1850 assessment; songs like this were the “true expressions of the more popular side of the national character,” a force that follows “the American in all its emigrations, colonizations and conquests, as certainly as the Fourth of July and Thanksgiving Day.” He’s not wrong. Minstrelsy’s peak stretched from the 1840s to the 1870s, years when the country was at its most violently and legislatively ambivalent about slavery and Negroes; years that included the Civil War and Reconstruction, the ferocious rhetorical ascent of Frederick Douglass, John Brown’s botched instigation of a black insurrection at Harpers Ferry and the assassination of Abraham Lincoln.
Minstrelsy’s ascent also coincided with the publication, in 1852, of “Uncle Tom’s Cabin,” a polarizing landmark that minstrels adapted for the stage, arguing for and, in simply remaining faithful to Harriet Beecher Stowe’s novel, against slavery. These adaptations, known as U.T.C.s, took over the art form until the end of the Civil War. Perhaps minstrelsy’s popularity could be (generously) read as the urge to escape a reckoning. But a good time predicated upon the presentation of other humans as stupid, docile, dangerous with lust and enamored of their bondage? It was an escape into slavery’s fun house.
What blackface minstrelsy gave the country during this period was an entertainment of skill, ribaldry and polemics. But it also lent racism a stage upon which existential fear could become jubilation, contempt could become fantasy. Paradoxically, its dehumanizing bent let white audiences feel more human. They could experience loathing as desire, contempt as adoration, repulsion as lust. They could weep for overworked Uncle Ned as surely as they could ignore his lashed back or his body as it swung from a tree.
But where did this leave a black performer? If blackface was the country’s cultural juggernaut, who would pay Negroes money to perform as themselves? When they were hired, it was only in a pinch. Once, P.T. Barnum needed a replacement for John Diamond, his star white minstrel. In a New York City dance hall, Barnum found a boy, who, it was reported at the time, could outdo Diamond (and Diamond was good). The boy, of course, was genuinely black. And his being actually black would have rendered him an outrageous blight on a white consumer’s narrow presumptions. As Thomas Low Nichols would write in his 1864 compendium, “Forty Years of American Life,” “There was not an audience in America that would not have resented, in a very energetic fashion, the insult of being asked to look at the dancing of a real negro.”
So Barnum “greased the little ‘nigger’s’ face and rubbed it over with a new blacking of burned cork, painted his thick lips vermilion, put on a woolly wig over his tight curled locks and brought him out as ‘the champion nigger-dancer of the world.’ ” This child might have been William Henry Lane, whose stage name was Juba. And, as Juba, Lane was persuasive enough that Barnum could pass him off as a white person in blackface. He ceased being a real black boy in order to become Barnum’s minstrel Pinocchio.
After the Civil War, black performers took up minstrelsy, too, corking themselves, for both white and black audiences — with a straight face or a wink, depending on who was looking. Black troupes invented important new dances with blue-ribbon names (the buck-and-wing, the Virginia essence, the stop-time). But these were unhappy innovations. Custom obligated black performers to fulfill an audience’s expectations, expectations that white performers had established. A black minstrel was impersonating the impersonation of himself. Think, for a moment, about the talent required to pull that off. According to Henry T. Sampson’s book, “Blacks in Blackface,” there were no sets or effects, so the black blackface minstrel show was “a developer of ability because the artist was placed on his own.” How’s that for being twice as good? Yet that no-frills excellence could curdle into an entirely other, utterly degrading double consciousness, one that predates, predicts and probably informs W.E.B. Du Bois’s more self-consciously dignified rendering.
American popular culture was doomed to cycles not only of questioned ownership, challenged authenticity, dubious propriety and legitimate cultural self-preservation but also to the prison of black respectability, which, with brutal irony, could itself entail a kind of appropriation. It meant comportment in a manner that seemed less black and more white. It meant the appearance of refinement and polish. It meant the cognitive dissonance of, say, Nat King Cole’s being very black and sounding — to white America, anyway, with his frictionless baritone and diction as crisp as a hospital corner — suitably white. He was perfect for radio, yet when he got a TV show of his own, it was abruptly canceled, his brown skin being too much for even the black and white of a 1955 television set. There was, perhaps, not a white audience in America, particularly in the South, that would not have resented, in a very energetic fashion, the insult of being asked to look at the majestic singing of a real Negro.
The modern conundrum of the black performer’s seeming respectable, among black people, began, in part, as a problem of white blackface minstrels’ disrespectful blackness. Frederick Douglass wrote that they were “the filthy scum of white society.” It’s that scum that’s given us pause over everybody from Bert Williams and Bill “Bojangles” Robinson to Flavor Flav and Kanye West. Is their blackness an act? Is the act under white control? Just this year, Harold E. Doley Jr., an affluent black Republican in his 70s, was quoted in The Times lamenting West and his alignment with Donald Trump as a “bad and embarrassing minstrel show” that “served to only drive black people away from the G.O.P.”
But it’s from that scum that a robust, post-minstrel black American theater sprang as a new, black audience hungered for actual, uncorked black people. Without that scum, I’m not sure we get an event as shatteringly epochal as the reign of Motown Records. Motown was a full-scale integration of Western, classical orchestral ideas (strings, horns, woodwinds) with the instincts of both the black church (rhythm sections, gospel harmonies, hand claps) and juke joint Saturday nights (rhythm sections, guitars, vigor). Pure yet “noisy.” Black men in Armani. Black women in ball gowns. Stables of black writers, producers and musicians. Backup singers solving social equations with geometric choreography. And just in time for the hegemony of the American teenager.
Even now it feels like an assault on the music made a hundred years before it. Motown specialized in love songs. But its stars, those songs and their performance of them were declarations of war on the insults of the past and present. The scratchy piccolo at the start of a Four Tops hit was, in its way, a raised fist. Respectability wasn’t a problem with Motown; respectability was its point. How radically optimistic a feat of antiminstrelsy, for it’s as glamorous a blackness as this country has ever mass-produced and devoured.
The proliferation of black music across the planet — the proliferation, in so many senses, of being black — constitutes a magnificent joke on American racism. It also confirms the attraction that someone like Rice had to that black man grooming the horse. But something about that desire warps and perverts its source, lampoons and cheapens it even in adoration. Loving black culture has never meant loving black people, too. Loving black culture risks loving the life out of it.
And yet doesn’t that attraction make sense? This is the music of a people who have survived, who not only won’t stop but also can’t be stopped. Music by a people whose major innovations — jazz, funk, hip-hop — have been about progress, about the future, about getting as far away from nostalgia as time will allow, music that’s thought deeply about the allure of outer space and robotics, music whose promise and possibility, whose rawness, humor and carnality call out to everybody — to other black people, to kids in working class England and middle-class Indonesia. If freedom’s ringing, who on Earth wouldn’t also want to rock the bell?
In 1845, J.K. Kennard, a critic for the newspaper The Knickerbocker, hyperventilated about the blackening of America. Except he was talking about blackface minstrels doing the blackening. Nonetheless, Kennard could see things for what they were:
“Who are our true rulers? The negro poets, to be sure! Do they not set the fashion, and give laws to the public taste? Let one of them, in the swamps of Carolina, compose a new song, and it no sooner reaches the ear of a white amateur, than it is written down, amended, (that is, almost spoilt,) printed, and then put upon a course of rapid dissemination, to cease only with the utmost bounds of Anglo-Saxondom, perhaps of the world.”
What a panicked clairvoyant! The fear of black culture — or “black culture” — was more than a fear of black people themselves. It was an anxiety over white obsolescence. Kennard’s anxiety over black influence sounds as ambivalent as Lorde’s, when, all the way from her native New Zealand, she tsk-ed rap culture’s extravagance on “Royals,” her hit from 2013, while recognizing, both in the song’s hip-hop production and its appetite for a particular sort of blackness, that maybe she’s too far gone:
Every song’s like gold teeth, Grey Goose, trippin’ in the bathroom
Bloodstains, ball gowns, trashin’ the hotel room
We don’t care, we’re driving Cadillacs in our dreams
But everybody’s like Cristal, Maybach, diamonds on your timepiece
Jet planes, islands, tigers on a gold leash
We don’t care, we aren’t caught up in your love affair
Beneath Kennard’s warnings must have lurked an awareness that his white brethren had already fallen under this spell of blackness, that nothing would stop its spread to teenage girls in 21st-century Auckland, that the men who “infest our promenades and our concert halls like a colony of beetles” (as a contemporary of Kennard’s put it) weren’t black people at all but white people just like him — beetles and, eventually, Beatles. Our first most original art form arose from our original sin, and some white people have always been worried that the primacy of black music would be a kind of karmic punishment for that sin. The work has been to free this country from paranoia’s bondage, to truly embrace the amplitude of integration. I don’t know how we’re doing.
Last spring, “Old Town Road,” a silly, drowsy ditty by the Atlanta songwriter Lil Nas X, was essentially banished from country radio. Lil Nas sounds black, as does the trap beat he’s droning over. But there’s definitely a twang to him that goes with the opening bars of faint banjo and Lil Nas’s lil’ cowboy fantasy. The song snowballed into a phenomenon. All kinds of people — cops, soldiers, dozens of dapper black promgoers — posted dances to it on YouTube and TikTok. Then a crazy thing happened. It charted — not just on Billboard’s Hot 100 singles chart, either. In April, it showed up on both its Hot R&B/Hip-Hop Songs chart and its Hot Country Songs chart. A first. And, for now at least, a last.
The gatekeepers of country radio refused to play the song; they didn’t explain why. Then, Billboard determined that the song failed to “embrace enough elements of today’s country music to chart in its current version.” This doesn’t warrant translation, but let’s be thorough, anyway: The song is too black for certain white people.
But by that point it had already captured the nation’s imagination and tapped into the confused thrill of integrated culture. A black kid hadn’t really merged white music with black, he’d just taken up the American birthright of cultural synthesis. The mixing feels historical. Here, for instance, in the song’s sample of a Nine Inch Nails track is a banjo, the musical spine of the minstrel era. Perhaps Lil Nas was too American. Other country artists seemed to sense this. White singers recorded pretty tributes in support, and one, Billy Ray Cyrus, performed his on a remix with Lil Nas X himself.
The newer version lays Cyrus’s casual grit alongside Lil Nas’s lackadaisical wonder. It’s been No. 1 on Billboard’s all-genre Hot 100 singles chart since April, setting a record. And the bottomless glee over the whole thing makes me laugh, too — not in a surprised, yacht-rock way but as proof of what a fine mess this place is. One person’s sign of progress remains another’s symbol of encroachment. Screw the history. Get off my land.
Four hundred years ago, more than 20 kidnapped Africans arrived in Virginia. They were put to work and put through hell. Twenty became millions, and some of those people found — somehow — deliverance in the power of music. Lil Nas X has descended from those millions and appears to be a believer in deliverance. The verses of his song flirt with Western kitsch, what young black internetters branded, with adorable idiosyncrasy and a deep sense of history, the “yee-haw agenda.” But once the song reaches its chorus (“I’m gonna take my horse to the Old Town Road, and ride til I can’t no more”), I don’t hear a kid in an outfit. I hear a cry of ancestry. He’s a westward-bound refugee; he’s an Exoduster. And Cyrus is down for the ride. Musically, they both know: This land is their land.
We can re-engineer the system to create a new political centre, says Charles Wheelan of Dartmouth College and a former candidate for Congress
Strikingly, data from Beyond Conflict, an NGO that promotes reconciliation in conflict areas, show that Americans feel “dehumanised” by the opposing party—a sentiment often associated with political violence—at roughly the same level as Israelis and Palestinians viewed each other during the Gaza War in 2014.
Moreover democracy is being asked to deal with policy challenges that have longer time horizons and more complexity than in the past. So the urgency to fix problems can seem less apparent: it is more like termites in the basement than a collapsing roof. Complexity also opens up a space for demagoguery. Beating back Hitler was no easy feat—but the need to do so was easier to explain than why universal health care requires a health insurance mandate.
Many voters are convinced that politicians are selling them out. They have a point. When I ran for Congress in 2009 as a Democratic candidate in Illinois, I went hat-in-hand to rich donors, as all candidates must. After one meeting with a group of private-equity types, one of them pulled me aside and asked how I felt about the “taxation of carried interest”—an arcane policy that lets major investors pay less tax on their earnings.
I told the fellow that income was income, and that “carried interest” ought to be taxed the same way as everyone else’s paycheck, and not as capital gains.
“That’s too bad,” he said, and walked away. He did not write me a cheque. I lost the race.
It did not matter that I was taught by economists at the University of Chicago and took classes from three Nobel Prize winners. Political issues devolve into protecting one’s niche perks. In this case, some of the wealthiest people in the country cared about one issue: whether they could pay a lower tax rate than the people who make their lattes and mow their lawns. More cunning candidates tolerate this to get into office.
The constant need for fundraising also drives partisanship. Emails with subject-lines like “Help me strike a compromise to bring down the deficit” are certain to remain unopened. But drop into an inbox “The Republicans will end Medicare” or “The Democrats are killing babies” and the contributions will flow, helping to make the partisanship ever more toxic.
In this environment, the biggest threat to a candidate is not from an opposition party with a different set of policies but from the extremist end of his or her own party. Hence the rise of the word “primary” as a verb, as in “The Democrats may primary him.” The optimal strategy is ideological purity, even if it means getting nothing done as a legislator.
Now for the really dangerous part: changing demographics have made the electoral college and the Senate increasingly out of sync, as population grows in blue states and wanes in red ones. By 2040 it is possible that roughly 70% of Senate seats will be controlled by 30% of the population. If we are looking for something that can ignite the current partisan tinder, this is it: a prolonged period in which the political will of the majority is thwarted by a minority opposition.
There are many ways to defuse that tinder, but the two boldest ideas are to create an independent group of centrist legislators to act as “kingmakers” in passing legislation, and to implement something called “ranked-choice” voting, which would make it harder for candidates on the political extremes to win election. Consider each in turn.
First, the legislators. It is easy to imagine that a bipartisan group of prominent politicians could step aside from their parties, band together, and create a new movement of the centre. I called this the “fulcrum strategy” in my book “The Centrist Manifesto” in 2013, and it is similar to the recent moves by Labour MPs in Britain, now joined by a few Conservatives.
Just a small handful of defections would go a long way to changing America’s political dynamic. It could provide a pragmatic center of gravity, restore a shared political narrative, rebuild the connective tissue between the parties, and place a healthy check on the Trump administration and whoever comes after, in a way that is less partisan than the Democrats today.
Could it happen? Absolutely. Here is what Jeff Flake, a former Republican senator from Arizona said at a conference this month at Harvard’s Kennedy School of Government: “With three or four Rs and three or four Dems, if they come together now, or just about any time—the Senate rarely has more than a three-, four-, five- or six-person majority on either side—you could really change that place. You could create a completely different power structure. And that would be very healthy right now.”
Joel Searby, a political consultant working to rebuild the centre ground of American politics, says there is “high interest” in doing something like this. Mr Searby has met with chiefs of staff for a handful of senators, both Republicans and Democrats, to pitch the fulcrum idea. “They’re taking meetings with me in their Senate offices, and they know exactly what I’m there to talk about,” he says.
Moreover the Senate just got a new member who is less beholden to the political establishment than most: Mitt Romney, the Republican presidential candidate in 2012 who is also a former governor of Massachusetts, one of the most liberal states in the country. He has been a critic of the president from his own party. Will Mr Romney be the guy to change American politics forever? Or could it be the senator for Maine, Susan Collins? Or West Virginia’s Joe Manchin, a Democrat in a red state? It will only take a few.
The same fulcrum strategy could work at the state level. For all the talk of “red” and “blue” states, the fact is that many state legislatures are as narrowly divided as the Senate, meaning that a mere handful of centrists could band together to restore sanity.
In fact, in Alaska this just happened. After the mid-term election in 2018, a single Republican lawmaker refused to be the 21st vote that would give his party control of the 40-member state House. Instead, he negotiated a governing coalition of eight Republicans, 15 Democrats and two independents. Committee chairs will be shared across parties and there is an independent speaker of the house.
The public seems receptive to this. After all, the two most popular governors are Republicans in blue states: Larry Hogan in Maryland and Charlie Baker in Massachusetts. This suggests that there are politicians able to cross the partisan divide and that voters will embrace them.
Politics needs to evolve with the times like everything else. The Republican Party emerged to deal with the thorny issue of slavery. Emmanuel Macron built a new party in France and captured a parliamentary majority. If economists can count almost 5,000 breakfast cereals in America, why should its citizens settle for just two political parties?
Precedents exist, such as Israel’s centrist Yesh Atid party that emerged in 2012. There are also examples of tiny factions that exert outsized influence, such as small, religious parties in Israel and Japan. A centrist faction can play the same role in America.
Then there is the issue of voting. There is a powerful change that would be a force for moderation: replace the primary system with a “top four, ranked choice” voting system. Yes, it needs a better name. But it’s the best way to hold elections with multiple candidates.
It works like this: In the first round of voting, the four top vote-getters advance. In the second round, voters rank those four candidates.
If no candidate wins an outright majority, the candidate with the fewest votes is eliminated, and the ballots cast for that candidate count for their voters’ next choice instead. This process of elimination and recounting continues until one candidate holds a majority.
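The counting procedure described here is instant-runoff voting, and it is simple enough to sketch in a few lines of Python. This is only an illustration of the mechanics, not any jurisdiction’s official tabulation rules; the function name and ballot format are invented for the example, and real systems have formal rules for breaking last-place ties.

```python
from collections import Counter

def instant_runoff(ballots):
    """Count ranked ballots by instant runoff.

    Each ballot is a list of candidate names, most preferred first.
    The last-place candidate is repeatedly eliminated, with ballots
    transferring to their next surviving choice, until one candidate
    holds an outright majority of the ballots still counting.
    """
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        # Each ballot counts for its highest-ranked surviving candidate;
        # ballots whose choices are all eliminated drop out ("exhausted").
        tallies = Counter(
            next(c for c in ballot if c in candidates)
            for ballot in ballots
            if any(c in candidates for c in ballot)
        )
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:  # strictly more than half
            return leader
        # No majority yet: eliminate whoever has the fewest votes
        # (candidates with zero current votes go first).
        candidates.remove(min(candidates, key=lambda c: tallies.get(c, 0)))
```

On a toy version of the 2000 scenario mentioned below — say 49 Bush ballots, 48 Gore ballots and 3 Nader ballots ranking Gore second — no one clears 50 percent in the first round, Nader is eliminated, his ballots transfer to Gore, and Gore wins 51 to 49.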
This system has three huge advantages. First, it minimises partisanship. Candidates would no longer compete to attract support from the most ideological members of their party but from all voters, which would have a moderating influence.
Second, this would create space for new political competition since independents and third parties would no longer present a “spoiler problem”. For example, Ralph Nader would have been eliminated in 2000 after the first round; most of his votes would probably have gone to Al Gore, who then would have become president.
Third, ranked-choice voting also creates an incentive for candidates to behave more civilly because it is important to be many voters’ “second choice”. Maligning other candidates—and all the other nasty tactics of modern elections—would carry a higher price.
The political landscape could change quickly. Reforms like the “fulcrum strategy” and ranked-choice voting would make it easier for independents and members of new parties to get elected. Public support for both parties is in secular decline. And much of the partisanship is negative partisanship, meaning that party identification is driven mostly by loathing for the other side. A solid majority of Americans say that the country needs a third major political party.
Indeed, the data from Beyond Conflict found that voters believe that members of the other party think worse of them than they actually do. It turns out that we have not dehumanised each other to the same degree as the Palestinians and the Israelis; it only feels that way.
There are four faces on America’s Mount Rushmore monument: Washington, Jefferson, Lincoln, and Theodore Roosevelt. With democracy under siege, it is worth thinking about each of them.
George Washington was an independent and warned about political “factions” in his farewell address. Thomas Jefferson wrote that if he “could not go to heaven but with a party,” he “would not go there at all.” Abraham Lincoln was part of the new Republican Party that arose when the two extant parties could not manage the issue of slavery. Teddy Roosevelt made a third run for president with his own “Bull Moose Party.”
What the four leaders carved in stone share is an unease with partisanship, a willingness to challenge political orthodoxy, and an unwavering belief in democracy. Those are the right principles to bear in mind as we look to strengthen the foundations of our system.
World War I and the adversarial mentality.
It’s the eternal argument. When you are fighting a repulsive foe, the ends justify any means and serve as rationale for any selfishness.
Dax’s struggle is not to change the war or to save lives. That’s impossible. The war has won. The struggle is simply to remain a human being, to maintain some contact with goodness in circumstances that are inhumane.
Disillusionment was the classic challenge for the generation that fought and watched that war. Before 1914, there was an assumed faith in progress, a general trust in the institutions and certainties of Western civilization. People, especially in the educated classes, approached life with a gentlemanly, sporting spirit.
As Paul Fussell pointed out in “The Great War and Modern Memory,” the upper classes used genteel words in place of plain ones: slumber for sleep, the heavens for the sky, conquer for win, legion for army.
The war blew away that gentility, those ideals and that faith in progress. Ernest Hemingway captured the rising irony and cynicism in “A Farewell to Arms.” His hero is embarrassed “by the words sacred, glorious and sacrifice and the expression, in vain.” He had seen nothing sacred in the war, nothing glorious, just meaningless slaughter.
European culture suffered a massive disillusion during the conflict — no God, no beauty, no coherence, no meaning, just the cruel ironic joke of life. Cynicism breeds a kind of nihilism, a disbelief in all values, an assumption that others’ motives are bad.
Fussell wrote that the war spread an adversarial mentality. The men in the trenches were obsessed with the enemy — those anonymous creatures across no man’s land who rained down death. “Prolonged trench warfare, whether enacted or remembered, fosters paranoid melodrama,” he wrote.
The “versus habit” construes reality as us versus them — a mentality that spread through British society. It was the officers versus the men, and, when they got home, the students at university versus the dons.
George Orwell wrote that he recognized the Great War mentality lingering even in the 1930s in his own left-wing circles — the same desire to sniff out those who departed from party orthodoxy, the same retelling of mostly false atrocity stories, the same war hysteria. As Christopher Isherwood put it, all the young people who were ashamed of never having fought in the war brought warlike simplicities to political life.
Some of the disillusioned drop out of public life, since it’s all meaningless. But others want to burn it all down because it’s all rotten. Moderation is taken for cowardice. Aggression is regarded as courage. No conciliatory word is permitted when a fighting word will do.
Today we face no horrors equal to the Great War, but there is the same loss of faith in progress, the reality of endless political trench warfare, the paranoid melodrama, the specter that we are all being dehumanized amid the fight.
It is a stunning turnabout. A party that once spoke with urgency and apparent conviction about the importance of ethical leadership — fidelity, honesty, honor, decency, good manners, setting a good example — has hitched its wagon to the most thoroughly and comprehensively corrupt individual who has ever been elected president. Some of the men who have been elected president have been unscrupulous in certain areas — infidelity, lying, dirty tricks, financial misdeeds — but we’ve never before had the full-spectrum corruption we see in the life of Donald Trump.
.. And the moral indictment against Mr. Trump is obvious and overwhelming. Corruption has been evident in Mr. Trump’s private and public life: in how he has treated his wives, in his business dealings and scams, in his pathological lying and cruelty, in his bullying and shamelessness, and in his conspiracy-mongering and appeals to the darkest impulses of Americans. (Senator Bob Corker, a Republican, refers to the president’s race-based comments as a “base stimulator.”)
Mr. Trump’s corruptions are ingrained, the result of a lifetime of habits. It was delusional to think he would change for the better once he became president.
.. Some of us who have been lifelong Republicans and previously served in Republican administrations held out a faint hope that our party would at some point say “Enough!”; that there would be some line Mr. Trump would cross, some boundary he would transgress, some norm he would shatter, some civic guardrail he would uproot, some action he would take, some scheme or scandal he would be involved in that would cause large numbers of Republicans to break with the president. No such luck. Mr. Trump’s corruptions have therefore become theirs. So far there’s been no bottom, and there may never be.
.. the Republican Party’s as-yet unbreakable attachment to Mr. Trump is coming at quite a cost. There is the rank hypocrisy, the squandered ability to venerate public character or criticize Democrats who lack it, and the damage to the white Evangelical movement, which has for the most part enthusiastically rallied to Mr. Trump and as a result has been largely discredited.
.. Mr. Trump and the Republican Party are right now the chief emblem of corruption and cynicism in American political life, of an ethic of might makes right. Dehumanizing others is fashionable and truth is relative. (“Truth isn’t truth,” in the infamous words of Mr. Trump’s lawyer Rudy Giuliani.) They are stripping politics of its high purpose and nobility.
.. A warning to my Republican friends: The worst is yet to come. Thanks to the work of Robert Mueller — a distinguished public servant, not the leader of a “group of Angry Democrat Thugs” — we are going to discover deeper and deeper layers to Mr. Trump’s corruption. When we do, I expect Mr. Trump will unravel further as he feels more cornered, more desperate, more enraged; his behavior will become ever more erratic, disordered and crazed.
Most Republicans, having thrown their MAGA hats over the Trump wall, will stay with him until the end. Were a tax cut, deregulation and court appointments really worth all this?
.. Rather than applying objective standards that resonate with American law and American traditions of respect for free speech and the marketplace of ideas, the companies applied subjective standards that are subject to considerable abuse. Apple said it “does not tolerate hate speech.” Facebook accused Mr. Jones of violating policies against “glorifying violence” or using “dehumanizing language to describe people who are transgender, Muslims and immigrants.” YouTube accused Mr. Jones of violating policies against “hate speech and harassment.”
.. Most appallingly, he has insisted that these grieving families were faking their pain: “I’ve looked at it and undoubtedly there’s a cover-up, there’s actors, they’re manipulating, they’ve been caught lying and they were preplanning before it and rolled out with it.”
.. “Hate speech” is extraordinarily vague and subjective. Libel and slander are not. In the name of stopping hate speech, university mobs have turned their ire not just against alt-right figures like Milo Yiannopoulos and Richard Spencer, but also against the most mainstream of conservative voices, like Ben Shapiro and Heather Mac Donald. Dissenting progressives aren’t spared, either. Just ask Evergreen State College’s Bret Weinstein, who was hounded out of a job after refusing to participate in a “day of absence” protest in which white students and faculty members were supposed to leave campus for the day to give students and faculty members of color exclusive access to the college.
.. The far better option would be to prohibit libel or slander on their platforms. Unlike “hate speech,” libel and slander have legal meanings. There is a long history of using libel and slander laws to protect especially private figures from false claims. It’s properly more difficult to use those laws to punish allegations directed at public figures, but even then there are limits on intentionally false factual claims.
I don’t know about you, but I find American life these days positively exhausting. Everything is always trying to wind you up, from political tweets and cable news to sports debate shows, thrill-ride movies and Internet headlines that will say anything to make you click on a link. Small wonder that many people are looking for things that don’t do that, but that offer what we might call counterprogramming to our whole troll-infested culture.
Audiences have found that in what may be the summer’s most surprising and beloved hits – “Won’t You Be My Neighbor,” Morgan Neville’s moving documentary about Fred Rogers, the creator and star of “Mister Rogers’ Neighborhood,” and “Nanette,” starring the Australian comic Hannah Gadsby, which has been called transformative by viewers, critics and her fellow comedians.
.. Born into money, ordained as a Christian minister, registered as a lifelong Republican, Rogers turned out to be a gentle radical whose mission was to embody and promote humane values. As Neville shows, “Mister Rogers’ Neighborhood” was inspired by Rogers’ dismay at the existing television shows for children, which he thought degrading, fatuous, thoughtlessly violent and designed to transform kids into consumers.
.. Then, she shifts gears, and we discover a value she shares with Fred Rogers, a refusal to play along with the rules of the medium of which they are a part. Just as he thought ordinary TV demeaned children, Gadsby explains why she can no longer do stand-up. She argues that stand-up works by ratcheting up tension with psychologically fraught material then releasing it with a punchline. And the demands of this process, tension and release, keep you from saying anything that doesn’t fit into that pattern.
.. Neither Gadsby nor Rogers is a scold who hates art, which is, after all, a way of expressing feelings and truths that can’t be fully expressed any other way. In fact, both are consciously artful in what they do. But they also suggest that too much commercial entertainment is dehumanizing because it’s all about prompting an instantly pleasurable reaction. “Won’t You Be My Neighbor” and “Nanette” do precisely the opposite. They’re humanizing.
Both the Christian religion and American psyche need deep cleansing and healing from our many unhealed wounds. Only a contemplative mind can hold our fear, confusion, vulnerability, and anger and guide us toward love.
Contemplative Christians can model a way of building a collaborative, compassionate politics. I suggest we start by reclaiming the wisdom of Trinity, a circle dance of mutuality and communion. Humans—especially the powerful, the wealthy, and supporters of the patriarchal system—are more comfortable with a divine monarch at the top of pyramidal reality. So Christians made Jesus into a distant, imperial God rather than a living member of divine-human relationship.
.. Isaiah tried to teach such servanthood to Israel in the classic four “servant songs.” But Hebrew history preceded what Christianity repeated: both traditions preferred kings, wars, and empires instead of suffering servanthood or leveling love.
.. We believe our elected officials are called to public service, not public tyranny, so we must protect the limits, checks, and balances of democracy and encourage humility and civility on the part of elected officials. . . .
We reject any moves toward autocratic political leadership and authoritarian rule. . . . Disrespect for the rule of law, not recognizing the equal importance of our three branches of government, and replacing civility with dehumanizing hostility toward opponents are of great concern to us. Neglecting the ethic of public service and accountability, in favor of personal recognition and gain often characterized by offensive arrogance, is not just a political issue for us. It raises deeper concerns about political idolatry, accompanied by false and unconstitutional notions of authority.
.. We already have all the power (dynamis) we need both within us and between us—in fact, Jesus assures us that we are already “clothed” in it “from on high” (see Luke 24:49)!