Violence is part of human nature, too, but the technology of the atom bomb multiplies the danger for us all. So too with these newsfeed algorithms, which favor engagement above everything else, no matter how base and degraded the content.
I’ve been thinking a lot about this interview with Jaron Lanier, and I’ll just share an excerpt because I think it provides some insight: “The problem, however, is that behind the scenes there are these manipulation, behavior modification, and addiction algorithms that are running. And these addiction algorithms are blind. They’re just dumb algorithms. What they want to do is take whatever input people put into the system and find a way to turn it into the most engagement possible. And the most engagement comes from the startle emotions, like fear and anger and jealousy, because they tend to rise the fastest and then subside the slowest in people, and the algorithms are measuring people very rapidly, so they tend to pick up and amplify startle emotions over slower emotions like the building of trust or affection.” https://lareviewofbooks.org/article/delete-your-account-a-co…!
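Lanier's point can be made concrete with a toy simulation. This is my own illustrative sketch, not Lanier's model or any platform's actual code: the post names, curves, and window sizes are all invented. The idea is that a ranker scoring posts on engagement measured in a short window will promote fast-spiking "startle" content over slow-building content, even when the slow content earns more engagement in the long run.

```python
import math

def engagement(post, t):
    """Cumulative engagement of a post at time t (arbitrary units).

    Both curves saturate at the post's total appeal; 'rate' controls
    how quickly reactions arrive.
    """
    return post["total"] * (1 - math.exp(-post["rate"] * t))

posts = [
    # Startle content: modest total appeal, but reactions spike immediately.
    {"name": "outrage", "total": 100, "rate": 2.0},
    # Trust/affection content: larger total appeal, but it builds slowly.
    {"name": "trust",   "total": 300, "rate": 0.05},
]

# The ranker only watches a short window after posting ("measuring
# people very rapidly"), so it scores on early engagement alone.
SHORT_WINDOW = 1.0
ranked = sorted(posts, key=lambda p: engagement(p, SHORT_WINDOW), reverse=True)
print("promoted first:", ranked[0]["name"])   # the fast-spiking post wins

# Over a long horizon the ordering flips: the slow-building post
# accumulates far more total engagement, but the ranker never waits.
LONG_WINDOW = 100.0
ranked_long = sorted(posts, key=lambda p: engagement(p, LONG_WINDOW), reverse=True)
print("best over time:", ranked_long[0]["name"])
```

The algorithm is "blind" in exactly Lanier's sense: nothing in it knows what outrage or trust is; it only sees which curve rises fastest inside its measurement window.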
During the 2016 campaign, Zeynep Tufekci was watching videos of Donald Trump rallies on YouTube. But then, she writes, she “noticed something peculiar. YouTube started to recommend and ‘autoplay’ videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.”
And it wasn’t just Trump videos. Watching Hillary Clinton rallies got her “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.” Nor was it just politics. “Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.”
Tufekci is a New York Times columnist and a professor at the University of North Carolina. She’s also one of the clearest thinkers around on how digital platforms work, how their algorithms understand and shape our preferences, and what the consequences are for society. So as we learn that Facebook is detecting new efforts at electoral manipulation and as we watch online politics become ever more bitter and divisive, I wanted to talk with Tufekci about how digital platforms have become engines of radicalization, and what we can do about it.
In an oral culture, memory is prized.
In a social media culture, attention-getting is prized. The Kardashians do this. Trump, an ex-reality-television star, excelled at exactly this. Tufekci thinks this strategy won't work well for most people, because attention-grabbing messages get misunderstood; you don't have control over where the attention goes.
What is this media training us to do? It is rewarding attention-grabbing with political power and money. Politicians try to get attention without letting it take over.
The space is so crowded, so competitive.
What really wins when thousands of things are competing? (28:50 min)
Things that outrage or excite core identities. Really funny, mean, or shocking.
We are taught to believe that competition always produces better outcomes. But the more we train people to win this attention war, the easier it is to see why so much content falls along identity lines, or toward the funny, mean, and shocking.
Every company knows the power of the default.
The most effective forms of censorship involve messing with trust and attention.
Is censorship the right word? People are asking this of Facebook and Google.
What to do with Alex Jones and what to call him?
3 degrees of Alex Jones: you can start almost anywhere on Facebook, and within a few recommendation hops Alex Jones will be recommended.
With InfoWars, the problem is targeting people with violent incitement: claiming that the Sandy Hook parents are actors, that their kids' shooting was staged, and that they pretended it occurred so the government could take your guns away.
They are not governments; they are gatekeepers.
Ted Cruz has allied himself with someone who claimed that Cruz's own father helped kill JFK.
We need forms of due process.
In the first hours after the Texas school shooting that left at least 10 dead Friday, online hoaxers moved quickly to spread a viral lie, creating fake Facebook accounts with the suspected shooter’s name and a doctored photo showing him wearing a “Hillary 2016” hat.
Several were swiftly flagged by users and deleted by the social network. But others rose rapidly in their place: Chris Sampson, a disinformation analyst for a counterterrorism think tank, said he could see new fakes as they were being created and filled out with false information, including images linking the suspect to the anti-fascist group Antifa.
It has become a familiar pattern in the all-too-common aftermath of U.S. school shootings: A barrage of online misinformation, seemingly designed to cloud the truth or win political points.
... But some social media watchers said they were still surprised at the speed with which the Santa Fe shooting descended into information warfare. Sampson said he watched the clock after the suspect was first named by police to see how long it would take for a fake Facebook account to be created in the suspect's name: less than 20 minutes... Some critics suggested the site should force new accounts into a waiting period before they are publicly available, or that the company should more aggressively watch names in the news for potential fakes.