Civil discourse is in decline, with potentially dire results for American democracy.
People born after 1995, especially on the coasts and in Chicago, report heightened anxiety and fear.
Missing kids on milk cartons fueled a culture of parental fear and overprotection.
We deprived kids of the chance to develop their normal risk-taking abilities.
Social media reaches kids at 11, 12, and 13, and this stresses them.
- Imagine the absolute worst of junior high school, 24 hours a day, forever.
- Social media creates an echo chamber that delivers a dopamine rush.
(30 min) Some people look to interpret things in the worst possible light and to call others out.
There is no trust.
There are more conservatives and more liberals, and fewer moderates.
(34 min) Upper-class liberals are reporting lower-class minority workers for being insensitive.
3 Great Untruths:
- What doesn’t kill you makes you weaker.
- Always trust your feelings.
- Life is a battle between good people and evil people.
Many of the people most passionate about aggressive speech policing belong to the upper-class liberal elite.
All social apps grow until you need a newsfeed
All newsfeeds grow until you need an algorithmic feed
All algorithmic feeds grow until you get fed up with not seeing stuff/seeing the wrong stuff & leave for new apps with less overload
All those new apps grow until…
A pessimist might say this looks like slash & burn agriculture, or perhaps the old joke ‘No-one goes there anymore – it’s too crowded.’ That is, for social, Metcalfe’s Law might look more like a bell curve. I don’t know what the next product here will be (I didn’t create Snap, after all). But tech like this tends to move in cycles – we swing from one kind of expression to another and back again, and we might be swinging away from the feed.
Finally, any such changes have consequences for the traffic that sharing creates. ‘Like’ buttons made it frictionless to post any web page you want into your feed and push it to (some arbitrarily calculated percentage of) your friends, and many hands have been wrung about how much traffic this can drive and how Facebook moves things up and down the feed ranking. But sharing links inside Stories isn’t the same, today, and a link you share in a WhatsApp or iMessage group with 5 friends will only be seen by them, and Facebook has no lever to pull to make this more or less visible. On the other hand, the ‘WhatsApp forward’ can take such a link and send it viral across a country, and where Facebook can ultimately kill a link or an entire source across the whole site if it really wants to, it’s very different for a P2P messaging app to make that call (outside China, of course). That is, the plea from many media companies to ‘up-rank’ their posts in the newsfeed – to make people eat their greens – and to kill ‘fake news’ links is at least theoretically possible on Facebook. It’s not possible in iMessage – with end-to-end encryption, Apple has no idea what you’re sharing.
The year is 2016. The place, Facebook. A 30-something man is scrolling through his newsfeed when he sees the inflammatory headline of a news article bashing the presidential candidate he supports. Angrily, the man glances up to see who posted the article, hesitates momentarily, and then “unfriends” the “friend” he has not seen since high school.
Is the man right to remove the offending presence? After all, the article was clearly biased, and discussing politics over social media never changes anyone’s mind, right?
.. The idea of social media echo chambers has garnered much attention lately. The concept of “confirmation bias” — the instinct to seek information that supports a current belief or conviction — has long been established in the world of science, and is something to be avoided whenever searching for truth. But in social media, this bias is propagated simply by reading, liking, and sharing content that acts to support those convictions we already hold, while avoiding content that challenges our beliefs. Essentially, we begin to isolate ourselves from those opposing opinions until we’re surrounded with people who agree with us.
.. “That’s a problem for Christians,” Goforth continues. “It’s like I want the simplistic kind of thing. Keep me out of the grays. But my view is, the grays are where the interesting things are, and it’s also where God can do things. When you are uncertain and when things are confusing you turn to God. So it’s an opportunity for Him to work in our lives.”
But it doesn’t stop there. In today’s age of data tracking, each like or click provides search engines and social media sites with information about the kinds of things we like and then works to provide us with more of the same, further insulating us from news or opinions we don’t want to see.
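The "more of the same" mechanic described above can be sketched as a naive affinity-based ranker. This is an illustrative toy, not any platform's actual algorithm, and all names (`rank_feed`, the topic labels) are hypothetical: count a user's likes per topic, then sort new items by how often the user has liked that topic.

```python
from collections import Counter

def rank_feed(liked_topics, candidate_items):
    """Naive 'more of the same' ranker: items whose topic the user
    has liked most often float to the top of the feed."""
    affinity = Counter(liked_topics)  # topic -> number of likes
    return sorted(candidate_items,
                  key=lambda item: affinity[item["topic"]],
                  reverse=True)

# A user who has mostly liked one side's content...
likes = ["politics_left", "politics_left", "cooking"]
feed = rank_feed(likes, [
    {"id": 1, "topic": "politics_right"},
    {"id": 2, "topic": "politics_left"},
    {"id": 3, "topic": "cooking"},
])
# Opposing content sinks to the bottom; each click on the top items
# feeds back into `likes`, tightening the loop.
```

Even in this crude form, the feedback loop is visible: the ranking determines what gets seen, what gets seen determines what gets liked, and the likes determine the next ranking.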
The theory is we become stuck in a feedback loop, liberated from the uncomfortable experience of confronting ideas or beliefs that oppose or challenge our own. And the theory is true, if only to an extent.
.. The researchers concluded that, “Unlike news, where there is solid evidence that people seek out ideologically consistent viewpoints, social media functions differently, perhaps driven by different motivations for use. [Social media users] may come to learn that their friends don’t agree with them politically but recognize that disagreement isn’t a deal breaker, hence fostering some attenuation of dislike for people we disagree with.”
This recognition — that political views don’t have to make or break a relationship — is a great example of embracing “the grays” that Goforth referred to. It requires inhabiting a space of tension, holding firmly to that view we profess, while also valuing the human being across from us enough to be drawn into a conversation, rather than walling ourselves off from the “opposition.”
If that mutual respect is demonstrated, social media can be used as a tool to foster community. Of course, if the subject of our ire continuously perpetuates a disregard for the value of others, perhaps engaging that individual over social media will prove fruitless, in which case the unfollow or unfriend options become reasonable.
However, in most instances, what we get out of social media is what we put in. Therefore, if used intentionally, it can prove to create opportunities to genuinely engage with others. Sending a heartfelt message, rather than a quippy reply to a challenging post, demonstrates a willingness to connect beyond a public fray. Yes, this requires more effort, but the payoff is real relationships in which God can move.
By turning away from what’s easiest and stepping into the gray areas, the unknown, where no one person has everything right, we allow God to work in our lives and mold us into the people He created us to be — people who are humble and open, and who acknowledge the inherent value of others.
It’s social media in the age of “patriotic trolling” in the Philippines, where the government is waging a campaign to destroy a critic—with a little help from Facebook itself.
The phenomenon, sometimes referred to as “patriotic trolling,” involves the use of targeted harassment and propaganda meant to go viral and to give the impression that there is a groundswell of organic support for the government. Much of the trolling is carried out by true believers, but there is evidence that some governments, including Duterte’s, pay people to execute attacks against opponents. Trolls use all the social media platforms—including Twitter, Instagram, and YouTube, in addition to the comments sections of news sites. But in the Philippines, Facebook is dominant.
Ressa exposed herself to this in September 2016, a little more than three months after the election. On a Friday night, a bomb ripped through a night market in Davao City, Duterte’s hometown, killing 14 and injuring dozens more. Within hours, Duterte implemented a nationwide state of emergency. That weekend, the most-read story on Rappler was an archived item about the arrest of a man caught planting an improvised explosive device, also in Davao City. The article had been written six months earlier, and the incident had no connection to the night market bombing—but it was circulating on the same Facebook pages that promoted Duterte’s presidency, and people were commenting on it as if to justify the state of emergency.
.. The Rappler data team had spent months keeping track of the Facebook accounts that were going after critics of Duterte. Now Ressa found herself following the trail of her own critics as well. She identified 26 accounts that were particularly virulent. They were all fake (one account used a photo of a young woman who was actually a Korean pop star) and all followed one another. The 26 accounts were posting nearly the exact same content, which was also appearing on faux-news sites such as Global Friends of Rody Duterte and Pinoy Viral News.
The messages being posted consistently linked back to pro-Duterte pages. Ressa and her team put all these accounts into a database, which grew rapidly as they began automating the collection of information, scraping Facebook pages and other public sites. They took to calling their database the Shark Tank. Today it contains more than 12 million accounts that have created or distributed pro-Duterte messages or fake news. Ressa isn’t sure how many of these accounts are fake.
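The article doesn't describe Rappler's tooling, but one signal the team relied on — many accounts posting nearly the exact same content — can be sketched with a simple grouping pass. Everything here (the function name, the account names, the crude normalization) is a hypothetical illustration; a real pipeline would need fuzzier text matching and more signals than identical wording.

```python
from collections import defaultdict

def cluster_by_content(posts):
    """Group accounts that publish (near-)identical text.
    `posts` is a list of (account, text) pairs; normalization here is
    just lowercasing and whitespace collapsing -- a stand-in for the
    fuzzier matching a production system would need."""
    clusters = defaultdict(set)
    for account, text in posts:
        key = " ".join(text.lower().split())
        clusters[key].add(account)
    # Only clusters with several distinct accounts are suspicious.
    return {k: v for k, v in clusters.items() if len(v) >= 3}

suspicious = cluster_by_content([
    ("acct_a", "Duterte is the BEST president!"),
    ("acct_b", "duterte is the best  president!"),
    ("acct_c", "Duterte is the best president!"),
    ("acct_d", "Unrelated post about the weather"),
])
# One cluster of three accounts posting the same message survives.
```

The design choice mirrors what Ressa's team observed: identical content across distinct accounts is cheap to detect once posts are collected, which is why scraped public pages were enough to seed the Shark Tank.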
Even in the U.S., where Facebook has been hauled before Congress to explain its role in a Russian disinformation campaign designed to influence the U.S. presidential election, the company doesn’t have a clear answer for how it will stem abuse. It says it will add 10,000 workers worldwide to handle security issues, increase its use of third-party fact-checkers to identify fake news, and coordinate more closely with governments to find sources of misinformation and abuse. But the most challenging questions—such as what happens when the government itself is a bad actor and where to draw the line between free speech and a credible threat of violence—are beyond the scope of these fixes. What stays and what goes from the site is still decided subjectively, often by third-party contractors—many of them stationed, as it happens, in the Philippines, a long-standing outsourcing hub.
Facebook is inherently conflicted. It promises advertisers it will deliver interested and engaged users—and often what is interesting and engaging is salacious, aggressive, or simply false. “I don’t think you can underestimate how much of a role they play in societal discourse,” says Carly Nyst, a London-based consultant on technology and human rights who has studied patriotic trolling around the world. “This is a real moment that they have to take some responsibility. These tools they’ve promised as tools of communication and connection are being abused.”
.. Facebook’s executives say the company isn’t interested in being an arbiter of truth, in part because it doesn’t want to assume the role of censor or be seen as having an editorial opinion that may alienate users. Nonetheless, it’s been under increasing pressure to act. In the Philippines, it began conducting safety workshops in 2016 to educate journalists and nongovernmental organization workers. These cover the basics: an overview of the company’s community standards policies, how to block a harasser, how to report abusive content, how to spot fake accounts and other sources of misinformation. The company has increased the number of Tagalog speakers on its global Community Operations team in an effort to better root out local slurs and other abusive language.
Still, Facebook maintains that an aspect of the problem in the Philippines is simply that the country has come online fast and hasn’t yet learned the emergent rules of the internet. In October the company offered a “Think Before You Share” workshop for Filipino students, which focused on teaching them “digital literacy” skills, including critical thinking, empowerment, kindness, and empathy.
Nyst says this amounts to “suggesting that digital literacy should also encapsulate the ability to distinguish between state-sponsored harassment and fake news and genuine content.” The company, she says, “is taking the position that it is individuals who are at fault for being manipulated by the content that appears on Facebook’s platform.”
.. Rappler was born on Facebook and lives there still—it’s the predominant source of Rappler’s traffic. So Ressa finds herself in an awkward spot. She has avoided rocking the boat, because she worries that one of the most powerful companies in the world could essentially crush her. What if Facebook tweaked the algorithm for the Rappler page, causing traffic to plummet? What if it selectively removed monetization features critical to the site’s success? “There’s absolutely no way we can tell what they’re doing, and they certainly do not like being criticized,” she says. But after more than a year of polite dialogue with Facebook, she grew impatient and frustrated.
On a trip to Washington in early November, she met with several lawmakers, telling them that she believes Facebook is being used by autocrats and repressive regimes to manipulate public opinion and that the platform has become a tool for online hooliganism. She did the same in a speech at a dinner hosted by the National Democratic Institute, where Rappler was presented with an award for “being on the front lines of fighting the global challenge of disinformation and false news.”
As she accepted her award, Ressa recalled that she started as a journalist in the Philippines in 1986, the year of the People Power Revolution, an uprising that ultimately led to the departure of Ferdinand Marcos and the move from authoritarian rule to democracy. Now she’s worried that the pendulum is swinging back and that Facebook is hastening the trend. “They haven’t done anything to deal with the fundamental problem, which is they’re allowing lies to be treated the same way as truth and spreading it,” she says. “Either they’re negligent or they’re complicit in state-sponsored hate.”
.. In November, Facebook announced a new partnership with the Duterte government. As part of its efforts to lay undersea cables around the world, Facebook agreed to team up with the government to work on completing a stretch bypassing the notoriously challenging Luzon Strait, where submarine cables in the past have been damaged by typhoons and earthquakes. Facebook will fund the underwater links to the Philippines and provide a set amount of bandwidth to the government. The government will build cable landing stations and other necessary infrastructure.
That’s the sort of big project Facebook embraces. It’s also testing a solar-powered drone that will beam the internet to sub-Saharan Africa and has a team of engineers working on a brain implant to allow users to type with their minds. To Ressa, Facebook looks like a company that will take on anything, except protecting people like her. —With Sarah Frier and Michael Riley