It’s social media in the age of “patriotic trolling” in the Philippines, where the government is waging a campaign to destroy a critic—with a little help from Facebook itself.
The phenomenon involves targeted harassment and propaganda meant to go viral and to give the impression of a groundswell of organic support for the government. Much of the trolling is carried out by true believers, but there is evidence that some governments, including Duterte’s, pay people to execute attacks against opponents. Trolls use all the social media platforms—including Twitter, Instagram, and YouTube, in addition to the comments sections of news sites. But in the Philippines, Facebook is dominant.
Ressa exposed herself to this in September 2016, a little more than three months after the election. On a Friday night, a bomb ripped through a night market in Davao City, Duterte’s hometown, killing 14 and injuring dozens more. Within hours, Duterte implemented a nationwide state of emergency. That weekend, the most-read story on Rappler was an archived item about the arrest of a man caught planting an improvised explosive device, also in Davao City. The article had been written six months earlier, and the incident had no connection to the night market bombing—but it was circulating on the same Facebook pages that promoted Duterte’s presidency, and people were commenting on it as if to justify the state of emergency.
The Rappler data team had spent months keeping track of the Facebook accounts that were going after critics of Duterte. Now Ressa found herself following the trail of her own critics as well. She identified 26 accounts that were particularly virulent. They were all fake (one account used a photo of a young woman who was actually a Korean pop star) and all followed one another. The 26 accounts were posting nearly the exact same content, which was also appearing on faux-news sites such as Global Friends of Rody Duterte and Pinoy Viral News.
The messages being posted consistently linked back to pro-Duterte pages. Ressa and her team put all these accounts into a database, which grew rapidly as they began automating the collection of information, scraping Facebook pages and other public sites. They took to calling their database the Shark Tank. Today it contains more than 12 million accounts that have created or distributed pro-Duterte messages or fake news. Ressa isn’t sure how many of these accounts are fake.
Even in the U.S., where Facebook has been hauled before Congress to explain its role in a Russian disinformation campaign designed to influence the U.S. presidential election, the company doesn’t have a clear answer for how it will stem abuse. It says it will add 10,000 workers worldwide to handle security issues, increase its use of third-party fact-checkers to identify fake news, and coordinate more closely with governments to find sources of misinformation and abuse. But the most challenging questions—such as what happens when the government itself is a bad actor and where to draw the line between free speech and a credible threat of violence—are beyond the scope of these fixes. What stays and what goes from the site is still decided subjectively, often by third-party contractors—many of them stationed, as it happens, in the Philippines, a long-standing outsourcing hub.
Facebook is inherently conflicted. It promises advertisers it will deliver interested and engaged users—and often what is interesting and engaging is salacious, aggressive, or simply false. “I don’t think you can underestimate how much of a role they play in societal discourse,” says Carly Nyst, a London-based consultant on technology and human rights who has studied patriotic trolling around the world. “This is a real moment that they have to take some responsibility. These tools they’ve promised as tools of communication and connection are being abused.”
Facebook’s executives say the company isn’t interested in being an arbiter of truth, in part because it doesn’t want to assume the role of censor or be seen as having an editorial opinion that may alienate users. Nonetheless, it’s been under increasing pressure to act. In the Philippines, it began conducting safety workshops in 2016 to educate journalists and nongovernmental organization workers. These cover the basics: an overview of the company’s community standards policies, how to block a harasser, how to report abusive content, how to spot fake accounts and other sources of misinformation. The company has increased the number of Tagalog speakers on its global Community Operations team in an effort to better root out local slurs and other abusive language.
Still, Facebook maintains that part of the problem in the Philippines is simply that the country has come online fast and hasn’t yet absorbed the emerging norms of the internet. In October the company offered a “Think Before You Share” workshop for Filipino students, which focused on teaching them “digital literacy” skills, including critical thinking, empowerment, kindness, and empathy.
Nyst says this amounts to “suggesting that digital literacy should also encapsulate the ability to distinguish between state-sponsored harassment and fake news and genuine content.” The company, she says, “is taking the position that it is individuals who are at fault for being manipulated by the content that appears on Facebook’s platform.”
Rappler was born on Facebook and lives there still—it’s the predominant source of Rappler’s traffic. So Ressa finds herself in an awkward spot. She has avoided rocking the boat, because she worries that one of the most powerful companies in the world could essentially crush her. What if Facebook tweaked the algorithm for the Rappler page, causing traffic to plummet? What if it selectively removed monetization features critical to the site’s success? “There’s absolutely no way we can tell what they’re doing, and they certainly do not like being criticized,” she says. But after more than a year of polite dialogue with Facebook, she grew impatient and frustrated.
On a trip to Washington in early November, she met with several lawmakers, telling them that she believes Facebook is being used by autocrats and repressive regimes to manipulate public opinion and that the platform has become a tool for online hooliganism. She did the same in a speech at a dinner hosted by the National Democratic Institute, where Rappler was presented with an award for “being on the front lines of fighting the global challenge of disinformation and false news.”
As she accepted her award, Ressa recalled that she started as a journalist in the Philippines in 1986, the year of the People Power Revolution, an uprising that ultimately led to the departure of Ferdinand Marcos and the move from authoritarian rule to democracy. Now she’s worried that the pendulum is swinging back and that Facebook is hastening the trend. “They haven’t done anything to deal with the fundamental problem, which is they’re allowing lies to be treated the same way as truth and spreading it,” she says. “Either they’re negligent or they’re complicit in state-sponsored hate.”
In November, Facebook announced a new partnership with the Duterte government. As part of its efforts to lay undersea cables around the world, Facebook agreed to team up with the government to work on completing a stretch bypassing the notoriously challenging Luzon Strait, where submarine cables in the past have been damaged by typhoons and earthquakes. Facebook will fund the underwater links to the Philippines and provide a set amount of bandwidth to the government. The government will build cable landing stations and other necessary infrastructure.
That’s the sort of big project Facebook embraces. It’s also testing a solar-powered drone that will beam the internet to sub-Saharan Africa and has a team of engineers working on a brain implant to allow users to type with their minds. To Ressa, Facebook looks like a company that will take on anything, except protecting people like her. —With Sarah Frier and Michael Riley