THANKS TO GLOBE-SPANNING SOCIAL PLATFORMS like Facebook, YouTube, and Twitter, misinformation (any wrong information) and disinformation (intentional misinformation, such as propaganda) have never been able to spread so rapidly or so far, powered by algorithms and automated filters. But misinformation expert Joan Donovan, who runs the Technology and Social Change Research Project at Harvard’s Shorenstein Center, says social media platforms are not the only ones playing a critical role in perpetuating the misinformation problem. Journalists and media companies also contribute, Donovan says, because they frequently amplify misinformation when they cover it and the bad actors who create it, often without considering the impact of their coverage.
There is clearly more misinformation around than in previous eras, Donovan tells CJR in a recent interview on our Galley discussion platform, because there’s just a lot more media, and therefore a lot more opportunity to distribute it. “But quantity never really matters unless there is significant attention to the issue being manipulated,” she says. “So this is where my research is fundamentally about journalism and not about audiences. Trusted information brokers, like journalists and news organizations, are important targets for piggybacking misinformation campaigns into the public sphere.”
Donovan’s research looks at how trolls and others—whether they are government-backed or freelance—can use techniques including “social engineering” (lying to or manipulating someone to achieve a specific outcome) and low-level hacking to persuade journalists and news outlets of the newsworthiness of a specific campaign. “Once that story gets picked up by a reputable outlet, it’s game time,” she says. Donovan and other misinformation experts warned that the lengthy essay the Christchurch shooter published to justify his March attack was clearly designed to get as much media attention as possible, by playing on certain themes and popular topics, and they advised media outlets not to play into this strategy by quoting from it.
Before she joined the Shorenstein Center at Harvard last year, Donovan was a member of the research group Data & Society, where she led the Media Manipulation Initiative, mapping how interest groups, governments, and political operatives use the internet and the media to intentionally manipulate messages. Data & Society published an extensive report on the problem last year, written by Syracuse University media studies professor Whitney Phillips, entitled “The Oxygen of Amplification,” with advice on how to cover topics like white supremacy and the alt-right without giving them more credibility in the process.
“Sometimes, I want to throw my hands in the air and grumble, ‘We know what we know from history!’ Journalists are not outside of society. In fact, they are the most crucial way the public makes sense of the world,” Donovan writes in her Galley interview. “When journalists pay attention to a particular person or issue, we all do… and that has reverberating effects.” As part of her postdoctoral research, Donovan looked at racial violence and media coverage in the 1960s and 1970s, when the Ku Klux Klan was active. “The Klan had a specific media strategy to cultivate journalists for positive coverage of their events,” Donovan says. “As journalists pivoted slowly to covering the civil rights movement with a sympathetic tone, Klan violence rises—but also public spectacles, torch marches, and cross burnings. These acts are often done with the potential for media coverage in mind.”
While mass shootings are clearly newsworthy, Donovan says, the internet introduces a new dynamic where all stories on a topic are instantly available to virtually anyone anywhere around the globe. And the fact that they are shared and re-shared and commented on via half a dozen different social networks means that “journalists quickly lose control over the reception of their work,” she says. “This is why it is even more crucial that journalists frame stories clearly and avoid embedding and hyperlinking to known online spaces of radicalization.” Despite this kind of advice from Donovan and others, including sociologist Zeynep Tufekci, a number of media outlets linked to the Christchurch shooter’s writings, and at least one even included a clip from the live-streamed video of his attack.
When it comes to what the platforms themselves should do about mitigating the spread of misinformation and the amplification of extremists, Donovan says the obvious thing is that they should remove accounts that harass and use hate speech to silence others. This “would go a long way to stamping out the influencers who are providing organizing spaces for their fans to participate in networked harassment and bullying,” she says. On YouTube, some would-be “influencers” use hate speech as a way to attract new audiences and solicit donations, Donovan says, and these attempts are aided by the algorithms and the ad-driven model of the platforms. “These influencers would not have grown this popular without the platform’s consent,” she says. “Something can be done and the means to do it are already available.”
On the topic of the recent Christchurch Call—a commitment to take action on extremism signed by the governments of New Zealand, France, Canada, and a number of other nations, along with tech platforms like Google, Facebook, and Twitter—Donovan says that until there are tangible results, the agreement looks like just another pledge to do better. “These companies apologize and make no specific commitments to change. There are no benchmarks to track progress, no data trails to audit, no human rights abuses accounted for.” One thing the Christchurch Call also doesn’t address, Donovan says, is the fundamental incentive structure behind how hate groups are financed and resourced online, “thanks to access to payment processing and broadcast technologies at will.”