Mark Zuckerberg, the chief executive, has signed off on an effort to show users pro-Facebook stories and to distance himself from scandals.
Mark Zuckerberg, Facebook’s chief executive, signed off last month on a new initiative code-named Project Amplify.
The effort, which was hatched at an internal meeting in January, had a specific purpose: to use Facebook’s News Feed, the site’s most important digital real estate, to show people positive stories about the social network.
The idea was that pushing pro-Facebook news items — some of them written by the company — would improve its image in the eyes of its users, three people with knowledge of the effort said. But the move was sensitive because Facebook had not previously positioned the News Feed as a place where it burnished its own reputation. Several executives at the meeting were shocked by the proposal, one attendee said.
Project Amplify punctuated a series of decisions that Facebook has made this year to aggressively reshape its image. Since that January meeting, the company has begun a multipronged effort to change its narrative by distancing Mr. Zuckerberg from scandals, reducing outsiders’ access to internal data, burying a potentially negative report about its content and increasing its own advertising to showcase its brand.
The moves amount to a broad shift in strategy. For years, Facebook confronted crisis after crisis over privacy, misinformation and hate speech on its platform by publicly apologizing. Mr. Zuckerberg personally took responsibility for Russian interference on the site during the 2016 presidential election and has loudly stood up for free speech online. Facebook also promised transparency into the way that it operated.
But the drumbeat of criticism on issues as varied as racist speech and vaccine misinformation has not relented. Disgruntled Facebook employees have added to the furor by speaking out against their employer and leaking internal documents. Last week, The Wall Street Journal published articles based on such documents that showed Facebook knew about many of the harms it was causing.
So Facebook executives, concluding that their methods had done little to quell criticism or win supporters, decided early this year to go on the offensive, said six current and former employees, who declined to be identified for fear of reprisal.
“They’re realizing that no one else is going to come to their defense, so they need to do it and say it themselves,” said Katie Harbath, a former Facebook public policy director.
The changes have involved Facebook executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has also been influential in the image reshaping effort, said five people who worked with him. But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said.
Joe Osborne, a Facebook spokesman, denied that the company had changed its approach.
“People deserve to know the steps we’re taking to address the different issues facing our company — and we’re going to share those steps widely,” he said in a statement.
For years, Facebook executives have chafed at how their company appeared to receive more scrutiny than Google and Twitter, said current and former employees. They attributed that attention to Facebook’s leaving itself more exposed with its apologies and providing access to internal data, the people said.
So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.
That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.
Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.
The Information, a tech news site, previously reported on the document.
The impact was immediate. On Jan. 11, Sheryl Sandberg, Facebook’s chief operating officer — and not Mr. Zuckerberg — told Reuters that the storming of the U.S. Capitol a week earlier had little to do with Facebook. In July, when President Biden said the social network was “killing people” by spreading Covid-19 misinformation, Guy Rosen, Facebook’s vice president for integrity, disputed the characterization in a blog post and pointed out that the White House had missed its coronavirus vaccination goals.
“Facebook is not the reason this goal was missed,” Mr. Rosen wrote.
Mr. Zuckerberg’s personal Facebook and Instagram accounts soon changed. Rather than addressing corporate controversies, Mr. Zuckerberg’s posts have recently featured a video of himself riding an electric surfboard with an American flag and messages about new virtual reality and hardware devices.
Facebook also started cutting back the availability of data that allowed academics and journalists to study how the platform worked. In April, the company told its team behind CrowdTangle, a tool that provides data on the engagement and popularity of Facebook posts, that it was being broken up. While the tool still exists, the people who worked on it were moved to other teams.
Part of the impetus came from Mr. Schultz, who had grown frustrated with news coverage that used CrowdTangle data to show that Facebook was spreading misinformation, said two people involved in the discussions.
For academics who relied on CrowdTangle, it was a blow. Cameron Hickey, a misinformation researcher at the National Conference on Citizenship, a nonprofit focused on civic engagement, said he was “particularly angry” because he felt the CrowdTangle team was being punished for giving an unfiltered view of engagement on Facebook.
Mr. Schultz argued that Facebook should publish its own information about the site’s most popular content rather than supply access to tools like CrowdTangle, two people said. So in June, the company compiled a report on Facebook’s most-viewed posts for the first three months of 2021.
But Facebook did not release the report. After the policy communications team discovered that the top-viewed link for the period was a news story with a headline that suggested a doctor had died after receiving the Covid-19 vaccine, they feared the company would be chastised for contributing to vaccine hesitancy, according to internal emails reviewed by The New York Times.
A day before the report was supposed to be published, Mr. Schultz was part of a group that voted to shelve the document, according to the emails. He later posted an internal message about his role at Facebook, which was reviewed by The Times, saying, “I do care about protecting the company’s reputation, but I also care deeply about rigor and transparency.”
Facebook also worked to stamp out employee leaks. In July, the communications team shuttered comments on an internal forum that was used for companywide announcements. “OUR ONE REQUEST: PLEASE DON’T LEAK,” read a post about the change.
At the same time, Facebook ramped up its marketing. During the Olympics this summer, the company paid for television spots with the tagline “We change the game when we find each other,” to promote how it fostered communities. In the first half of this year, Facebook spent a record $6.1 billion on marketing and sales, up more than 8 percent from a year earlier, according to a recent earnings report.
Weeks later, the company further reduced the ability of academics to conduct research on it when it disabled the Facebook accounts and pages of a group of New York University researchers. The researchers had created a feature for web browsers that allowed them to see users’ Facebook activity, which 16,000 people had consented to use. The resulting data had led to studies showing that misleading political ads had thrived on Facebook during the 2020 election and that users engaged more with right-wing misinformation than many other types of content.
In a blog post, Facebook said the N.Y.U. researchers had violated rules around collecting user data, citing a privacy agreement it had originally struck with the Federal Trade Commission in 2012. The F.T.C. later admonished Facebook for invoking its agreement, saying it allowed for good-faith research in the public interest.
Laura Edelson, the lead N.Y.U. researcher, said Facebook cut her off because of the negative attention her work brought. “Some people at Facebook look at the effect of these transparency efforts and all they see is bad P.R.,” she said.
The episode was compounded this month when Facebook told misinformation researchers that, for two years, it had mistakenly provided them with incomplete data on user interactions and engagement.
“It is inconceivable that most of modern life, as it exists on Facebook, isn’t analyzable by researchers,” said Nathaniel Persily, a Stanford University law professor, who is working on federal legislation to force the company to share data with academics.
In August, after Mr. Zuckerberg approved Project Amplify, the company tested the change in three U.S. cities, two people with knowledge of the effort said. While the company had previously used the News Feed to promote its own products and social causes, it had not turned to it to openly push positive press about itself, they said.
Once the tests began, Facebook used a system known as Quick Promotes to place stories about people and organizations that used the social network into users’ News Feeds, they said. People essentially see posts with a Facebook logo that link to stories and websites published by the company, as well as to stories from third-party local news sites. One story pushed “Facebook’s Latest Innovations for 2021” and discussed how it was achieving “100 percent renewable energy for our global operations.”
“This is a test for an informational unit clearly marked as coming from Facebook,” Mr. Osborne said, adding that Project Amplify was “similar to corporate responsibility initiatives people see in other technology and consumer products.”
Facebook’s defiance against unflattering revelations has also not let up, even without Mr. Zuckerberg. On Saturday, Nick Clegg, the company’s vice president for global affairs, wrote a blog post denouncing the premise of The Journal’s investigation. He said the idea that Facebook executives had repeatedly ignored warnings about problems was “just plain false.”
“These stories have contained deliberate mischaracterizations of what we are trying to do,” Mr. Clegg said. He did not detail what the mischaracterizations were.
Mark Zuckerberg still thinks we’re all “dumb fucks.”
This indisputable fact was once again ground into our skulls Thursday morning when the CEO of the toxic cesspool otherwise known as Facebook waxed semi-philosophic on free speech at Georgetown University. Amidst the tired and expected Reddit-logic-bro-like ramblings, one moment stood out for its sheer audacity: Zuckerberg’s attempt to forcefully rewrite the history of his company’s founding.
And he’s clearly counting on us buying the lie.
Facebook, Zuckerberg insisted, was born out of the noblest of impulses: to give “everyone a voice” in the aftermath of the 2003 invasion of Iraq. Yes, you read that correctly.
Before we get into just how extremely bullshit we know this claim to be, it’s worth reading it in its stupefying entirety.
When I was in college, our country had just gone to war in Iraq. The mood on campus was disbelief. It felt like we were acting without hearing a lot of important perspectives. The toll on soldiers, families and our national psyche was severe, and most of us felt powerless to stop it. I remember feeling that if more people had a voice to share their experiences, maybe things would have gone differently. Those early years shaped my belief that giving everyone a voice empowers the powerless and pushes society to be better over time.
Back then, I was building an early version of Facebook for my community, and I got to see my beliefs play out at smaller scale.
Got that? Zuckerberg is implying Facebook was a manifestation of his belief that giving people a voice would make the world a better place. Except we know that isn’t true.
Like, not even remotely.
Facebook’s origin story is an incredibly well documented — if messy — one, and, unfortunately for the CEO, it paints him in a rather unflattering light.
For those blissfully unaware, the development of TheFacebook followed on Zuckerberg’s creation of a “Hot or Not” clone called Facemash, which scraped Harvard students’ photos from an online directory and then asked students to rank the respective hotness of those pictured.
Contemporaneous reporting by Harvard’s student newspaper, the Crimson, laid it all out in clear detail.
“The site was created entirely by Zuckerberg over the last week in October, after a friend gave him the idea,” reads the 2003 article. “The website used photos compiled from the online facebooks of nine Houses, placing two next to each other at a time and asking users to choose the ‘hotter’ person.”
Now, Zuckerberg has repeatedly insisted that Facemash was totally separate from Facebook.
“The claim that Facemash was somehow connected to the development of Facebook… it isn’t, it wasn’t,” he told Congress in 2018.
If we are to believe that claim, which is itself dubious, then we are still left with scores of records showing that Zuckerberg made Facebook with dating services in mind.
“Like,” Business Insider reports Zuckerberg as writing to his friend Adam D’Angelo just before the launch of TheFacebook.com, “I don’t think people would sign up for the facebook thing if they knew it was for dating.”
Of his notorious decision to delay working on a competitor’s social network dubbed Harvard Connection so that he could get TheFacebook up in time?
“I’m going to fuck them,” Business Insider reports him as telling a friend.
Even Zuckerberg himself has, in the past, provided a sanitized retelling of his justification for launching Facebook that had nothing to do with the lofty claims he made today.
“Ten years ago,” CNBC reports him as telling Freakonomics Radio in 2018, “you know, I was just trying to help connect people at colleges and a few schools.”
Now, there is itself nothing wrong with launching a dating or social website. However, when that site morphs into the democracy-eating beast that is the present-day Facebook, understanding how and why that transition happened is of some pretty serious import.
Self-mythologizing your company’s origin story to make yourself into a T-shirt-sporting statesman, and assuming we’re all dumb enough to lap up those lies, reflects an ongoing desire on Zuckerberg’s part to bend reality to his will.
For a man with such unparalleled power over both our elections and personal information, that should bother all of us. Unless, of course, us “fucks” are too dumb to notice.
When you first sign up, you’ll be put on a waiting list and asked to invite others, or you can sign up for a subscription. It costs US$13/mo or US$100/year.
We will empower you to make your own choices about what content you are served, and to directly edit misleading headlines or flag problem posts. We will foster an environment where bad actors are removed because it is right, not because it suddenly affects our bottom line.
WT:Social will be focused on news, and members will be asked to edit misleading headlines. Articles will be shared in a timeline that presents the newest content first, rather than being sorted algorithmically as on Facebook and Twitter.