The New Copycats: How Facebook Squashes Competition From Startups

Mr. Zuckerberg is sensitive to anything that might disrupt Facebook, even the teeniest startup, say current and former executives and employees.

Facebook uses an internal database to track rivals, including young startups performing unusually well, people familiar with the system say. The database stems from Facebook’s 2013 acquisition of a Tel Aviv-based startup, Onavo, which had built an app that secures users’ privacy by routing their traffic through private servers. The app gives Facebook an unusually detailed look at what users collectively do on their phones, these people say.

The tool shaped Facebook’s decision to buy WhatsApp and informed its live-video strategy, they say. Facebook used Onavo to build its early-bird tool that tips it off to promising services and that helped Facebook home in on Houseparty.
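The article doesn't explain how the early-bird tool actually works, but the core of such a detector is easy to sketch: given per-app usage counts from the Onavo panel, flag small apps that are growing unusually fast. A minimal illustration in Python (the field names, thresholds, and numbers are all invented, not Facebook's):

```python
# Hypothetical sketch of an "early bird" detector over panel data.
# Nothing here reflects Onavo's real implementation; thresholds and
# field names are invented for illustration.

def weekly_growth(usage):
    """usage maps app name -> list of weekly active users, oldest first."""
    growth = {}
    for app, weeks in usage.items():
        if len(weeks) >= 2 and weeks[-2] > 0:
            growth[app] = (weeks[-1] - weeks[-2]) / weeks[-2]
    return growth

def early_birds(usage, min_users=10_000, growth_threshold=0.5):
    """Flag small-but-surging apps: a minimum user base, plus
    week-over-week growth above the threshold."""
    flagged = [(app, rate) for app, rate in weekly_growth(usage).items()
               if usage[app][-1] >= min_users and rate >= growth_threshold]
    return sorted(flagged, key=lambda pair: -pair[1])

panel = {
    "houseparty": [12_000, 30_000],        # +150% week over week
    "big_incumbent": [900_000, 910_000],   # ~1% growth, ignored
}
print(early_birds(panel))  # [('houseparty', 1.5)]
```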

.. Mr. Rubin didn’t want to sell but was under pressure from his board to keep Houseparty’s options open, Mr. Elman says. “If a company like Facebook or Snapchat needs your team’s expertise, that might be a better return for shareholders than the risk of going big,” Mr. Elman says he told Mr. Rubin.

.. In December, Facebook began its group-video-chat offensive. Its Messenger app introduced the feature with the ability to see up to six people in a conversation, compared with the eight-person rooms on Houseparty.

In February, Facebook invited Houseparty users between the ages of 13 and 17 to come to its offices in Menlo Park, Calif., to participate in a study and keep a diary for a week afterward that they would share with Facebook, offering as an inducement $275 Amazon gift cards.

.. Last month, it recruited a vice president of engineering, Kinshuk Mishra, who had helped Spotify AB, the music-streaming service, fend off Apple Music. It introduced a new chat feature called “passing notes” to attract more users.

The New Copycats: How Facebook Squashes Startup Competition (wsj.com)

> Facebook uses an internal database to track rivals… The database stems from Facebook’s 2013 acquisition of a Tel Aviv-based startup, Onavo, which had built an app that secures users’ privacy by routing their traffic through private servers. The app gives Facebook an unusually detailed look at what users collectively do on their phones…

WTF is this shady-ass sh*t. Way to “secure users’ privacy,” Facebook.

From the sound of Onavo’s App Store reviews they are using deceptive marketing of the “Your phone is infected, install this now!!” variety. Yet they have a lot of positive but suspiciously brief reviews balancing them out. So Facebook bought a company that MITMs unsuspecting users for profit, using scammer marketing techniques and fake reviews to drive installs, then leverages that to knife babies. “Don’t be too proud,” indeed.

I hope there is cause for Apple to remove this app from the App Store (like deceptive marketing or exploitive practices). Or for a bunch of us good folks to leave negative reviews. These guys depend on informed people avoiding these apps and not leaving reviews.

.. You have to intercept to gather metadata… but semantics aside, they are deceiving users. First there is the marketing scam reported in the app store reviews, people who installed it because some web site told them they have a virus and they need this thing to fix it.

Second, the only mention of their logging practices is buried below the fold in the last line of their description: “Onavo receives and analyzes information about your mobile data and app use.” This is just vague enough to let a user believe it is merely there to support the app’s user-facing features, i.e. giving you a report on what you use… not feeding data to Facebook for spying purposes. Of course, most users never even get that far in the description. They’re installing this to “secure their phone” because of a scary ad they saw.

These guys know exactly what they’re doing. Most of their users, not so much. That’s where we come in. The App Store exists to help protect users from this kind of exploitation and I hope Apple and our community take action.

.. What struck me from the article was how Facebook knew which social networks are competitive threats. They’re tracking what apps you use on your phone.

> Facebook uses an internal database to track rivals, including young startups performing unusually well, people familiar with the system say. The database stems from Facebook’s 2013 acquisition of a Tel Aviv-based startup, Onavo, which had built an app that secures users’ privacy by routing their traffic through private servers. The app gives Facebook an unusually detailed look at what users collectively do on their phones, these people say.

> The tool shaped Facebook’s decision to buy WhatsApp and informed its live-video strategy, they say. Facebook used Onavo to build its early-bird tool that tips it off to promising services and that helped Facebook home in on Houseparty.

.. Facebook is what Microsoft was in the 1990s: using its existing market dominance to crush potential competitors by absorbing their distinctive offerings as mere features of its existing popular products. Behaviour like this gave a lot of momentum to the antitrust proceedings against Microsoft.

I wonder if that encourages Facebook not to do this so obviously in the future. Or maybe it isn’t at all worried about antitrust in the near term.

I am sure Google, Amazon and Microsoft continue to do this as well, but Facebook seems to be doing it most successfully, or at least most prominently, with its total destruction of Snap.


.. This is just killer:

> In December, Facebook began its group-video-chat offensive. Its Messenger app introduced the feature with the ability to see up to six people in a conversation, compared with the eight-person rooms on Houseparty.

> In February, Facebook invited Houseparty users between the ages of 13 and 17 to come to its offices in Menlo Park, Calif., to participate in a study and keep a diary for a week afterward that they would share with Facebook, offering as an inducement $275 Amazon gift cards.

Facebook: You Are the Product

In the far distant days of October 2012, when Facebook hit one billion users, 55 per cent of them were using it every day. At two billion, 66 per cent are. Its user base is growing at 18 per cent a year – which you’d have thought impossible for a business already so enormous. Facebook’s biggest rival for logged-in users is YouTube, owned by its deadly rival Alphabet (the company formerly known as Google), in second place with 1.5 billion monthly users.

.. Three of the next four biggest apps, or services, or whatever one wants to call them, are WhatsApp, Messenger and Instagram, with 1.2 billion, 1.2 billion, and 700 million users respectively (the Chinese app WeChat is the other one, with 889 million). Those three entities have something in common: they are all owned by Facebook. No wonder the company is the fifth most valuable in the world, with a market capitalisation of $445 billion.

.. He said that the company was changing its ‘mission statement’, its version of the canting pieties beloved of corporate America. Facebook’s mission used to be ‘making the world more open and connected’. A non-Facebooker reading that is likely to ask: why? Connection is presented as an end in itself, an inherently and automatically good thing. Is it, though?

.. Facebook is generally agreed to have played a big, perhaps even a crucial, role in the election of Donald Trump. The benefit to humanity is not clear. This thought, or something like it, seems to have occurred to Zuckerberg, because the new mission statement spells out a reason for all this connectedness. It says that the new mission is to ‘give people the power to build community and bring the world closer together’.

.. Facebook is in a long line of such enterprises, though it might be the purest ever example of a company whose business is the capture and sale of attention. Very little new thinking was involved in its creation. As Wu observes, Facebook is ‘a business with an exceedingly low ratio of invention to success’.

What Zuckerberg had instead of originality was the ability to get things done and to see the big issues clearly. The crucial thing with internet start-ups is the ability to execute plans and to adapt to changing circumstances. It’s Zuck’s skill at doing that – at hiring talented engineers, and at navigating the big-picture trends in his industry – that has taken his company to where it is today.

..  The movie Zuckerberg is a highly credible character, a computer genius located somewhere on the autistic spectrum with minimal to non-existent social skills. But that’s not what the man is really like. In real life, Zuckerberg was studying for a degree with a double concentration in computer science and – this is the part people tend to forget – psychology. People on the spectrum have a limited sense of how other people’s minds work; autists, it has been said, lack a ‘theory of mind’. Zuckerberg, not so much. He is very well aware of how people’s minds work and in particular of the social dynamics of popularity and status.

.. The initial launch of Facebook was limited to people with a Harvard email address; the intention was to make access to the site seem exclusive and aspirational. (And also to control site traffic so that the servers never went down. Psychology and computer science, hand in hand.) Then it was extended to other elite campuses in the US. When it launched in the UK, it was limited to Oxbridge and the LSE. The idea was that people wanted to look at what other people like them were doing, to see their social networks, to compare, to boast and show off, to give full rein to every moment of longing and envy, to keep their noses pressed against the sweet-shop window of others’ lives.

.. This focus attracted the attention of Facebook’s first external investor, the now notorious Silicon Valley billionaire Peter Thiel. Again, The Social Network gets it right: Thiel’s $500,000 investment in 2004 was crucial to the success of the company. But there was a particular reason Facebook caught Thiel’s eye, rooted in a byway of intellectual history. In the course of his studies at Stanford – he majored in philosophy – Thiel became interested in the ideas of the US-based French philosopher René Girard, as advocated in his most influential book, Things Hidden since the Foundation of the World. Girard’s big idea was something he called ‘mimetic desire’. Human beings are born with a need for food and shelter. Once these fundamental necessities of life have been acquired, we look around us at what other people are doing, and wanting, and we copy them. In Thiel’s summary, the idea is ‘that imitation is at the root of all behaviour’.

.. Girard was a Christian, and his view of human nature is that it is fallen. We don’t know what we want or who we are; we don’t really have values and beliefs of our own; what we have instead is an instinct to copy and compare. We are homo mimeticus. ‘Man is the creature who does not know what to desire, and who turns to others in order to make up his mind. We desire what others desire because we imitate their desires.’ Look around, ye petty, and compare. The reason Thiel latched onto Facebook with such alacrity was that he saw in it for the first time a business that was Girardian to its core: built on people’s deep need to copy. ‘Facebook first spread by word of mouth, and it’s about word of mouth, so it’s doubly mimetic,’ Thiel said. ‘Social media proved to be more important than it looked, because it’s about our natures.’ We are keen to be seen as we want to be seen, and Facebook is the most popular tool humanity has ever had with which to do that.

.. The view of human nature implied by these ideas is pretty dark. If all people want to do is go and look at other people so that they can compare themselves to them and copy what they want – if that is the final, deepest truth about humanity and its motivations – then Facebook doesn’t really have to take too much trouble over humanity’s welfare, since all the bad things that happen to us are things we are doing to ourselves. For all the corporate uplift of its mission statement, Facebook is a company whose essential premise is misanthropic.

.. The highest-profile recent criticisms of the company stem from its role in Trump’s election. There are two components to this, one of them implicit in the nature of the site, which has an inherent tendency to fragment and atomise its users into like-minded groups. The mission to ‘connect’ turns out to mean, in practice, connect with people who agree with you. We can’t prove just how dangerous these ‘filter bubbles’ are to our societies, but it seems clear that they are having a severe impact on our increasingly fragmented polity. Our conception of ‘we’ is becoming narrower.

..  The portmanteau terms for these developments are ‘fake news’ and ‘post-truth’, and they were made possible by the retreat from a general agora of public debate into separate ideological bunkers. In the open air, fake news can be debated and exposed; on Facebook, if you aren’t a member of the community being served the lies, you’re quite likely never to know that they are in circulation. It’s crucial to this that Facebook has no financial interest in telling the truth. No company better exemplifies the internet-age dictum that if the product is free, you are the product.

.. Facebook’s customers aren’t the people who are on the site: its customers are the advertisers who use its network and who relish its ability to direct ads to receptive audiences. Why would Facebook care if the news streaming over the site is fake? Its interest is in the targeting, not in the content. This is probably one reason for the change in the company’s mission statement. If your only interest is in connecting people, why would you care about falsehoods? They might even be better than the truth, since they are quicker to identify the like-minded. The newfound ambition to ‘build communities’ makes it seem as if the company is taking more of an interest in the consequence of the connections it fosters.

.. in an interesting paper published by its internal security division. ‘Fake news’, they argue, is an unhelpful, catch-all term because misinformation is in fact spread in a variety of ways:

  1. Information (or Influence) Operations – Actions taken by governments or organised non-state actors to distort domestic or foreign political sentiment.
  2. False News – News articles that purport to be factual, but which contain intentional misstatements of fact with the intention to arouse passions, attract viewership, or deceive.
  3. False Amplifiers – Co-ordinated activity by inauthentic accounts with the intent of manipulating political discussion (e.g. by discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others).
  4. Disinformation – Inaccurate or manipulated information/content that is spread intentionally. This can include false news, or it can involve more subtle methods, such as false flag operations, feeding inaccurate quotes or stories to innocent intermediaries, or knowingly amplifying biased or misleading information.

..  One man’s fake news is another’s truth-telling, and Facebook works hard at avoiding responsibility for the content on its site – except for sexual content, about which it is super-stringent. Nary a nipple on show. It’s a bizarre set of priorities, which only makes sense in an American context, where any whiff of explicit sexuality would immediately give the site a reputation for unwholesomeness. Photos of breastfeeding women are banned and rapidly get taken down. Lies and propaganda are fine.

.. The key to understanding this is to think about what advertisers want: they don’t want to appear next to pictures of breasts because it might damage their brands, but they don’t mind appearing alongside lies because the lies might be helping them find the consumers they’re trying to target.

.. Jonathan Taplin points to an analysis on Buzzfeed: ‘In the final three months of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news outlets such as the New York Times, Washington Post, Huffington Post, NBC News and others.’ This doesn’t sound like a problem Facebook will be in any hurry to fix.

.. The fact is that fraudulent content, and stolen content, are rife on Facebook, and the company doesn’t really mind, because it isn’t in its interest to mind. Much of the video content on the site is stolen from the people who created it. An illuminating YouTube video from Kurzgesagt, a German outfit that makes high-quality short explanatory films, notes that in 2015, 725 of Facebook’s top one thousand most viewed videos were stolen.

.. Zuckerberg himself has spoken up on this issue, in a Facebook post addressing the question of ‘Facebook and the election’. After a certain amount of boilerplate bullshit (‘Our goal is to give every person a voice. We believe deeply in people’), he gets to the nub of it. ‘Of all the content on Facebook, more than 99 per cent of what people see is authentic. Only a very small amount is fake news and hoaxes.’ More than one Facebook user pointed out that in their own news feed, Zuckerberg’s post about authenticity ran next to fake news. In one case, the fake story pretended to be from the TV sports channel ESPN. When it was clicked on, it took users to an ad selling a diet supplement. As the writer Doc Searls pointed out, it’s a double fraud, ‘outright lies from a forged source’, which is quite something to have right slap next to the head of Facebook boasting about the absence of fraud.

.. Facebook needs content, obviously, because that’s what the site consists of: content that other people have created. It’s just that it isn’t too keen on anyone apart from Facebook making any money from that content. Over time, that attitude is profoundly destructive to the creative and media industries. Access to an audience – that unprecedented two billion people – is a wonderful thing, but Facebook isn’t in any hurry to help you make money from it.

.. Facebook is in essence an advertising company which is indifferent to the content on its site except insofar as it helps to target and sell advertisements.

.. A version of Gresham’s law is at work, in which fake news, which gets more clicks and is free to produce, drives out real news, which often tells people things they don’t want to hear, and is expensive to produce.

.. Its news feed directs traffic at you based not on your interests, but on how to make the maximum amount of advertising revenue from you.

.. In the early years of Facebook, Zuckerberg was much more interested in the growth side of the company than in the monetisation. That changed when Facebook went in search of its big payday at the initial public offering

.. Naomi, between chats with Cox, was clicking away on her laptop, paying little attention to the Zuckian harangue. I peered over her shoulder at her screen. She was scrolling down an email with a number of links, and progressively clicking each one into existence as another tab on her browser. Clickathon finished, she began lingering on each with an appraiser’s eye. They were real estate listings, each for a different San Francisco property.

Martínez took note of one of the properties and looked it up later. Price: $2.4 million. He is fascinating, and fascinatingly bitter, on the subject of class and status differences in Silicon Valley, in particular the never publicly discussed issue of the huge gulf between early employees in a company, who have often been made unfathomably rich, and the wage slaves who join the firm later in its story.

.. When the time came for the IPO, Facebook needed to turn from a company with amazing growth to one that was making amazing money. It was already making some, thanks to its sheer size – as Martínez observes, ‘a billion times any number is still a big fucking number’ – but not enough to guarantee a truly spectacular valuation on launch. It was at this stage that the question of how to monetise Facebook got Zuckerberg’s full attention. It’s interesting, and to his credit, that he hadn’t put too much focus on it before – perhaps because he isn’t particularly interested in money per se. But he does like to win.

.. If I want to reach women between the ages of 25 and 30 in zip code 37206 who like country music and drink bourbon, Facebook can do that. Moreover, Facebook can often get friends of these women to post a ‘sponsored story’ on a targeted consumer’s news feed, so it doesn’t feel like an ad. As Zuckerberg said when he introduced Facebook Ads, ‘Nothing influences people more than a recommendation from a trusted friend. A trusted referral is the Holy Grail of advertising.’
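To make the precision of that targeting concrete: in data terms, an audience spec like the one in that sentence is just a conjunction of filters over user records. A hedged sketch, not Facebook's actual ads API; every field name here is invented:

```python
# Illustrative only: a micro-targeting query expressed as a simple
# filter over user records, mirroring the example in the text.

AUDIENCE_SPEC = {
    "gender": "female",
    "age_range": (25, 30),
    "zip_code": "37206",
    "interests": {"country music", "bourbon"},
}

def matches(user, spec):
    return (
        user["gender"] == spec["gender"]
        and spec["age_range"][0] <= user["age"] <= spec["age_range"][1]
        and user["zip_code"] == spec["zip_code"]
        and spec["interests"] <= user["interests"]  # subset test
    )

users = [
    {"gender": "female", "age": 27, "zip_code": "37206",
     "interests": {"country music", "bourbon", "hiking"}},
    {"gender": "female", "age": 41, "zip_code": "37206",
     "interests": {"country music", "bourbon"}},
]
audience = [u for u in users if matches(u, AUDIENCE_SPEC)]
print(len(audience))  # 1: only the 27-year-old matches every filter
```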

..  (Particular segments of voters too can be targeted with complete precision. One instance from 2016 was an anti-Clinton ad repeating a notorious speech she made in 1996 on the subject of ‘super-predators’. The ad was sent to African-American voters in areas where the Republicans were trying, successfully as it turned out, to suppress the Democrat vote. Nobody else saw the ads.)

.. What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality.

.. Note that the company’s knowledge about its users isn’t used merely to target ads but to shape the flow of news to them. Since there is so much content posted on the site, the algorithms used to filter and direct that content are the thing that determines what you see: people think their news feed is largely to do with their friends and interests, and it sort of is, with the crucial proviso that it is their friends and interests as mediated by the commercial interests of Facebook. Your eyes are directed towards the place where they are most valuable for Facebook.

..  Wu’s history of attention merchants shows that there is a suggestive pattern here: that a boom is more often than not followed by a backlash, that a period of explosive growth triggers a public and sometimes legislative reaction.

.. forms of backlash, both major and minor, are all but inevitable.’ Wu calls a minor form of this phenomenon the ‘disenchantment effect’.

.. Ad Week estimates the annual cost of click fraud at $7 billion, about a sixth of the entire market.

.. Estimates of fraudulent traffic’s market share are variable, with some guesses coming in at around 50 per cent; some website owners say their own data indicates a fraudulent-click rate of 90 per cent. This is by no means entirely Facebook’s problem, but it isn’t hard to imagine how it could lead to a big revolt against ‘ad tech’, as this technology is generally known, on the part of the companies who are paying for it

.. A customers’ revolt could overlap with a backlash from regulators and governments.

.. Facebook has done a huge amount to lower the quality of public debate and to ensure that it is easier than ever before to tell what Hitler approvingly called ‘big lies’ and broadcast them to a big audience. The company has no business need to care about that, but it is the kind of issue that could attract the attention of regulators.

.. were it to be generally understood that Facebook’s business model is based on surveillance, the company would be in danger. The one time Facebook did poll its users about the surveillance model was in 2011, when it proposed a change to its terms and conditions

.. The other thing that could happen at the level of individual users is that people stop using Facebook because it makes them unhappy.

.. The resulting paper, published in the Proceedings of the National Academy of Sciences, was a study of ‘social contagion’, or the transfer of emotion among groups of people, as a result of a change in the nature of the stories seen by 689,003 users of Facebook. ‘When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.’ The scientists seem not to have considered how this information would be received, and the story played quite big for a while.
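The manipulation itself was mechanically simple. Here is a sketch of the treatment condition, under the assumption (consistent with the paper's description) that positive posts were withheld from treated users' feeds with some probability; the probability and the sentiment labels below are invented for illustration:

```python
# Sketch of the 2014 PNAS "emotional contagion" design: treated users
# see a feed from which positive posts are probabilistically withheld.
import random

def filter_feed(posts, drop_positive_prob=0.3, rng=random):
    """Treatment condition: each positive post is withheld with
    probability drop_positive_prob; controls see the feed unchanged."""
    return [p for p in posts
            if not (p["sentiment"] == "positive"
                    and rng.random() < drop_positive_prob)]

random.seed(0)  # deterministic demo
feed = [{"sentiment": "positive"}] * 70 + [{"sentiment": "negative"}] * 30
treated = filter_feed(feed)
print(len(feed), "->", len(treated))  # roughly 30% of positives dropped

# Outcome measure in the study: users shown fewer positive posts went
# on to produce fewer positive posts themselves: the contagion effect.
```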
.. the more people use Facebook, the more unhappy they are. A 1 per cent increase in ‘likes’ and clicks and status updates was correlated with a 5 to 8 per cent decrease in mental health.

.. In addition, they found that the positive effect of real-world interactions, which enhance well-being, was accurately paralleled by the ‘negative associations of Facebook use’. In effect people were swapping real relationships which made them feel good for time on Facebook which made them feel bad.

.. Russians, about a hundred million of whom are on the net, tend not to use Facebook because they prefer their native copycat site VKontakte.

.. Facebook and Google are the new colonial powers.’

.. I am scared of Facebook. The company’s ambition, its ruthlessness, and its lack of a moral compass scare me. It goes back to that moment of its creation, Zuckerberg at his keyboard after a few drinks creating a website to compare people’s appearance, not for any real reason other than that he was able to do it. That’s the crucial thing about Facebook, the main thing which isn’t understood about its motivation: it does things because it can. Zuckerberg knows how to do something, and other people don’t, so he does it. Motivation of that type doesn’t work in the Hollywood version of life, so Aaron Sorkin had to give Zuck a motive to do with social aspiration and rejection. But that’s wrong, completely wrong. He isn’t motivated by that kind of garden-variety psychology. He does this because he can, and justifications about ‘connection’ and ‘community’ are ex post facto rationalisations.

Fake News Expert On How False Stories Spread And Why People Believe Them

In particular, there’s a huge cluster of websites in English about health issues because they find that that content does really well.

And if they sign up, for example, for Google AdSense, an ad program, they can get money as people visit their sites and it’s pretty straightforward. So they tried election sites, and over time they all came to realize that the stuff that did the best was pro-Trump stuff. They got the most traffic and most traction.

.. And I think there was an element almost – in some of the people I was speaking to, there was almost an element of pride saying, you know, we’re here in this small country that most Americans probably don’t even think about, and we’re able to, you know, put stuff out and earn money and to run a business. And I think there was a bit of pride in that. One of the people that I spoke to, who was a bit older – he was in his 20s – you know, he said that yeah, I mean, people know that a lot of the content is false. But that’s what works.

.. They all said that when it came down to it, the fake stuff performed better on Facebook. And if you weren’t doing some stuff that was misleading or fake, you were going to get beat by people who were.

.. And then the other type of content that performed really well was, you know, memes, like a photo that just sort – kind of expressed a very partisan opinion. These – you know, they weren’t necessarily factually based, but they really kind of riled up the base. And for the pages that were partisan pages on the right and the left, if you had stuff that really appealed to people’s existing beliefs – really appealed to, you know, a negative perception of Hillary Clinton, a negative perception of Donald Trump – even if it, you know, completely bent the truth, that would perform much better than a sort of purely factual thing.

.. So at the core of this is – there’s two factors that are at play here. So one is a human factor and one is kind of a platform or algorithmic factor. So on the human side, there’s a lot of research out there going back a very long time that looks at sort of how humans deal with information. And one of the things that we love as humans – and this affects all of us. We shouldn’t think of this as just being something for people who are very partisan. We love to hear things that confirm what we think and what we feel and what we already believe. It’s – it makes us feel good to get information that aligns with what we already believe or what we want to hear.

.. And on the other side of that is when we’re confronted with information that contradicts what we think and what we feel, the reaction isn’t to kind of sit back and consider it. The reaction is often to double down on our existing beliefs. So if you’re feeding people information that basically just tells them what they want to hear, they’re probably going to react strongly to that. And the other layer that these pages are very good at is they bring in emotion into it, anger or hate or surprise or, you know, joy. And so if you combine information that aligns with their beliefs, if you can make it something that strikes an emotion in them, then that gets them to react.

.. And that’s where the kind of platform and algorithms come in. Which is that on Facebook, you know, the more you interact with certain types of content, the more its algorithms are going to feed you more of that content. So if you’re reading stuff that aligns perfectly with your political beliefs, it makes you feel really good and really excited and you share it, Facebook is going to see that as a signal that you want more of that stuff. So that’s why the false misleading stuff does really well is because it’s highly emotion-driven.
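That feedback loop is simple enough to sketch. In the toy version below (purely illustrative, not Facebook's actual ranking system), every interaction raises a topic's score and the feed is re-ranked by score, so whatever you react to crowds out the rest:

```python
# Toy engagement feedback loop: interactions raise a topic's score,
# and the feed is ordered by score, so reacted-to content dominates.
from collections import Counter

topic_score = Counter()

def record_engagement(topic, weight=1.0):
    """A like/share/click on `topic` nudges its score upward."""
    topic_score[topic] += weight

def rank_feed(candidate_posts):
    """Order posts by the user's accumulated topic scores."""
    return sorted(candidate_posts,
                  key=lambda post: topic_score[post["topic"]],
                  reverse=True)

# Emotionally charged posts get more engagement, engagement raises
# their topics' scores, higher scores put more of the same on top.
for _ in range(5):
    record_engagement("partisan-outrage", weight=2.0)  # shares count extra
record_engagement("local-news")

feed = rank_feed([
    {"topic": "local-news", "title": "Council meeting"},
    {"topic": "partisan-outrage", "title": "You won't BELIEVE..."},
])
print([post["topic"] for post in feed])
# ['partisan-outrage', 'local-news']
```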

.. Whereas when you come in as the debunker, what you’re doing is actively going against information that people are probably already, you know, willing to believe and that gets them emotionally. And to tell somebody I’m sorry that thing you saw and shared is not true is you coming in in a very negative way unfortunately.

And so the reaction is often for people to get defensive and to disagree with you. And just in general you just seem like kind of a spoilsport. You’re ruining the fun or you’re getting in the way of their beliefs. And a lot of times when I put debunkings out there, you know, some of the reactions I get are people saying, well, it might as well be true. You know, he could have said that or that could have happened. Or, of course, you get accusations that, you know, you’re biased. And so the debunkings just don’t appeal as much to us on a psychological level. There’s some emotional resistance to wanting to be wrong. That’s a very natural human thing. And they’re just not as shareable because the emotion there isn’t as real and raw as something that makes you angry, for example.

.. So the one that comes to mind right away, this is a story that was on a website that is made to look like ABC News but its domain is slightly different. And the story that was published, you know, long before the election claimed that a protester had been paid $3,500 to go and protest at a Trump rally. And this fed into perceptions that the people who are against Trump were being paid by big interests.

And that story did pretty well on Facebook. It got a fair amount of engagement. But it was tweeted by Eric Trump. It was tweeted by Corey Lewandowski, who was a campaign manager for Donald Trump, and it was tweeted by Kellyanne Conway, who was his campaign manager, not that long before the election. So when you have people in positions of power and influence putting out fake news – and I want to say, you know, there’s no evidence that they knew it was fake and put it out there to fool people. I think in each case they genuinely believed it was true because, as we’ve discussed, I think it fed into the message their campaign wanted to put out.

..

One that was really popular actually was one that falsely claimed he had given a quote to People magazine many years ago basically saying that if I ever ran for president, I would run as a Republican because conservatives are so stupid they’ll believe anything. And this was turned into a meme.

It spread a lot on Facebook. It was debunked so many times. We debunked it at BuzzFeed. Snopes has debunked it. And it just kept going and going and going because this is something I think a lot of Democrats wanted to believe.

.. But – so I think anyone who believes that fake news won Trump the election is wrong. There’s no data to support that. And I say this as somebody who’s been looking at this data in a lot of different ways. There’s no smoking gun. There’s – I don’t think we’ll ever get it.

.. 75 percent of the time, the Americans who were shown a fake news headline and had remembered it from the election believed it to be accurate.

And that’s a really shocking thing. It’s impossible to go the next step and say, well, they voted because of that. But I think one of the things this election has shown is that people will believe fake news, misinformation will spread and people will believe it and it will become part of their worldview.