Exclusive: WhatsApp Cofounder Brian Acton Gives The Inside Story On #DeleteFacebook And Why He Left $850 Million Behind

Now he’s talking publicly for the first time. Under pressure from Mark Zuckerberg and Sheryl Sandberg to monetize WhatsApp, he pushed back as Facebook questioned the encryption he’d helped build and laid the groundwork to show targeted ads and facilitate commercial messaging.

Acton also walked away from Facebook a year before his final tranche of stock grants vested. “It was like, okay, well, you want to do these things I don’t want to do,” Acton says. “It’s better if I get out of your way. And I did.” It was perhaps the most expensive moral stand in history. Acton took a screenshot of the stock price on his way out the door—the decision cost him $850 million.

.. “As part of a proposed settlement at the end, [Facebook management] tried to put a nondisclosure agreement in place,” Acton says. “That was part of the reason that I got sort of cold feet in terms of trying to settle with these guys.”

.. That kind of answer masks the kind of issues that just prompted Instagram’s founders to abruptly quit. Kevin Systrom and Mike Krieger reportedly chafed at Facebook and Zuckerberg’s heavy hand. Acton’s account of what happened at WhatsApp—and Facebook’s plans for it—provides a rare founder’s-level window into a company that’s at once the global arbiter of privacy standards and the gatekeeper of facts, while also increasingly straying from its entrepreneurial roots.

.. Despite a transfer of several billion dollars, Acton says he never developed a rapport with Zuckerberg. “I couldn’t tell you much about the guy,” he says. In one of their dozen or so meetings, Zuck told Acton unromantically that WhatsApp, which had a stipulated degree of autonomy within the Facebook universe and continued to operate for a while out of its original offices, was “a product group to him, like Instagram.”

.. So Acton didn’t know what to expect when Zuck beckoned him to his office last September, around the time Acton told Facebook brass that he planned to leave. Acton and Koum had a clause in their contract that allowed them to get all their stock, which was being doled out over four years, if Facebook began “implementing monetization initiatives” without their consent.

.. The Facebook-WhatsApp pairing had been a head-scratcher from the start. Facebook has one of the world’s biggest advertising networks; Koum and Acton hated ads. Facebook’s added value for advertisers is how much it knows about its users; WhatsApp’s founders were pro-privacy zealots who felt their vaunted encryption had been integral to their nearly unprecedented global growth.

.. This dissonance frustrated Zuckerberg. Facebook, Acton says, had decided to pursue two ways of making money from WhatsApp. First, by showing targeted ads in WhatsApp’s new Status feature, which Acton felt broke a social compact with its users. “Targeted advertising is what makes me unhappy,” he says. His motto at WhatsApp had been “No ads, no games, no gimmicks”—a direct contrast with a parent company that derived 98% of its revenue from advertising. Another motto had been “Take the time to get it right,” a stark contrast to “Move fast and break things.”

.. Facebook also wanted to sell businesses tools to chat with WhatsApp users. Once businesses were on board, Facebook hoped to sell them analytics tools, too. The challenge was WhatsApp’s watertight end-to-end encryption, which stopped both WhatsApp and Facebook from reading messages.

.. For his part, Acton had proposed monetizing WhatsApp through a metered-user model, charging, say, a tenth of a penny after a certain large number of free messages were used up. “You build it once, it runs everywhere in every country,” Acton says. “You don’t need a sophisticated sales force. It’s a very simple business.”
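
To make the arithmetic of Acton’s proposal concrete, here is a minimal sketch of a metered-user charge. It is purely illustrative: the free allowance is a placeholder, and the tenth-of-a-penny rate comes from his hypothetical, not from any actual WhatsApp pricing plan.

    # Illustrative sketch only: the free allowance and per-message rate are
    # placeholders based on Acton's hypothetical ("a tenth of a penny" after
    # a large number of free messages), not an actual WhatsApp pricing scheme.
    FREE_MESSAGES = 100_000        # assumed free tier
    RATE_PER_MESSAGE = 0.001       # a tenth of a penny, in dollars

    def monthly_charge(messages_sent: int) -> float:
        """Charge only for usage above the free allowance."""
        billable = max(0, messages_sent - FREE_MESSAGES)
        return billable * RATE_PER_MESSAGE

    # A heavy user sending 150,000 messages in a month would owe $50.
    print(monthly_charge(150_000))  # 50.0

The appeal Acton describes, “you build it once, it runs everywhere,” is visible in how little logic the model needs: no ad auction, no targeting data, just a usage counter per account.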

.. Acton’s plan was shot down by Sandberg. “Her words were ‘It won’t scale.’ ”

.. “I called her out one time,” says Acton, who sensed there might be greed at play. “I was like, ‘No, you don’t mean that it won’t scale. You mean it won’t make as much money as . . . ,’ and she kind of hemmed and hawed a little. And we moved on. I think I made my point. . . . They are businesspeople, they are good businesspeople. They just represent a set of business practices, principles and ethics, and policies that I don’t necessarily agree with.”

.. When Acton reached Zuckerberg’s office, a Facebook lawyer was present. Acton made clear that the disagreement—Facebook wanted to make money through ads, and he wanted to make it from high-volume users—meant he could get his full allocation of stock. Facebook’s legal team disagreed, saying that WhatsApp had only been exploring monetization initiatives, not “implementing” them.

.. Zuckerberg, for his part, had a simple message: “He was like, This is probably the last time you’ll ever talk to me.”

.. Acton graduated from Stanford with a bachelor’s in computer science and eventually became one of the first employees at Yahoo in 1996, making millions in the process. His biggest asset from that time at Yahoo: befriending Koum, a Ukrainian immigrant he clicked with over their similar no-nonsense style.

.. Acton eventually joined Koum at WhatsApp, persuading a handful of former Yahoo colleagues to fund a seed round while he took on cofounder status and wound up with a roughly 20% stake.

.. two things sparked Zuckerberg’s mega-offer in early 2014. One was hearing that WhatsApp’s founders had been invited to Google’s Mountain View headquarters for talks, and he did not want to lose them to a competitor.

.. He recalls Zuckerberg being “supportive” of WhatsApp’s plans to roll out end-to-end encryption, even though it would block attempts to harvest user data. If anything, he was “quick to respond” during the discussions. Zuckerberg “was not immediately evaluating ramifications in the long term.”

.. Zuckerberg told them that they would have “zero pressure” on monetization for the next five years.

.. Facebook prepared Acton to meet with around a dozen representatives of the European Commission’s competition authority in a teleconference. “I was coached to explain that it would be really difficult to merge or blend data between the two systems.”

.. Later he learned that elsewhere in Facebook, there were “plans and technologies to blend data.” Specifically, Facebook could use the 128-bit string of numbers assigned to each phone as a kind of bridge between accounts. The other method was phone-number matching, or pinpointing Facebook accounts with phone numbers and matching them to WhatsApp accounts with the same phone number.
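
For illustration only, here is a minimal sketch of the second method, phone-number matching, assuming each system stores one phone number per account. The record layout and function names are hypothetical and are not Facebook’s actual schema or code; the point is simply that two account databases can be joined on a normalized phone number.

    # Generic illustration of phone-number matching between two account
    # databases. The record layout is hypothetical, not Facebook's schema.
    def normalize(phone: str) -> str:
        """Keep digits only, so differently formatted numbers still match."""
        return "".join(ch for ch in phone if ch.isdigit())

    def link_accounts(facebook_accounts, whatsapp_accounts):
        """Return (facebook_id, whatsapp_id) pairs sharing a phone number."""
        by_phone = {normalize(a["phone"]): a["id"] for a in facebook_accounts}
        links = []
        for w in whatsapp_accounts:
            fb_id = by_phone.get(normalize(w["phone"]))
            if fb_id is not None:
                links.append((fb_id, w["id"]))
        return links

    # Example with made-up records:
    fb = [{"id": "fb-1", "phone": "+1 (415) 555-0100"}]
    wa = [{"id": "wa-9", "phone": "14155550100"}]
    print(link_accounts(fb, wa))  # [('fb-1', 'wa-9')]

A device identifier could serve as the bridge in the same way: any stable token present in both datasets is enough to join them.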

.. Within 18 months, a new WhatsApp terms of service linked the accounts and made Acton look like a liar. “I think everyone was gambling because they thought that the EU might have forgotten because enough time had passed.” No such luck: Facebook wound up paying a $122 million fine for giving “incorrect or misleading information” to the EU—a cost of doing business.

.. Linking these overlapping accounts was a crucial first step toward monetizing WhatsApp. The terms-of-service update would lay the groundwork for how WhatsApp could make money. During the discussions over these changes, Facebook sought “broader rights” to WhatsApp user data, Acton says, but WhatsApp’s founders pushed back, reaching a compromise with Facebook management. A clause about no ads would remain, but Facebook would still link the accounts to present friend suggestions on Facebook and offer its advertising partners better targets for ads on Facebook.

.. By then, three years after the deal, Zuckerberg was growing impatient, Acton says, and he expressed his frustrations at an all-hands meeting for WhatsApp staffers. “The CFO projections, the ten-year outlook—they wanted and needed the WhatsApp revenues to continue to show the growth to Wall Street.”

.. Internally, Facebook had targeted a $10 billion revenue run rate within five years of monetization, but to Acton such numbers sounded too high—and too reliant on advertising.

.. Acton had left a management position in Yahoo’s ad division over a decade earlier, frustrated by the Web portal’s so-called “Nascar approach” of putting ad banners all over a Web page. The drive for revenue at the expense of a good product experience “gave me a bad taste in my mouth,” Acton remembers. He was now watching history repeat itself.

.. He has supercharged Signal, a small messaging app run by a security researcher named Moxie Marlinspike with a mission to put users before profit, giving it $50 million and turning it into a foundation. Now he’s working with the same people who built the open-source encryption protocol that is part of Signal, protects WhatsApp’s 1.5 billion users, and also sits as an option on Facebook Messenger, Microsoft’s Skype and Google’s Allo messenger. Essentially, he’s re-creating WhatsApp in the pure, idealized form in which it started: free messages and calls, with end-to-end encryption and no obligations to ad platforms.

Richard Rohr Meditation: A View from the Bottom

Only by solidarity with other people’s suffering can comfortable people be converted. Otherwise we are disconnected from the cross—of the world, of others, of Jesus, and finally of our own necessary participation in the great mystery of dying and rising. People who are considered outsiders and at the bottom of society—the lame, poor, blind, prostitutes, tax collectors, “sinners”—are the ones who understand Jesus’ teaching. It’s the leaders and insiders (the priests, scribes, Pharisees, teachers of the law, and Roman officials) who crucify him.

.. Brian McLaren is not afraid to say directly that it is time for us to acknowledge Christianity’s fraught history of imperialism and colonialism:

About forty years before 1492, Pope Nicholas V issued an official document called Romanus Pontifex . . . which serves as the basis for what is commonly called the Doctrine of Discovery, the teaching that whatever Christians “discover,” they can take and use as they wish. . . . Christian global mission is defined as to “invade, search out, capture, vanquish, and subdue” non-Christians around the world, and to steal “all movable and immovable goods” and to “reduce their persons to perpetual slavery”—and not only them, but their descendants. And notice the stunning use of the word convert: “to convert them to his and their use and profit.” [2]

.. In addition to this doctrine, selective use and interpretation of the Bible was used to justify slavery for centuries. Scripture is still used by some today to exclude and judge LGBTQIA individuals, even though Jesus said very little about sexuality and a great deal about other things we conveniently ignore.

Roy Moore is unfit to serve

electing Mr. Moore would be a sure way to worsen Washington’s problems. His unapologetic extremism would pour gasoline on the already raging fire of partisanship and dysfunction.

.. If you think the Senate has failed to attend to the country’s real problems, just wait until he gets there. Mr. Moore spins up “facts” to serve his worldview, as when he claimed that

..  once a federal court had made a decision on the matter, he should have complied. Instead he refused, leading to his removal from the bench. After eventually making his way back into black robes, he was sidelined a second time after again refusing to obey federal court rulings.

.. Mr. Moore’s mockery of his judicial oath is only one indication of questionable moral fitness. Another is how he profited from his charity.

Facebook: You Are the Product

In the far distant days of October 2012, when Facebook hit one billion users, 55 per cent of them were using it every day. At two billion, 66 per cent are. Its user base is growing at 18 per cent a year – which you’d have thought impossible for a business already so enormous. Facebook’s biggest rival for logged-in users is YouTube, owned by its deadly rival Alphabet (the company formerly known as Google), in second place with 1.5 billion monthly users.

.. Three of the next four biggest apps, or services, or whatever one wants to call them, are WhatsApp, Messenger and Instagram, with 1.2 billion, 1.2 billion, and 700 million users respectively (the Chinese app WeChat is the other one, with 889 million). Those three entities have something in common: they are all owned by Facebook. No wonder the company is the fifth most valuable in the world, with a market capitalisation of $445 billion.

.. Zuckerberg said that the company was changing its ‘mission statement’, its version of the canting pieties beloved of corporate America. Facebook’s mission used to be ‘making the world more open and connected’. A non-Facebooker reading that is likely to ask: why? Connection is presented as an end in itself, an inherently and automatically good thing. Is it, though?

.. Facebook is generally agreed to have played a big, perhaps even a crucial, role in the election of Donald Trump. The benefit to humanity is not clear. This thought, or something like it, seems to have occurred to Zuckerberg, because the new mission statement spells out a reason for all this connectedness. It says that the new mission is to ‘give people the power to build community and bring the world closer together’.

.. Facebook is in a long line of such enterprises, though it might be the purest ever example of a company whose business is the capture and sale of attention. Very little new thinking was involved in its creation. As Tim Wu observes, Facebook is ‘a business with an exceedingly low ratio of invention to success’.

What Zuckerberg had instead of originality was the ability to get things done and to see the big issues clearly. The crucial thing with internet start-ups is the ability to execute plans and to adapt to changing circumstances. It’s Zuck’s skill at doing that – at hiring talented engineers, and at navigating the big-picture trends in his industry – that has taken his company to where it is today.

..  The Zuckerberg of the movie The Social Network is a highly credible character, a computer genius located somewhere on the autistic spectrum with minimal to non-existent social skills. But that’s not what the man is really like. In real life, Zuckerberg was studying for a degree with a double concentration in computer science and – this is the part people tend to forget – psychology. People on the spectrum have a limited sense of how other people’s minds work; autists, it has been said, lack a ‘theory of mind’. Zuckerberg, not so much. He is very well aware of how people’s minds work and in particular of the social dynamics of popularity and status.

.. The initial launch of Facebook was limited to people with a Harvard email address; the intention was to make access to the site seem exclusive and aspirational. (And also to control site traffic so that the servers never went down. Psychology and computer science, hand in hand.) Then it was extended to other elite campuses in the US. When it launched in the UK, it was limited to Oxbridge and the LSE. The idea was that people wanted to look at what other people like them were doing, to see their social networks, to compare, to boast and show off, to give full rein to every moment of longing and envy, to keep their noses pressed against the sweet-shop window of others’ lives.

.. This focus attracted the attention of Facebook’s first external investor, the now notorious Silicon Valley billionaire Peter Thiel. Again, The Social Network gets it right: Thiel’s $500,000 investment in 2004 was crucial to the success of the company. But there was a particular reason Facebook caught Thiel’s eye, rooted in a byway of intellectual history. In the course of his studies at Stanford – he majored in philosophy – Thiel became interested in the ideas of the US-based French philosopher René Girard, as advocated in his most influential book, Things Hidden since the Foundation of the World. Girard’s big idea was something he called ‘mimetic desire’. Human beings are born with a need for food and shelter. Once these fundamental necessities of life have been acquired, we look around us at what other people are doing, and wanting, and we copy them. In Thiel’s summary, the idea is ‘that imitation is at the root of all behaviour’.

.. Girard was a Christian, and his view of human nature is that it is fallen. We don’t know what we want or who we are; we don’t really have values and beliefs of our own; what we have instead is an instinct to copy and compare. We are homo mimeticus. ‘Man is the creature who does not know what to desire, and who turns to others in order to make up his mind. We desire what others desire because we imitate their desires.’ Look around, ye petty, and compare. The reason Thiel latched onto Facebook with such alacrity was that he saw in it for the first time a business that was Girardian to its core: built on people’s deep need to copy. ‘Facebook first spread by word of mouth, and it’s about word of mouth, so it’s doubly mimetic,’ Thiel said. ‘Social media proved to be more important than it looked, because it’s about our natures.’ We are keen to be seen as we want to be seen, and Facebook is the most popular tool humanity has ever had with which to do that.

.. The view of human nature implied by these ideas is pretty dark. If all people want to do is go and look at other people so that they can compare themselves to them and copy what they want – if that is the final, deepest truth about humanity and its motivations – then Facebook doesn’t really have to take too much trouble over humanity’s welfare, since all the bad things that happen to us are things we are doing to ourselves. For all the corporate uplift of its mission statement, Facebook is a company whose essential premise is misanthropic.

.. The highest-profile recent criticisms of the company stem from its role in Trump’s election. There are two components to this, one of them implicit in the nature of the site, which has an inherent tendency to fragment and atomise its users into like-minded groups. The mission to ‘connect’ turns out to mean, in practice, connect with people who agree with you. We can’t prove just how dangerous these ‘filter bubbles’ are to our societies, but it seems clear that they are having a severe impact on our increasingly fragmented polity. Our conception of ‘we’ is becoming narrower.

..  The portmanteau terms for these developments are ‘fake news’ and ‘post-truth’, and they were made possible by the retreat from a general agora of public debate into separate ideological bunkers. In the open air, fake news can be debated and exposed; on Facebook, if you aren’t a member of the community being served the lies, you’re quite likely never to know that they are in circulation. It’s crucial to this that Facebook has no financial interest in telling the truth. No company better exemplifies the internet-age dictum that if the product is free, you are the product.

.. Facebook’s customers aren’t the people who are on the site: its customers are the advertisers who use its network and who relish its ability to direct ads to receptive audiences. Why would Facebook care if the news streaming over the site is fake? Its interest is in the targeting, not in the content. This is probably one reason for the change in the company’s mission statement. If your only interest is in connecting people, why would you care about falsehoods? They might even be better than the truth, since they are quicker to identify the like-minded. The newfound ambition to ‘build communities’ makes it seem as if the company is taking more of an interest in the consequence of the connections it fosters.

.. Facebook itself has addressed the problem in an interesting paper published by its internal security division. ‘Fake news’, its authors argue, is an unhelpful, catch-all term because misinformation is in fact spread in a variety of ways:

  1. Information (or Influence) Operations – Actions taken by governments or organised non-state actors to distort domestic or foreign political sentiment.
  2. False News – News articles that purport to be factual, but which contain intentional misstatements of fact with the intention to arouse passions, attract viewership, or deceive.
  3. False Amplifiers – Co-ordinated activity by inauthentic accounts with the intent of manipulating political discussion (e.g. by discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others).
  4. Disinformation – Inaccurate or manipulated information/content that is spread intentionally. This can include false news, or it can involve more subtle methods, such as false flag operations, feeding inaccurate quotes or stories to innocent intermediaries, or knowingly amplifying biased or misleading information.

..  One man’s fake news is another’s truth-telling, and Facebook works hard at avoiding responsibility for the content on its site – except for sexual content, about which it is super-stringent. Nary a nipple on show. It’s a bizarre set of priorities, which only makes sense in an American context, where any whiff of explicit sexuality would immediately give the site a reputation for unwholesomeness. Photos of breastfeeding women are banned and rapidly get taken down. Lies and propaganda are fine.

.. The key to understanding this is to think about what advertisers want: they don’t want to appear next to pictures of breasts because it might damage their brands, but they don’t mind appearing alongside lies because the lies might be helping them find the consumers they’re trying to target.

.. Jonathan Taplin points to an analysis on BuzzFeed: ‘In the final three months of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news outlets such as the New York Times, Washington Post, Huffington Post, NBC News and others.’ This doesn’t sound like a problem Facebook will be in any hurry to fix.

.. The fact is that fraudulent content, and stolen content, are rife on Facebook, and the company doesn’t really mind, because it isn’t in its interest to mind. Much of the video content on the site is stolen from the people who created it. An illuminating YouTube video from Kurzgesagt, a German outfit that makes high-quality short explanatory films, notes that in 2015, 725 of Facebook’s top one thousand most viewed videos were stolen.

.. Zuckerberg himself has spoken up on this issue, in a Facebook post addressing the question of ‘Facebook and the election’. After a certain amount of boilerplate bullshit (‘Our goal is to give every person a voice. We believe deeply in people’), he gets to the nub of it. ‘Of all the content on Facebook, more than 99 per cent of what people see is authentic. Only a very small amount is fake news and hoaxes.’ More than one Facebook user pointed out that in their own news feed, Zuckerberg’s post about authenticity ran next to fake news. In one case, the fake story pretended to be from the TV sports channel ESPN. When it was clicked on, it took users to an ad selling a diet supplement. As the writer Doc Searls pointed out, it’s a double fraud, ‘outright lies from a forged source’, which is quite something to have right slap next to the head of Facebook boasting about the absence of fraud.

.. Facebook needs content, obviously, because that’s what the site consists of: content that other people have created. It’s just that it isn’t too keen on anyone apart from Facebook making any money from that content. Over time, that attitude is profoundly destructive to the creative and media industries. Access to an audience – that unprecedented two billion people – is a wonderful thing, but Facebook isn’t in any hurry to help you make money from it.

.. Facebook is in essence an advertising company which is indifferent to the content on its site except insofar as it helps to target and sell advertisements.

.. A version of Gresham’s law is at work, in which fake news, which gets more clicks and is free to produce, drives out real news, which often tells people things they don’t want to hear, and is expensive to produce.

.. Its news feed directs traffic at you based not on your interests, but on how to make the maximum amount of advertising revenue from you.

.. In the early years of Facebook, Zuckerberg was much more interested in the growth side of the company than in the monetisation. That changed when Facebook went in search of its big payday at the initial public offering.

.. Antonio García Martínez, a former Facebook product manager, recalls a scene in his memoir Chaos Monkeys: ‘Naomi, between chats with Cox, was clicking away on her laptop, paying little attention to the Zuckian harangue. I peered over her shoulder at her screen. She was scrolling down an email with a number of links, and progressively clicking each one into existence as another tab on her browser. Clickathon finished, she began lingering on each with an appraiser’s eye. They were real estate listings, each for a different San Francisco property.’

Martínez took note of one of the properties and looked it up later. Price: $2.4 million. He is fascinating, and fascinatingly bitter, on the subject of class and status differences in Silicon Valley, in particular the never publicly discussed issue of the huge gulf between early employees in a company, who have often been made unfathomably rich, and the wage slaves who join the firm later in its story.

.. When the time came for the IPO, Facebook needed to turn from a company with amazing growth to one that was making amazing money. It was already making some, thanks to its sheer size – as Martínez observes, ‘a billion times any number is still a big fucking number’ – but not enough to guarantee a truly spectacular valuation on launch. It was at this stage that the question of how to monetise Facebook got Zuckerberg’s full attention. It’s interesting, and to his credit, that he hadn’t put too much focus on it before – perhaps because he isn’t particularly interested in money per se. But he does like to win.

.. If I want to reach women between the ages of 25 and 30 in zip code 37206 who like country music and drink bourbon, Facebook can do that. Moreover, Facebook can often get friends of these women to post a ‘sponsored story’ on a targeted consumer’s news feed, so it doesn’t feel like an ad. As Zuckerberg said when he introduced Facebook Ads, ‘Nothing influences people more than a recommendation from a trusted friend. A trusted referral is the Holy Grail of advertising.’
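
As a purely illustrative sketch of the kind of audience filter that sentence describes: the field names below are hypothetical and bear no relation to Facebook’s actual ads interface; they simply show how a handful of declared traits and interests is enough to define a narrow audience.

    # Hypothetical audience filter; field names are illustrative only and
    # are not the Facebook Ads API.
    audience_spec = {
        "gender": "female",
        "age_range": (25, 30),
        "zip_code": "37206",
        "interests": {"country music", "bourbon"},
    }

    def matches(user: dict, spec: dict) -> bool:
        """True if a user profile satisfies every targeting criterion."""
        lo, hi = spec["age_range"]
        return (
            user["gender"] == spec["gender"]
            and lo <= user["age"] <= hi
            and user["zip_code"] == spec["zip_code"]
            and spec["interests"] <= set(user["interests"])
        )

    user = {"gender": "female", "age": 27, "zip_code": "37206",
            "interests": ["country music", "bourbon", "hiking"]}
    print(matches(user, audience_spec))  # True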

..  (Particular segments of voters, too, can be targeted with complete precision. One instance from 2016 was an anti-Clinton ad repeating a notorious speech she made in 1996 on the subject of ‘super-predators’. The ad was sent to African-American voters in areas where the Republicans were trying, successfully as it turned out, to suppress the Democratic vote. Nobody else saw the ads.)

.. What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality.

.. Note that the company’s knowledge about its users isn’t used merely to target ads but to shape the flow of news to them. Since there is so much content posted on the site, the algorithms used to filter and direct that content are the thing that determines what you see: people think their news feed is largely to do with their friends and interests, and it sort of is, with the crucial proviso that it is their friends and interests as mediated by the commercial interests of Facebook. Your eyes are directed towards the place where they are most valuable for Facebook.

..  Wu’s history of attention merchants shows that there is a suggestive pattern here: that a boom is more often than not followed by a backlash, that a period of explosive growth triggers a public and sometimes legislative reaction.

.. Wu writes that ‘forms of backlash, both major and minor, are all but inevitable’. He calls a minor form of this phenomenon the ‘disenchantment effect’.

.. Ad Week estimates the annual cost of click fraud at $7 billion, about a sixth of the entire market.

.. Estimates of fraudulent traffic’s market share are variable, with some guesses coming in at around 50 per cent; some website owners say their own data indicates a fraudulent-click rate of 90 per cent. This is by no means entirely Facebook’s problem, but it isn’t hard to imagine how it could lead to a big revolt against ‘ad tech’, as this technology is generally known, on the part of the companies who are paying for it.

.. A customers’ revolt could overlap with a backlash from regulators and governments.

.. Facebook has done a huge amount to lower the quality of public debate and to ensure that it is easier than ever before to tell what Hitler approvingly called ‘big lies’ and broadcast them to a big audience. The company has no business need to care about that, but it is the kind of issue that could attract the attention of regulators.

.. were it to be generally understood that Facebook’s business model is based on surveillance, the company would be in danger. The one time Facebook did poll its users about the surveillance model was in 2011, when it proposed a change to its terms and conditions.

.. The other thing that could happen at the level of individual users is that people stop using Facebook because it makes them unhappy.

.. The resulting paper, published in the Proceedings of the National Academy of Sciences, was a study of ‘social contagion’, or the transfer of emotion among groups of people, as a result of a change in the nature of the stories seen by 689,003 users of Facebook. ‘When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.’ The scientists seem not to have considered how this information would be received, and the story played quite big for a while.

.. the more people use Facebook, the more unhappy they are. A 1 per cent increase in ‘likes’ and clicks and status updates was correlated with a 5 to 8 per cent decrease in mental health.

.. In addition, they found that the positive effect of real-world interactions, which enhance well-being, was accurately paralleled by the ‘negative associations of Facebook use’. In effect people were swapping real relationships which made them feel good for time on Facebook which made them feel bad.

.. Russians, about a hundred million of whom are on the net, tend not to use Facebook because they prefer their native copycat site VKontakte.

.. ‘Facebook and Google are the new colonial powers.’

.. I am scared of Facebook. The company’s ambition, its ruthlessness, and its lack of a moral compass scare me. It goes back to that moment of its creation, Zuckerberg at his keyboard after a few drinks creating a website to compare people’s appearance, not for any real reason other than that he was able to do it. That’s the crucial thing about Facebook, the main thing which isn’t understood about its motivation: it does things because it can. Zuckerberg knows how to do something, and other people don’t, so he does it. Motivation of that type doesn’t work in the Hollywood version of life, so Aaron Sorkin had to give Zuck a motive to do with social aspiration and rejection. But that’s wrong, completely wrong. He isn’t motivated by that kind of garden-variety psychology. He does this because he can, and justifications about ‘connection’ and ‘community’ are ex post facto rationalisations.