How Mark Zuckerberg Can Save Facebook — and Us

“The first inning’s prevailing ethos was that any technology that makes the world more open by connecting us or makes us more equal by empowering us individually must, in and of itself, be a force for good,” Seidman began. “But, in inning two, we are coming to grips with the reality that the power to make the world more open and equal is not in the technologies themselves. It all depends on how the tools are designed and how we choose to use them. The same amazing tech that enables people to forge deeper relationships, foster closer communities and give everyone a voice can also breed isolation, embolden racists, and empower digital bullies and nefarious actors.”

.. “The world is fused. So there is no place anymore to stand to the side and claim neutrality — to say, ‘I am just a businessperson’ or ‘I am just running a platform.’ ”
.. In the fused world, Seidman said, “the business of business is no longer just business. The business of business is now society. And, therefore, how you take or don’t take responsibility for what your technology enables or for what happens on your platforms is inescapable.
.. “Software solutions can increase our confidence that we can stay a step ahead of the bad guys. But, fundamentally, it will take more ‘moralware’ to regain our trust. Only one kind of leadership can respond to this kind of problem — moral leadership.”
.. What does moral leadership look like here?

“Moral leadership means truly putting people first and making whatever sacrifices that entails,” Seidman said.

.. “That means not always competing on shallow things and quantity — on how much time people spend on your platform — but on quality and depth. It means seeing and treating people not just as ‘users’ or ‘clicks,’ but as ‘citizens,’ who are worthy of being accurately informed to make their best choices. It means not just trying to shift people from one click to another, from one video to another, but instead trying to elevate them in ways that deepen our connections and enrich our conversations.”

.. It means, Seidman continued, being “fully transparent about how you operate and make decisions that affect them — all the ways in which you’re monetizing their data. It means having the courage to publish explicit standards of quality and expectations of conduct, and fighting to maintain them however inconvenient. It means having the humility to ask for help even from your critics. It means promoting civility and decency, making the opposite unwelcome.”

.. At the height of the Cold War, when the world was threatened by spreading Communism and rising walls, President John F. Kennedy vowed to “pay any price and bear any burden” to ensure the success of liberty.

Mark Zuckerberg’s Facebook Post on Cambridge Analytica

In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends’ data. Given the way our platform worked at the time, this meant Kogan was able to access data from tens of millions of their friends.

In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today.

In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.

Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We’re also working with regulators as they investigate what happened.

.. Second, we will restrict developers’ data access even further to prevent other kinds of abuse. For example, we will remove developers’ access to your data if you haven’t used their app in 3 months. We will reduce the data you give an app when you sign in — to only your name, profile photo, and email address.
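Taken together, the changes described above amount to a simple access-control policy: revoke an app's access after three months of inactivity, and shrink the default sign-in scope to a few basic profile fields. The sketch below is only a hypothetical illustration of that policy, not Facebook's actual API or implementation; the field names, the 90-day window, and the `allowed_fields` helper are assumptions made to render the rules concrete.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of the access rules described above.
# Field names and the 90-day window are assumptions, not Facebook's real code.
SIGN_IN_FIELDS = {"name", "profile_photo", "email"}   # reduced default scope
INACTIVITY_LIMIT = timedelta(days=90)                 # "3 months" of non-use

def allowed_fields(requested_fields, last_used, now=None):
    """Return the subset of requested fields an app may still receive."""
    now = now or datetime.utcnow()
    # Revoke access entirely if the person has not used the app recently.
    if now - last_used > INACTIVITY_LIMIT:
        return set()
    # Otherwise, grant only the reduced sign-in scope.
    return set(requested_fields) & SIGN_IN_FIELDS

# Example: an app asking for friend data gets only the basic profile fields.
print(allowed_fields({"name", "email", "friends"}, datetime.utcnow()))
# -> {'name', 'email'} (set order may vary)
```

The point of the sketch is that both restrictions are enforced server-side before any data reaches a developer, which is what makes a Kogan-style bulk harvest of friends' data impossible under the newer rules.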

.. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.

Why the Outrage?

William Davies on Cambridge Analytica

If forty thousand people scattered across Michigan, Wisconsin and Pennsylvania had changed their minds about Donald Trump before 8 November 2016, and cast their votes instead for Hillary Clinton, this small London-based political consultancy would not now be the subject of breathless headlines and Downing Street statements. Cambridge Analytica could have harvested, breached, brain-washed and honey-trapped to their evil hearts’ content, but if Clinton had won, it wouldn’t be a story.

.. It’s true that Cambridge Analytica was recruited to work on the Trump campaign, though not necessarily because of its Machiavellian brilliance. Steve Bannon, the Trump campaign’s chief executive, had been on the board of the company, and probably tossed it a contract for some data analysis so as to keep things between friends.

.. First, there is no firm evidence that Cambridge Analytica provided consultancy services to any of the major players in the EU referendum of 2016. Alexander Nix, the company’s chief executive, initially bragged in an article that it had, but confessed to the Digital, Culture, Media and Sport Select Committee in February this year that the article had been drafted by a ‘slightly overzealous PR consultant’.

.. Second, there is not – and cannot be – any evidence that it swung the election for Trump (by the same token, it isn’t strictly provable that it didn’t), though unsurprisingly the company claims otherwise. This still appears painful for Clinton herself to accept. Interviewed for one of the Channel 4 reports, she speaks of Cambridge Analytica’s ‘massive propaganda effort [which] affected the thought processes of voters’. And yet data analysis is at the heart of modern political campaigning. Clinton, after all, preferred to study data on Michigan from the comfort of her Brooklyn campaign office rather than visit the state, even as panicking Michigan Democrats pleaded with her to spend time there in the final weeks. If things had turned out differently, there would no doubt have been star-struck puff pieces on the bleeding-edge data analytics behind the election of America’s first female president.

.. no one, surely, will be surprised to discover that data collected in one arena is put to work in another. Using data in novel (and secretive) ways is virtually the governing principle of the digital economy – what Shoshana Zuboff has termed ‘surveillance capitalism’, and Nick Srnicek calls ‘platform capitalism’.

.. It’s worth remembering that throughout the 1990s, the internet was viewed as a threat to capitalism as much as an opportunity. Napster was the iconic example. It wasn’t clear where the profits lay, once information was abundant and individual anonymity was the norm. What changed, as Zuboff and Srnicek both explore in different ways, was that the internet began to be treated as a surveillance device of potentially global proportions: cheaper, better or free services were provided on condition that the ‘user’ would be tracked in everything they did and anchored in their offline identity. The fact that most tech giants made – and in Uber’s case still make – vast losses for the first few years of their existence is integral to this strategy. People must be lured into using a service and then kept using it by whatever means necessary; only later is this power converted into revenue.

.. The second aspect of the recent scandal is grubbier but ultimately less significant. If its own sales pitch is to be believed (an ‘if’ that grows larger by the day), Cambridge Analytica likes to play dirty.

.. Throwaway remarks, such as that the candidate is just a ‘puppet’ to the campaign team and that ‘facts’ are less important than ‘emotion’, look shady when caught on a hidden camera, but they’re not categorically different from the early ruthlessness of New Labour operators such as Alastair Campbell, Philip Gould and Peter Mandelson. Nor is there any reason to assume that New Labour’s 1990s analogue methods of data analysis – focus groups and polling – are less informative or useful than automated psychometrics.

.. a displacement of horror that really stems from something deeper. Part of that must lie with Trump and Trumpism. A terrible event must surely have been delivered by equally terrible means.

.. Cambridge Analytica looks conveniently like a smoking gun, primarily because it has repeatedly bragged that it is one. Nix and Turnbull do for the events of 2016 what ‘Fabulous’ Fab Tourre, former Goldman Sachs banker, and Fred ‘The Shred’ Goodwin, former boss of RBS, did for the banking crisis of 2008, providing grotesque personalities on which to focus anger and alarm.

.. But as with the financial crisis, the circus risks distracting from the real institutional and political questions, in this case concerning companies such as Facebook and the model of capitalism that tolerates, facilitates and even celebrates their extensive and sophisticated forms of data harvesting and analysis.

.. Just as environmentalists demand that the fossil fuel industry ‘leave it in the ground,’ the ultimate demand to be levelled at Silicon Valley should be ‘leave it in our heads.’ The real villain here is an expansionary economic logic that insists on inspecting ever more of our thoughts, feelings and relationships. The best way to thwart this is the one Silicon Valley fears the most: anti-trust laws. Broken into smaller pieces, these companies would still be able to monitor us, but from disparate perspectives that couldn’t easily (or secretly) be joined up.