Microsoft, Facebook, trust and privacy

I’ve been reminded of this ancient history a lot in the last year or two as I’ve looked at news around abuse and hostile state activity on Facebook, YouTube and other social platforms, because much like the Microsoft macro viruses, the ‘bad actors’ on Facebook did things that were in the manual. They didn’t prise open a locked window at the back of the building – they knocked on the front door and walked in. They did things that you were supposed to be able to do, but combined them in an order and with malign intent that hadn’t really been anticipated.

It’s also interesting to compare the public discussion of Microsoft and of Facebook before these events. In the 1990s, Microsoft was the ‘evil empire’, and a lot of the narrative within tech focused on how it should be more open, make it easier for people to develop software that worked with the Office monopoly, and make it easier to move information in and out of its products. Microsoft was ‘evil’ if it did anything to make life harder for developers. Unfortunately, whatever you thought of this narrative, it pointed in the wrong direction when it came to this use case. Here, Microsoft was too open, not too closed.

Equally, in the last 10 years people have argued that Facebook is too closed – that it is too hard to get your information out and too hard for researchers to pull information from across the platform. People have argued that Facebook was too restrictive in how third-party developers could use the platform. And people have objected to Facebook’s attempts to enforce single, real identities for accounts. As with Microsoft, there may well have been justice in all of these arguments, but, also as with Microsoft, they pointed in the wrong direction when it came to this particular scenario. For the Internet Research Agency, it was too easy to develop for Facebook, too easy to get data out, and too easy to change your identity. The walled garden wasn’t walled enough.

.. Conceptually, this is almost exactly what Facebook has done: try to remove existing opportunities for abuse and avoid creating new ones, and scan for bad actors.

|                           | Microsoft                                    | Facebook                                     |
|---------------------------|----------------------------------------------|----------------------------------------------|
| Remove openings for abuse | Close down APIs and look for vulnerabilities | Close down APIs and look for vulnerabilities |
| Scan for bad behavior     | Virus and malware scanners                   | Human moderation                             |

(It’s worth noting that these steps were precisely what people had previously insisted was evil – Microsoft deciding what code you can run on your own computer and what APIs developers can use, and Facebook deciding (or rather, people demanding that Facebook decide) who and what it distributes.)

  • .. If there is no data stored on your computer then compromising the computer doesn’t get an attacker much.
  • An application can’t steal your data if it’s sandboxed and can’t read other applications’ data.
  • An application can’t run in the background and steal your passwords if applications can’t run in the background.
  • And you can’t trick a user into installing a bad app if there are no apps.

Of course, human ingenuity is infinite, and this change just led to the creation of new attack models, most obviously phishing, but either way, none of this had much to do with Microsoft. We ‘solved’ viruses by moving to new architectures that removed the mechanics that viruses need, and where Microsoft wasn’t present.

.. In other words, where Microsoft put better locks and a motion sensor on the windows, the world is moving to a model where the windows are 200 feet off the ground and don’t open.

.. Much like moving from Windows to cloud and ChromeOS, you could see this as an attempt to remove the problem rather than patch it.

  • Russians can’t go viral in your newsfeed if there is no newsfeed.
  • ‘Researchers’ can’t scrape your data if Facebook doesn’t have your data. You solve the problem by making it irrelevant.

This is one way to solve the problem by changing the core mechanics, but there are others. For example, Instagram does have a one-to-many feed but does not suggest content from people you don’t yourself follow in the main feed and does not allow you to repost into your friends’ feeds. There might be anti-vax content in your feed, but one of your actual friends has to have decided to share it with you. Meanwhile, problems such as the spread of dangerous rumours in India rely on messaging rather than sharing – messaging isn’t a panacea.

Indeed, as it stands Mr Zuckerberg’s memo raises as many questions as it answers – most obviously, how does advertising work? Is there advertising in messaging, and if so, how is it targeted? Encryption means Facebook doesn’t know what you’re talking about, but the Facebook apps on your phone necessarily would know (before they encrypt it), so does targeting happen locally? Meanwhile, encryption in particular poses problems for tackling other kinds of abuse: how do you help law enforcement deal with child exploitation if you can’t read the exploiters’ messages (the memo explicitly talks about this as a challenge)? Where does Facebook’s Blockchain project sit in all of this?

There are lots of big questions, though of course there would also have been lots of questions if in 2002 you’d said that all enterprise software would go to the cloud. But the difference here is that Facebook is trying (or talking about trying) to do the judo move itself, and to make a fundamental architectural change that Microsoft could not.

Mark Zuckerberg’s Delusion of Consumer Consent

He said Facebook users want tailored ads. According to our research, that’s not true.

.. In a recent Wall Street Journal commentary, Mark Zuckerberg claimed that Facebook users want to see ads tailored to their interests. But the data show the opposite is true.
.. large majorities don’t want personalized ads — and when they learn how companies find out information about them, even greater percentages don’t want them.
.. To Mr. Zuckerberg, protecting ad personalization from privacy rules is key. His essay argues that regulatory intervention would take away a “free” goody from the public. Facebook makes virtually all its revenues from advertising, and it has created enormous amounts of data about the people who use Facebook and the larger internet. In his essay, Mr. Zuckerberg defends Facebook from a chorus of critics who rail against a business model that they argue uses and abuses people’s information under the guise of transparency, choice and control. Mr. Zuckerberg therefore has an interest in arguing that he and his colleagues well understand what his audience wants. “People consistently tell us that if they’re going to see ads, they want them to be relevant,” he writes. “That means we need to understand their interests.”
.. Sixty-one percent of respondents said no, they did not want tailored ads for products and services, 56 percent said no to tailored news, 86 percent said no to tailored political ads, and 46 percent said no to tailored discounts. But when we added in the results of the second set of questions about tracking people on that firm’s website, other websites and offline, the percentage that in the end decided they didn’t want tailoring ranged from 89 percent to 93 percent with political ads, 68 percent to 84 percent for commercial ads, 53 percent to 77 percent for discounts, and 64 percent to 83 percent for news.

How Google Tracks Your Personal Information

An insider’s account of the dark side of search engine marketing

Today, Google provides marketers like me with so much of your personal data that we can infer more about you from it than from any camera or microphone.

There have never been more opportunities for marketers like me to exploit your data. Today, 40,000 Google search queries are conducted every second. That’s 3.5 billion searches per day, 1.2 trillion searches per year.
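The headline figures above can be sanity-checked with a little arithmetic (note that 40,000 per second actually works out to about 3.46 billion per day and about 1.26 trillion per year, which the article rounds down):

```python
# Sanity check of the search-volume figures quoted above.
per_second = 40_000
per_day = per_second * 60 * 60 * 24   # seconds in a day
per_year = per_day * 365

print(f"{per_day / 1e9:.2f} billion searches/day")     # 3.46 billion
print(f"{per_year / 1e12:.2f} trillion searches/year")  # 1.26 trillion
```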

When you search on Google, your query travels to a data center, where up to 1,000 computers work together to retrieve the results and send them back to you. This whole process usually happens in less than one-fifth of a second.

Most people don’t realize that while this is going on, an even faster and more mysterious process is happening behind the scenes: An auction is taking place.
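The auction mentioned here can be illustrated with a minimal sketch of a generalized second-price (GSP) auction, the mechanism search-ad platforms have historically used: ads are ranked by bid times quality score, and each winner pays just enough to keep its rank over the next ad down. The bidder names, numbers, ranking formula and reserve price below are illustrative assumptions, not Google’s actual system.

```python
# Illustrative generalized second-price (GSP) ad auction.
# All names, bids, quality scores, and the $0.01 increment/reserve
# are made-up assumptions for illustration only.

def run_gsp_auction(bidders):
    """bidders: list of (name, bid_usd, quality_score).
    Returns (name, price_paid) per slot, best slot first."""
    # Rank by "ad rank" = bid * quality score.
    ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, quality) in enumerate(ranked):
        if i + 1 < len(ranked):
            _, next_bid, next_quality = ranked[i + 1]
            # Minimum price that still beats the next ad's rank.
            price = next_bid * next_quality / quality + 0.01
        else:
            price = 0.01  # assumed reserve price for the last slot
        results.append((name, round(price, 2)))
    return results

results = run_gsp_auction([
    ("alice", 2.00, 0.9),   # ad rank 1.80
    ("bob",   3.00, 0.5),   # ad rank 1.50
    ("carol", 1.00, 1.0),   # ad rank 1.00
])
print(results)
```

Note that in this sketch a high quality score lets "alice" win the top slot while paying less than "bob" bid, which is the usual argument for rank-by-quality auctions.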

Facebook’s Monopoly Tyranny Is No Longer a Zuckerberg Secret

Meanwhile, a huge trove of Facebook emails has just leaked, and it shows the true nature of the Zuckerberg legacy. The UK Parliament published the trove of confidential Facebook executive emails on December 5th. tl;dr: it’s exactly what you’d expect.

  • Mark Zuckerberg personally approved Facebook’s decision to cut off social network Vine’s data. (so much for a capitalism of fair competition)
  • Facebook tried to figure out how to grab users’ call data without asking permission. (likely spying on their real-time conversations)
  • Certain key apps were white-listed and given greater access to user data even after a broader clampdown. (Netflix and Airbnb among the favored friends)
  • Mark Zuckerberg privately admitted that what’s good for the world isn’t necessarily what’s good for Facebook. (Where’s the world tour Mark?)
  • Mark Zuckerberg suggested users’ data was worth 10 cents a year. (Heck, is that even worth selling?)

In the process of creating one of the most corrupt business models ever invented, Facebook chose profits over its users. In a weird twist of fate, it’s the UK that seems to have stood up to Facebook, while America doesn’t even regulate its offending tech companies.

.. The leaked emails show execs discussed the single biggest threat to Facebook. They ended up disrupting journalism, diverting internet traffic, and turning into a weaponized platform used against the state, democracy and capitalism, slowing down rivals and thwarting innovation itself, which allowed Chinese companies like Tencent and ByteDance to overtake them.

.. Facebook staff in 2012 discussed selling access to user data to major advertisers – basically selling your info without your consent. Facebook’s profit-seeking greed led to a centralization of data where the rich get richer on the public’s data.

.. There is evidence that Facebook’s refusal to share data with some apps caused them to fail. Facebook picked the winners in a fake internet, even deceiving advertisers into believing that video (on its platform) was the next big thing. Facebook was later found to have significantly falsified video metrics to deceive advertisers and brands.

You can view all 250 pages of the Facebook documents right here.