In Need of Guidance (Richard Rohr)

After he retired, my father cried in my arms and said, “I don’t know who I am now. I don’t know who I am. . . . Pray with me, pray with me.” Here I was a grown-up man, a priest, supposed to be strong for my father. I didn’t know how to do it. I guess I said the appropriate priestly words. But I didn’t know how to guide him into the second half of life, and he was begging for a guide.

The church wasn’t much of a guide in such things. The common sermon was on the evil of abortion. My mom in her 70s would come home and say, “Why does the priest keep telling us the same thing? I can’t have babies anymore!” That’s what happens when the Church doesn’t grow up or support its growing members. We focus on something that’s quantifiable and seemingly clear and has no subtlety to it. It’s mostly black and white thinking, usually about individual body-based sins. We know who the sinners are, and we know who the saints are, and we don’t have to struggle with the mixed blessing that every human being is. We’re all mixed blessings and partly sinners, and we always will be. But this wisdom only comes later, when we’ve learned to listen to the different voices that guide us in the second half of life.

These deeper voices will sound like risk, trust, surrender, uncommon sense, destiny, love. They will be the voices of an intimate stranger, a voice that’s from somewhere else, and yet it’s my deepest self at the same time. It’s the still, small voice that the prophet Elijah slowly but surely learned to hear (see 1 Kings 19:11-13).

Microsoft, Facebook, trust and privacy

I’ve been reminded of this ancient history a lot in the last year or two as I’ve looked at news around abuse and hostile state activity on Facebook, YouTube and other social platforms, because much like the Microsoft macro viruses, the ‘bad actors’ on Facebook did things that were in the manual. They didn’t prise open a locked window at the back of the building – they knocked on the front door and walked in. They did things that you were supposed to be able to do, but combined them in an order and with malign intent that hadn’t really been anticipated.
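The "in the manual" point can be sketched in miniature. Below is a toy model (all names invented for illustration, not any real Office API) of the macro-virus mechanic: a document format that, by design, runs embedded code when the file is opened. Every step is a documented capability; the abuse is only in combining them with malign intent.

```python
# Toy model of the macro-virus mechanic. The document format runs
# embedded code automatically on open, exactly as documented; the
# "virus" simply uses that intended feature to copy itself around.
# (Hypothetical names for illustration, not a real Office API.)

class Document:
    def __init__(self, body, auto_open_macro=None):
        self.body = body
        self.auto_open_macro = auto_open_macro  # code the app runs on open

def open_document(doc, address_book):
    if doc.auto_open_macro:
        doc.auto_open_macro(address_book)       # runs exactly as documented
    return doc.body

infections = []

def malicious_macro(address_book):
    # "Mail" a copy of the document to everyone in the address book.
    # Each step is an intended capability, combined in an unintended order.
    for contact in address_book:
        infections.append(contact)

doc = Document("Q3 report", auto_open_macro=malicious_macro)
open_document(doc, address_book=["alice", "bob"])
print(infections)  # ['alice', 'bob'] - the macro "spread" to every contact
```

Note that nothing here "prises open a locked window": the macro walks in through the front door of the API.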

It’s also interesting to compare the public discussion of Microsoft and of Facebook before these events. In the 1990s, Microsoft was the ‘evil empire’, and a lot of the narrative within tech focused on how it should be more open, make it easier for people to develop software that worked with the Office monopoly, and make it easier to move information in and out of its products. Microsoft was ‘evil’ if it did anything to make life harder for developers. Unfortunately, whatever you thought of this narrative, it pointed in the wrong direction when it came to this use case. Here, Microsoft was too open, not too closed.

Equally, in the last 10 years the narrative around Facebook has often been that it is too closed – that it is too hard to get your information out and too hard for researchers to pull information from across the platform. People have argued that Facebook was too restrictive in how third-party developers could use the platform. And people have objected to Facebook’s attempts to enforce a single real identity per account. As with Microsoft, there may well have been justice in all of these arguments, but also as with Microsoft, they pointed in the wrong direction when it came to this particular scenario. For the Internet Research Agency, it was too easy to develop for Facebook, too easy to get data out, and too easy to change your identity. The walled garden wasn’t walled enough.

Conceptually, this is almost exactly what Facebook has done: try to remove existing opportunities for abuse and avoid creating new ones, and scan for bad actors.

                            Microsoft                                      Facebook
Remove openings for abuse   Close down APIs and look for vulnerabilities   Close down APIs and look for vulnerabilities
Scan for bad behavior       Virus and malware scanners                     Human moderation

(It’s worth noting that these steps were precisely what people had previously insisted was evil – Microsoft deciding what code you can run on your own computer and what APIs developers can use, and Facebook deciding (people demanding that Facebook decide) who and what it distributes.)

  • If there is no data stored on your computer then compromising the computer doesn’t get an attacker much.
  • An application can’t steal your data if it’s sandboxed and can’t read other applications’ data.
  • An application can’t run in the background and steal your passwords if applications can’t run in the background.
  • And you can’t trick a user into installing a bad app if there are no apps.
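The logic of the list above, security by removing the mechanic rather than by scanning for abusers, can be sketched as a toy sandbox model (the class and method names are invented for illustration, not any real OS API): each app may only touch its own container, so a malicious app has nothing to steal by construction.

```python
# Toy model of application sandboxing. The "OS" checks every file
# access against the app's own container directory, so one app cannot
# read another app's data even if it tries. (Hypothetical names for
# illustration, not a real OS API.)

class SandboxViolation(Exception):
    pass

class SandboxedApp:
    def __init__(self, name):
        self.name = name
        self.data_dir = f"/apps/{name}/"  # the only path this app may touch

    def read(self, path):
        # Enforced by the platform, not by the app's good behavior.
        if not path.startswith(self.data_dir):
            raise SandboxViolation(f"{self.name} may not read {path}")
        return f"contents of {path}"

mail = SandboxedApp("mail")
game = SandboxedApp("game")

print(mail.read("/apps/mail/inbox"))   # allowed: its own data
try:
    game.read("/apps/mail/inbox")      # denied: another app's data
except SandboxViolation as e:
    print(e)
```

The point of the sketch is that no scanner is involved: the attack class is removed by the architecture itself.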

Of course, human ingenuity is infinite, and this change just led to the creation of new attack models, most obviously phishing, but either way, none of this had much to do with Microsoft. We ‘solved’ viruses by moving to new architectures that removed the mechanics that viruses need, and where Microsoft wasn’t present.

In other words, where Microsoft put better locks and a motion sensor on the windows, the world is moving to a model where the windows are 200 feet off the ground and don’t open.

Much like moving from Windows to cloud and ChromeOS, you could see this as an attempt to remove the problem rather than patch it.

  • Russians can’t go viral in your newsfeed if there is no newsfeed.
  • ‘Researchers’ can’t scrape your data if Facebook doesn’t have your data. You solve the problem by making it irrelevant.

This is one way to solve the problem by changing the core mechanics, but there are others. For example, Instagram does have a one-to-many feed, but it does not suggest content from people you don’t yourself follow in the main feed and does not allow you to repost into your friends’ feeds. There might be anti-vax content in your feed, but one of your actual friends has to have decided to share it with you. Meanwhile, problems such as the spread of dangerous rumours in India rely on messaging rather than sharing – so messaging isn’t a panacea either.

Indeed, as it stands Mr Zuckerberg’s memo raises as many questions as it answers – most obviously, how does advertising work? Is there advertising in messaging, and if so, how is it targeted? Encryption means Facebook doesn’t know what you’re talking about, but the Facebook apps on your phone necessarily would know (before they encrypt it), so does targeting happen locally? Meanwhile, encryption in particular poses problems for tackling other kinds of abuse: how do you help law enforcement deal with child exploitation if you can’t read the exploiters’ messages (the memo explicitly talks about this as a challenge)? Where does Facebook’s Blockchain project sit in all of this?

There are lots of big questions, though of course there would also have been lots of questions if in 2002 you’d said that all enterprise software would go to the cloud. But the difference here is that Facebook is trying (or talking about trying) to do the judo move itself, and to make a fundamental architectural change that Microsoft could not.

Warning! Everything Is Going Deep: ‘The Age of Surveillance Capitalism’

Deep learning, deep insights, deep artificial minds — the list goes on and on. But with unprecedented promise comes some unprecedented peril.

Around the end of each year major dictionaries declare their “word of the year.” Last year, for instance, the most looked-up word at Merriam-Webster.com was “justice.” Well, even though it’s early, I’m ready to declare the word of the year for 2019.

The word is “deep.”

Why? Because recent advances in the speed and scope of digitization, connectivity, big data and artificial intelligence are now taking us “deep” into places and into powers that we’ve never experienced before — and that governments have never had to regulate before. I’m talking about

  • deep learning,
  • deep insights,
  • deep surveillance,
  • deep facial recognition,
  • deep voice recognition,
  • deep automation and
  • deep artificial minds.

Which is why it may not be an accident that one of the biggest hit songs today is “Shallow,” from the movie “A Star Is Born.” The main refrain, sung by Lady Gaga and Bradley Cooper, is: “I’m off the deep end, watch as I dive in. … We’re far from the shallow now.”

We sure are. But the lifeguard is still on the beach and — here’s what’s really scary — he doesn’t know how to swim! More about that later. For now, how did we get so deep down where the sharks live?

The short answer: Technology moves up in steps, and each step, each new platform, is usually biased toward a new set of capabilities. Around the year 2000 we took a huge step up that was biased toward connectivity, because of the explosion of fiber-optic cable, wireless and satellites.

Suddenly connectivity became so fast, cheap, easy for you and ubiquitous that it felt like you could touch someone whom you could never touch before and that you could be touched by someone who could never touch you before.

Around 2007, we took another big step up. The iPhone, sensors, digitization, big data, the internet of things, artificial intelligence and cloud computing melded together and created a new platform that was biased toward abstracting complexity at a speed, scope and scale we’d never experienced before.

So many complex things became simplified. Complexity became so fast, free, easy to use and invisible that soon with one touch on Uber’s app you could page a taxi, direct a taxi, pay a taxi, rate a taxi driver and be rated by a taxi driver.