The Moral Economy of Tech

As computer programmers, our formative intellectual experience is working with deterministic systems that have been designed by other human beings. These can be very complex, but the complexity is not the kind we find in the natural world. It is ultimately always tractable. Find the right abstractions, and the puzzle box opens before you.

The feeling of competence, control and delight in discovering a clever twist that solves a difficult problem is what makes being a computer programmer sometimes enjoyable.

But as anyone who’s worked with tech people knows, this intellectual background can also lead to arrogance. People who excel at software design become convinced that they have a unique ability to understand any kind of system at all, from first principles, without prior training, thanks to their superior powers of analysis.

.. Approaching the world as a software problem is a category error that has led us into some terrible habits of mind.

BAD MENTAL HABITS

First, programmers are trained to seek maximal and global solutions. Why solve a specific problem in one place when you can fix the general problem for everybody, and for all time? We don’t think of this as hubris, but as a laudable economy of effort. And the startup funding culture of big risk, big reward encourages this grandiose mode of thinking. There is powerful social pressure to avoid incremental change, particularly any change that would require working with people outside tech and treating them as intellectual equals.

.. Instead of relying on algorithms, which we can be accused of manipulating for our benefit, we have turned to machine learning, an ingenious way of disclaiming responsibility for anything. Machine learning is like money laundering for bias. It’s a clean, mathematical apparatus that gives the status quo the aura of logical inevitability.
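The "bias laundering" point can be made concrete with a toy sketch. Everything here is invented for illustration (the data, the groups, the threshold): a model that "learns" from biased historical decisions faithfully reproduces the old bias, now dressed up as an objective-looking score.

```python
from collections import Counter

# Hypothetical historical hiring records: (neighborhood, hired?) pairs.
# The past decisions were biased against neighborhood "B".
history = [("A", True)] * 90 + [("A", False)] * 10 + \
          [("B", True)] * 20 + [("B", False)] * 80

def train(records):
    """'Learn' the historical hire rate per neighborhood."""
    hired, total = Counter(), Counter()
    for group, outcome in records:
        total[group] += 1
        hired[group] += outcome
    return {g: hired[g] / total[g] for g in total}

model = train(history)  # {"A": 0.9, "B": 0.2}

def predict(group, threshold=0.5):
    # The "model" is just the historical bias, repackaged as a score.
    return model[group] >= threshold

# Two identical candidates from different neighborhoods:
print(predict("A"))  # True
print(predict("B"))  # False
```

Nothing in the code mentions a protected attribute or an intent to discriminate; the bias lives entirely in the training data, which is exactly what makes it so easy to disclaim responsibility for the output.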

.. Google Ventures, for example, is seriously funding research into immortality. Their head VC will call you a “deathist” for pointing out that this is delusional.

.. Those who benefit from the death of privacy attempt to frame our subjugation in terms of freedom, just like early factory owners talked about the sanctity of contract law. They insisted that a worker should have the right to agree to anything, from sixteen-hour days to unsafe working conditions, as if factory owners and workers were on an equal footing.

.. Many of you had to obtain a US visa to attend this conference. The customs service announced yesterday it wants to start asking people for their social media profiles. Imagine trying to attend your next conference without a LinkedIn profile, and explaining to the American authorities why you are so suspiciously off the grid.

.. All of the major players in the surveillance economy cooperate with their own country’s intelligence agencies, and are spied on (very effectively) by all the others.

.. Try to imagine this policy enacted using the tools of modern technology. The FBI would subpoena Facebook for information on every user born abroad. Email and phone conversations would be monitored to check for the use of Arabic or Spanish, and sentiment analysis applied to see if the participants sounded “nervous”. Social networks, phone metadata, and cell phone tracking would lead police to nests of hiding immigrants.

We could do a really good job deporting people if we put our minds to it.

.. That this toolchain for eliminating enemies of the state is only allowed to operate in poor, remote places is a comfort to those of us who live elsewhere, but you can imagine scenarios where a mass panic would broaden its scope.

.. Or imagine what the British surveillance state, already the worst in Europe, is going to look like in two years, when it’s no longer bound by the protections of European law, and economic crisis has driven the country further into xenophobia.

.. Or take an example from my home country, Poland. Abortion has been illegal in Poland for some time, but the governing party wants to tighten restrictions on abortion by investigating every miscarriage as a potential crime. Women will basically be murder suspects if they lose their baby. Imagine government agents combing your Twitter account, fitness tracker logs, credit card receipts and private communications for signs of potential pregnancy, with the results reported to the police to proactively protect your unborn baby.

.. When we talk about the moral economy of tech, we must confront the fact that we have created a powerful tool of social control. Those who run the surveillance apparatus understand its capabilities in a way the average citizen does not. My greatest fear is seeing the full might of the surveillance apparatus unleashed against a despised minority, in a democratic country.

What we’ve done as technologists is leave a loaded gun lying around, in the hopes that no one will ever pick it up and use it.

.. I am very suspicious of attempts to change the world that can’t first work on a local scale. If after decades we can’t improve quality of life in places where the tech élite actually lives, why would we possibly make life better anywhere else?

.. We should not listen to people who promise to make Mars safe for human habitation, until we have seen them make Oakland safe for human habitation.

.. The goal should be not to make the apparatus of surveillance politically accountable (though that is a great goal), but to dismantle it.

.. This is not the first time an enthusiastic group of nerds has decided to treat the rest of the world as a science experiment. Earlier attempts to create a rationalist Utopia failed for interesting reasons, and since we bought those lessons at a great price, it would be a shame not to learn them.

There is also prior art in attempts at achieving immortality, limitless wealth, and Galactic domination. We even know what happens if you try to keep dossiers on an entire country.

If we’re going to try all these things again, let’s at least learn from our past, so we can fail in interesting new ways, instead of failing in the same exasperating ways as last time.

Edward Snowden Is Strangely Free as a Robot

A couple nights earlier, at the New York Times building, Wizner had watched Snowden trounce Fareed Zakaria in a public debate over computer encryption. “He did Tribeca,” the lawyer added, referring to a surprise appearance at the film festival, where Snowden had drawn gasps as he crossed the stage at an event called the Disruptive Innovation Awards.

.. Snowden’s body might be confined to Moscow, but the former NSA computer specialist has hacked a work-around: a robot. If he wants to make his physical presence felt in the United States, he can connect to a wheeled contraption called a BeamPro, a flat-screen monitor that stands atop a pair of legs, five-foot-two in all, with a camera that acts as a swiveling Cyclops eye.

.. It all amounts to an unprecedented act of defiance, a genuine enemy of the state carousing in plain view.

.. Glenn Greenwald, one of Snowden’s original journalistic collaborators, jokingly talks about taking the Snowbot on the road. “I would love to let it loose in the parking lot of Fort Meade,” where the NSA is headquartered, he said.

The Privacy Problem with Digital Assistants

Questions directed at Siri and Google’s voice search get sent to their respective companies, paired with unique device IDs that aren’t connected to specific users. Apple stores Siri requests with device IDs for six months, and then deletes the ID and keeps the audio for another eighteen months.

.. Allo will, however, come with an “incognito mode” that enables end-to-end encryption and makes message history disappear after a while, à la Snapchat.

In the Apple Case, a Debate Over Data Hits Home

A Wall Street Journal/NBC News survey released last week found that 42 percent of Americans believed Apple should cooperate with law enforcement officials to help them gain access to the locked phone, while 47 percent said Apple should not cooperate. Asked to weigh the need to monitor terrorists against the threat of violating privacy rights, the country was almost equally split, the survey found.

.. A CNN poll the same month found that 45 percent of Americans were somewhat or very worried that they or someone in their family would become a victim of terrorism.

.. Now, people are beginning to understand that their smartphones are just the beginning. Smart televisions, Google cars, Nest thermostats and web-enabled Barbie dolls are next.

.. Officials had hoped the Apple case involving a terrorist’s iPhone would rally the public behind what they see as the need to have some access to information on smartphones. But many in the administration have begun to suspect that the F.B.I. and the Justice Department may have made a major strategic error by pushing the case into the public consciousness.