Sir Tim versus Black Mirror

On a sunny summer morning in June, Professor Jonathan Zittrain is hosting Sir Tim Berners-Lee in a Harvard Law School classroom. The audience is a smattering of visiting scholars at the Berkman Klein Center for Internet & Society and a few local techies involved with open source software development. I’d come to the room half an hour early to snag a seat, but I needn’t have bothered: the crowd here to see the man who invented the World Wide Web is attentive but thin.

Jonathan Zittrain, one of the world’s leading scholars of creativity in an internet-connected universe, points out that Sir Tim’s current work attempts a second correction in the arc of the internet. His first innovation, thirty years ago, was “the conceptualization and the runaway success of the World Wide Web.” Sir Tim’s current idea is a protocol — Solid — and a company — Inrupt — which aim to make the Web as it is now significantly better. Just what are Solid and Inrupt? That’s what a handful of us are here to find out.

Sir Tim draws an arc on the chalkboard behind him. “People talk about the meteoric rise of the web — of course, meteors go down.” Referencing internet disinformation expert Joan Donovan, sitting in the audience, he notes, “If you study the bad things on the web, there’s hundreds and thousands to study.” Almost apologetically, he explains that “there was a time when you could see things that were new [online], but not the ways they were bad.” For Sir Tim, the days of blogs were pretty good ones. “When you made a blog, you tried to make it high quality, and you tried to make your links to high quality blogs. You as a blogger were motivated by your reading counter, which led to a virtuous system based on custodianship as well as authorship.” Wistfully, he notes, “You could be forgiven for being fairly utopian in those days.”

What came out of this moment in the web’s evolution was a “true scale-free network, based on HTTP and HTML.” (Scale-free networks follow a Pareto distribution, with a small number of highly connected nodes and a “long tail” of less-connected nodes.) “It was extraordinary to discover that when you connect humanity, they form scale-free networks at all different levels. We put out HTTP and HTML and ended up with humanity forming scale-free networks on a planetary — okay, a tenth of a planet — scale.”
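That heavy-tailed structure is easy to reproduce in a toy model. Below is a sketch of my own (not anything shown in the talk): a minimal preferential-attachment simulation in Python, where each new node links to an existing node with probability proportional to its degree, producing a few heavily linked hubs and a long tail of barely connected nodes.

```python
import random
from collections import Counter

def preferential_attachment(n_nodes: int, seed: int = 42) -> Counter:
    """Grow a network where new nodes attach to existing nodes with
    probability proportional to current degree ("rich get richer")."""
    random.seed(seed)
    stubs = [0, 1]  # one entry per link endpoint; sampling it is degree-biased
    for new_node in range(2, n_nodes):
        target = random.choice(stubs)
        stubs += [new_node, target]
    return Counter(stubs)  # node -> degree

degrees = preferential_attachment(100_000)
print("top hubs:", degrees.most_common(3))         # a few very large hubs
ranked = sorted(degrees.values())
print("median degree:", ranked[len(ranked) // 2])  # long tail: mostly 1 or 2
```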

Sir Tim notes that much of what is most interesting about the web lives in the long tail, the less connected and less popular nodes. Zittrain invokes philosopher David Weinberger’s maxim, “In the future, everyone will be famous for 15 people,” to acknowledge this idea, and Sir Tim pushes back: “That’s not scale free. What’s possible is that for n people on the planet, we might have √n groups. We’re not trying to make one network for everyone, not trying to design something for Justin Bieber tweeting.”

So why doesn’t the blogosphere still work? Sir Tim blames the Facebook algorithms which determine what you read, breaking network effects and leading to a huge amount of consolidation. Zittrain wonders whether Facebook’s power is really all that new — didn’t Google’s search algorithm have similar effects? Sir Tim demurs — “Google just looks at all links and takes an eigenvector — it’s still using the web to search.” There’s a fascinating parenthetical where Sir Tim explains that he never thought search engines were possible. “Originally, we thought no one would be able to crawl the entire web — you would need so much storage, it wouldn’t be possible. We hadn’t realized that disk space would become ridiculously cheap.” Jonathan Zittrain likens the moment when Google came into being to a science fiction moment in which our comprehension of the universe, previously limited by the speed of light, suddenly transcends that barrier — prior to search, we might only know our local quadrant of the web, while search suddenly made it possible to encounter any content, anywhere.
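The “eigenvector” remark is a nod to the idea behind PageRank: a page’s importance is, roughly, its weight in the dominant eigenvector of the link matrix, which power iteration approximates. A minimal sketch of the idea (mine; Google’s production system is far more elaborate):

```python
# A toy version of "take an eigenvector of the links": PageRank by
# power iteration on a four-page web. links[i] lists pages i points to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = len(links)
damping = 0.85

rank = [1.0 / n] * n
for _ in range(50):  # power iteration converges fast on a tiny graph
    new = [(1 - damping) / n] * n
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new[target] += share
    rank = new

print([round(r, 3) for r in rank])  # page 2, the most linked-to, wins
```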

Sir Tim brings us back to earth by discussing clickbait. “Blogging was driven by excitement around readership. But eventually ads come into play — if I am writing, I should have recompense.” What follows is content written specifically to generate money, like the fake news content written by Macedonian bloggers that might have influenced US elections. Zittrain generously references my “The Internet’s Original Sin” article, and Sir Tim notes that “some people argue that if you start off with advertising, you’re never going to have a successful web.”

The consequence of a monetized web, Sir Tim believes, is consolidation, designed to give advertisers larger audiences to reach. That consolidation leads to silos: “My photos are on Flickr, but my colleagues are all on LinkedIn? How do I share them? Do I have to persuade all my friends to move over to the platform I’m on?”

Zittrain offers two possible solutions to the problem: interoperability, where everything shares some common data models and can exchange data, or dramatic consolidation, where LinkedIn, for instance, just runs everything. Sir Tim isn’t overly optimistic about either, noting that totalitarian societies might be able to demand deep interop, but that it seems unlikely in our market democracy. And while consolidation is easier to work within, “consolidation is also incredibly frustrating. If you want to make a Facebook app, you need to work within not only the Facebook API, but the Facebook paradigm, with users, groups, and likes. Silos are very bad for innovation.”

Returning to the arc he’s drawn on the blackboard, Sir Tim notes that the meteor is crashing into earth. “We don’t need to imagine future web dystopias. We’ve got a television show where every single episode illustrates a different form of dysfunction.” The arc of the Web is long and it leads towards Black Mirror.

In March of this year, Sir Tim launched the #ForTheWeb campaign to celebrate the thirtieth anniversary of the Web. For Sir Tim, the campaign was meant to feature the web worth saving, not to demand that either governments or Facebook fix it for us. “We need to fix networks and communities all at once, because it’s a sociotechnical system,” he explains. “We need to work inside the companies and inside the government. Some things are simple to fix — net neutrality, cheaper broadband, those were relatively simple. This isn’t simple. Free speech and hate speech are complicated and need complex social processes around them.” And while #ForTheWeb is a space for articulating the key values we want to support for a future direction of the web, that new direction needs a technical component as well. We need a course correction — what’s the White Mirror scenario?

Sir Tim pushes up the blackboard featuring the web as a meteor crashing back to earth. On the board below it, he starts drawing a set of cylinders. Solid is based around the idea of pods, personal data stores that could live in the cloud or which you could control directly. “Solid is web technology reapplied,” Sir Tim explains. “You use apps and web apps, but they don’t store your data at all.”

Returning to his photo sharing scenario, Sir Tim imagines uploading photos taken from a digital camera. The camera asks where you want to store the data. “You have a Solid pod at home, and one at work — you decide where to put them based on what context you want to use them in. Solid is a protocol, like the web. Pods are Solid-compatible personal clouds. Apps can talk to your pod.” So sharing photos is no longer about making LinkedIn and Flickr talk to each other — it’s simply about both of them talking to your pod, which you control.

“The web was all about interoperability — this is a solution for interoperability,” explains Sir Tim. “You choose where to store your information, and the pods do access control. There’s a single sign-on that leads to a WebID. Those WebIDs plus access controls are a common language across the Solid world.” These WebIDs support groups as well as individuals… and groups have pages where you can see who belongs to them. Apps look up the group and deliver information accordingly. The content delivery mechanism underneath Solid is WebDAV, a versioning and authoring protocol that Sir Tim has supported from very early on as a way of returning the Web to its read/write roots, though he notes that Solid plans on running on protocols that will be much faster.
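Since Solid is web technology reapplied, the app-to-pod conversation is ordinary HTTP against resources you control. Here’s a rough sketch of the shape of that exchange — the pod URL is invented, and I’ve omitted the WebID-based authentication a real Solid request would carry:

```python
# A sketch of the interaction model, not a production Solid client.
import requests

POD = "https://alice.example-pod.org"  # hypothetical personal pod

# Storing data is a plain HTTP PUT to a resource URL inside the pod.
photo = open("ankle.jpg", "rb").read()  # any local file
resp = requests.put(f"{POD}/photos/ankle.jpg", data=photo,
                    headers={"Content-Type": "image/jpeg"})
print(resp.status_code)  # 2xx on success

# Any app may then attempt a GET; the pod's access-control list decides
# who receives the photo and who gets turned away with a 403.
resp = requests.get(f"{POD}/photos/ankle.jpg")
print(resp.status_code)
```

The point of the design is visible even in this toy: LinkedIn and Flickr never need to talk to each other, only to the same pod, under permissions the owner sets.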

Zittrain picks up the legal implications of this new paradigm: “Right now, each web app or service has custody of the data it uses — LinkedIn has a proprietary data store behind it. But there might also be some regulations that govern what LinkedIn can do with that data — how does that work in a Solid world?”

Ducking the legal question, Sir Tim looks into ways we might bootstrap personal data pods. “Because of GDPR, the major platforms have been forced to create a way for people to export their content. You’d expect that Google, Facebook and others would fight this tooth and nail — instead they’re cooperating.” Specifically, they’re developing the Data Transfer Project, a common standard for data export that allows you not only to export your data, but to import it into a different platform. “They’ve gone to the trouble of designing common data models, which is brilliant from the Solid point of view.”

Zittrain suggests that we can think of Solid’s development in stages. In Stage 0, you might be able to retrieve your data from a platform, possibly from the API, possibly by scraping it, and you might get sued in the process. In Stage 1, you can get your data through a Data Transfer dump. In Stage 2, companies might begin making the data available regularly through Solid-compatible APIs. In Stage 3, the Solid apps start working off the data that’s been migrated into personal pods.

Sir Tim notes that exciting things start to happen in Stage 3. “My relationship with a bank is just a set of transactions and files. I can get a static copy of how the bank thinks of my current relationships. What I would like is for all those changes to be streamed to my Solid pod.” He concedes, “I probably don’t want to have the only copy.” Much of what’s interesting about Solid comes from the idea that pods can mirror each other in different ways — we might want to have a public debate in which all conversations are on the record and recorded, or an entirely ephemeral interaction, where all we say to one another disappears. This is one of many reasons, Sir Tim explains, that “Solid does not use Blockchain. At all.”

Zittrain persists in identifying some of the challenges of this new model, referencing the Cambridge Analytica scandal that affected Facebook. “If the problem is privacy, specifically an API that made it easy to get not only my data, but my friends’ data, how does Solid help with this? Doesn’t there need to be someone minding controls of the access lists?”

Solid, Sir Tim explains, is not primarily about privacy. Initially, people worried about their personal data leaking, a compromising photo that was supposed to be private becoming public. Now we worry about how our data is aggregated and used. The response shouldn’t be to compensate people for that data usage. Instead, we need to help combat the manipulation. “Data is not oil. It doesn’t work that way, it’s not about owning it.” One of Sir Tim’s core concerns is that companies offer valuable services, like free internet access, in exchange for access to people’s data streams.

Zittrain points out that the idea that you own your own data — which is meant to be empowering — includes a deeply disempowering possibility. You now have the alienable right of giving away your own data.

Sir Tim is more excited about the upsides: “In a Solid world, my doctor has a Solid ID and I can choose the family photo that has a picture of my ankle and send it to the doctor for diagnosis. And I can access my medical data and share it with my cousin, if I choose.” Financial software interoperates smoothly, giving you access to your full financial picture. “All your fitness stuff is in your Solid Pod, and data from your friends if they want to share it so you can compete.” He imagines a record of purchases you’ve made on different sites, not just Amazon, and the possibility of running your own AI on top of it to make recommendations on what to buy next.

A member of the audience asks whether it’s really realistic for individuals to make decisions about how to share their data — we may not know what data it is unsafe to share, once it gets collected and aggregated. Can Solid really prevent data misuse?

“The Solid protocol doesn’t tell you whether these services spy on you, but the spirit of Solid is that they don’t,” offers Sir Tim. Apps are agents acting on your behalf. Not all Solid apps will be beneficent, he notes, but we can train certified developers to make beneficent apps, and offer a store of such apps. Zittrain, who wrote a terrific book about the ways in which app stores can strangle innovation, is visibly uncomfortable and suggests that people may need help knowing whom to trust in a Solid world. “Imagine a party able to be designated as a helper with respect to privacy. Maybe a grandchild is a helper for a grandmother. Maybe we need a new role in society — a fiduciary whose responsibility is to help you make trust decisions.” Zittrain’s question links Sir Tim’s ideas about Solid to an idea he’s been developing with Jack Balkin about information fiduciaries: the notion that platforms like Facebook might be required to treat our personal data with the legal respect that doctors, lawyers and accountants are forced to apply to it.

Another questioner wonders who will provide the hardware for Solid pods. Zittrain points out that Solid could run on Eben Moglen’s “Freedom Box”, a long-promised personal web server designed to put control of data back into users’ hands. Sir Tim suggests that your cable or ISP router might run a Pod in the future.

My question for Sir Tim focuses on adoption. Accepting for the moment the desirability of a Solid future — and, for the most part, I like Sir Tim’s vision a great deal — billions of people will, for the foreseeable future, be using proprietary social networks that surveil their users and cling to their data. When Sir Tim last disrupted the Internet, it was an academic curiosity, not an industry worth hundreds of billions. How do we get from here to there?

Sir Tim remembers the advent of the web as a struggle. “Remember when Gopher was taking off exponentially, and the web was growing really slowly? Remember that things that take off fast can drop off fast.” Gopher wasn’t free, and its proprietary nature led it to die quickly. “People seem locked into Facebook — one of the rules of Solid is not to disturb them.” Those who adopt Solid will work around them, and when people begin using Solid, that group could explode exponentially. “The billion people on Facebook don’t affect the people using a Solid community.”

Returning to the web’s early days, Sir Tim notes that it was difficult for the Web to take off — there were lots of non-internet documentation systems that seemed like they might win. What happened was that CERN’s telephone directory was put on the web, and everyone got a web browser to access that directory. It took a while before people realized that they might want to put other information on top of the directory.

“We don’t want everyone using Facebook to switch to Solid tomorrow — we couldn’t handle the scale.” Instead, Sir Tim offers, “We want people who are passionate about it to work within it. The reward is being part of another revolution.”


There’s something very surreal about a moment in which thousands of researchers and pundits are studying what’s wrong with social media and the Web, while surprisingly few are working on new models we can use to move forward. The man who built the web in the first place is now working on alternative models to save us from the Black Mirror universe, and the broader academic and professional world seems… surprisingly uninterested.

I can certainly see problems with Solid — your Pod will become a honeypot of private information, a rich target for hackers. Apps will be built to collect as much of your Pod data as possible, unless they’re both regulated and technically prevented from doing so. And unless Pods mostly live on very fast cloud services, apps that draw from multiple pods will be significantly slower than the web as it operates today.

But there’s so much to like in Sir Tim’s vision. My lab and I are working now on the idea that what the world needs now is not a better Facebook, but thousands of social networks, with different rules, purposes and community standards. Like Sir Tim, we’re not looking to replace Facebook but to create new communities for groups of 5 to 50,000: self-governing, and capable of different behaviors than communities with hundreds of millions of users and central corporate governance can manage. There’s no reason why the networks we’re imagining couldn’t live atop Solid.

It’s hard to remember how small and strange an experiment the web was in 1989, or even in 1994. I remember dropping out of graduate school to work on a web startup. My motivation wasn’t that I might make a lot of money — that seemed extraordinarily unlikely. It was that someone was willing to pay me to work on something that seemed… right. Like a plausible and desirable future. And for me, at least, Solid seems plausible and desirable in much the same way. It also seems roughly as hard to love as the Web was in 1994, with its grey backgrounds and BLINK tag — Solid.Community allows you to register an ID, which at present doesn’t seem to let you do anything, though you can read the GitHub repository and see how you might create a chat app atop Solid.

Can Sir Tim revolutionize the Internet again? I have no idea. But someone needs to, because a web that crashes to earth is a Black Mirror episode I don’t want to see.


Jaron Lanier: How the Internet Failed and How to Recreate It

Transcript

[Music]

Welcome, everybody. I’m Nathaniel Deutsch, the director of the Humanities Institute here at UC Santa Cruz, and I want to welcome everyone here to see Jaron Lanier, who will be talking tonight. I also want to welcome everybody who is watching this on the live stream we have running at the same time. We’re very thankful for the support of the Peggy Downes Baskin humanities endowment for interdisciplinary ethics, which is supporting this lecture, and we’re also very thankful to the Andrew W. Mellon Foundation for supporting the year-long series of events on data and democracy that we will be hosting at the Humanities Institute; Jaron Lanier’s talk is going to be launching that series. In addition to tonight’s event, we will be hosting a series of other events in the coming year, including Questions That Matter at the Kuumbwa Jazz Center on January 29th, to which we invite all of you (I know some of you have been to our past Questions That Matter events), and also an event that we have been planning for a while, one that has become even more necessary because of the events of recent days: a conversation on anti-Semitism and the internet, which we will be hosting on a date to be announced.

There are many people I could thank, but I’ll leave out the names (I’ve already cleared it with them) and just thank their units: the Humanities Institute staff, which is, as always, amazing, and the staff of the humanities division’s development office, which is also always amazing. So thank you, everyone, for all the work that went into this event. Tonight’s program will include a lecture, and I think some music, followed by a question-and-answer session and a book signing; I’ll say a little more about the book signing later. Questions and answers will be facilitated by note cards, and we have some ushers moving around the room. If you would like to ask a question, please raise your hand now and they will hand you a card; you can write out your question, they’ll pick it up and give it to me, and I will facilitate the question-and-answer session that way.

I’ve had the pleasure of spending the day with Jaron, and I can tell you that he is a fascinating person, and a very generous one as well with his time: he met with some students earlier today and had a conversation with them, which was wonderful for him to do, and tonight he will be giving a lecture. We’re lucky to have him here. He’s a path-breaking computer scientist and a virtual reality pioneer; if I’m not mistaken, he coined the phrase “virtual reality.” He’s a composer and artist and author who writes on numerous topics, including technology, the social impact of technology, the philosophy of consciousness and information, internet politics, and the future of humanism. One of the things that we believe in so strongly at the Humanities Institute is that conversations about technology cannot simply be left to computer scientists (no offense to any computer scientists in the room; we love you too), but that it is critical to have people who work in the humanities involved in those conversations, and this is part of why we are doing this tonight. He is the author of best-selling and award-winning books, including You Are Not a Gadget: A Manifesto and Who Owns the Future?; most recently, he’s the author of Ten Arguments for Deleting Your Social Media Accounts Right Now. His lecture tonight is entitled “How the Internet Failed and How to Recreate It.” Please join me in welcoming Jaron Lanier.

[Applause]
Hey, how are you all? Any students here? Is this all… is this the adult section? Okay, good, good. Excellent. I’m going to start with some music, because some of what I have to talk about is not the most cheerful stuff, because our times aren’t universally cheerful lately, and music is how I survive. Anyway, have any of you heard me play this thing? Okay.

[Music]

[Applause]

[Music]

You all know what this is? You all know what that is, right? Yeah. It’s called a khaen; it’s from Laos. It’s arguably the origin of digital information. If you look at it, it’s got a parallel set of objects that are either on or off; there are sixteen of them in this one, a 16-bit number. They go back many thousands of years; they appear to be older than the abacus. In ancient times they were traded across the Silk Route from Asia and were known to the ancient Greeks and Romans. The Romans made their own copy, which was called a hydraulis, and it was a giant, egotistical Roman version that was so big it had to be run on steam. It was operated by teams of slave boys, because, despite Hephaestus’s best efforts, they didn’t have computer AI yet, and the slave boys couldn’t quite operate all the planks that opened and closed the holes in sync, so they developed this crossbar system. And we know about it because there’s a surviving hydraulis, believe it or not.

That automation evolved along with the hydraulis in two directions. It turned into the medieval pipe organ, and there were player mechanisms on the earliest pipe organs, experimentally. It also turned into a family of string instruments that had various assists, like the early pre-clavichord instruments that eventually evolved into the piano. The notion of automating these things was always present, so there were always attempts to make player pianos. Around Mozart’s time, somebody made a non-deterministic player piano, which meant it didn’t play exactly the same thing twice; Mozart was inspired by that and made some music that included dice rolls. But another person who was inspired was a guy named Jacquard, who used a similar mechanism to make a programmable loom, which in turn inspired somebody named Charles Babbage to make a programmable calculator, and Ada Lovelace to articulate a lot of ideas about software for the first time, and what it meant to be a programmer. And then, in turn, all of that inspired a doomed fellow named Alan Turing to formalize the whole thing and invent the modern computer. So there’s a direct line. This is it: this is the origin of digital information.

Now, of course, it’s not the only line, and if I were paid to be a historian, I wouldn’t have told you that story with such authority. And yet I’m not, so… This is a charming tale, and it’s a happy place to begin. It’s a reminder that inventions can bring delight and joy, and it’s part of why I’m a technologist. But unfortunately, we have some matters to discuss here that are not quite so happy.
09:57
historical lensing effect where it feels
10:00
worse than ever it’s bad in a new way
10:03
there’s something weird going on and I
10:05
want to begin by trying to distinguish
10:09
what’s going on with our present moment
10:11
of darkness as compared to earlier times
10:14
because this is tricky
10:16
it’s almost impossible I think to not be
10:20
embedded in one’s moment in time it’s
10:22
almost impossible not to have illusions
10:27
due to where you’re situated right and
10:29
so I don’t claim to have perfected the
10:32
art of absolute objectivity at all I’m
10:35
struggling and I’m sure that I don’t
10:38
have it quite right but I want to share
10:39
with you my attempts up to this
10:41
now the first thing to say is that by
10:46
many extremely crucial measures we’re
10:49
living in spectacularly good times where
10:53
the beneficiaries of a steady
10:55
improvement in the average standard of
10:58
living in the world we’ve seen a
11:01
lowering of most kinds of violence we’ve
11:04
seen an improvement in health in most
11:07
ways and for most people it’s actually
11:11
kind of remarkable in many ways these
11:14
are really good times and those trend
11:17
lines go way back over over centuries
11:20
we’ve seen steady improvement of
11:22
societies kind of gotten its act
11:23
together and we’ve been able to hold on
11:26
to a few memories about things that
11:28
didn’t work so we’ve tried new things
11:30
we’ve we’ve developed relatively more
11:33
humane societies and relatively better
11:36
science and better Public Health and
11:38
it’s amazing it’s wonderful it’s
11:42
something that’s a precious gift to us
11:47
from earlier generations that we should
11:49
be unendingly grateful for and I always
11:54
keep that in mind I always keep in mind
11:56
that just in our modern human-made world
11:59
just the fact you can walk into a
12:01
building and it doesn’t collapse on us
12:02
is a tribute to the people who made it
12:04
and the people who funded them and
12:07
regulated them and the people that
12:09
taught them there’s like this whole
12:11
edifice of love that’s apparent all the
12:14
time that we can forget about and during
12:16
times that feel dark one of the
12:18
antidotes is gratitude and just in these
12:20
simple things
12:21
I feel extraordinary gratitude and it
12:25
reminds me of how overall there’s been
12:27
so much success in the project of
12:29
Science and Technology it’s so easy to
12:33
lose sight of that and yet there is
12:35
something really screwy going on that
12:38
seems to me to be fairly distinct from
12:42
previous problems it’s a new sneaky
12:45
problem we’ve brought upon ourselves and
12:48
we have yet to fully invent our way out
12:50
of it
12:51
so what exactly is going on
12:54
I think at a most fundamental level
12:59
we’ve created a way of managing
13:03
information among ourselves that
13:05
detaches us from reality I think that is
13:10
the most serious problem if the only
13:14
problem was that our technology makes us
13:18
at times more batty
13:21
more irritable paranoid more
13:26
mean-spirited more separated more lonely
13:30
if that kind of problem was what we were
13:33
talking about that would be important it
13:36
would be serious it would be important
13:38
to address it but what really scares me
13:42
about the present moment is that I fear
13:44
we’ve lost the ability to have a
13:47
societal conversation about actual
13:49
reality about things like climate change
13:53
the need to have adequate food and water
13:56
for peak population which is coming the
13:58
need for dealing with changes in the
14:01
profile of diseases that are coming
14:04
there’s so many so many issues are real
14:08
they’re not just fantasy issues their
14:10
existential real issues climate above
14:14
all and the question is are we still
14:18
able to have a conversation about
14:20
reality or not
14:21
that becomes the existential question of
14:23
the moment and so far the way we’ve been
14:26
running things has been pulling us away
14:28
from reality that scares me and I think
14:32
that’s the core darkness that we have to
14:34
address we can survive everything else
14:36
but we cannot survive if we fail to
14:38
address that now in the title of this
14:42
lecture I promised a little bit of
14:44
history how the internet got screwed up
14:46
or something like that so I’ll tell you
14:48
a bit about that but I want to focus
14:51
more on trying to characterize this
14:54
issue a little more tightly and trying
14:58
to explain at least my thoughts on how
15:00
to remedy it and maybe some other
15:02
people’s thoughts to try to give you a
15:04
bit of a sense of it
Now, to begin with, one of the infuriating aspects of our current problem is that it was well foreseen in advance. That’s the thing about it: nobody can claim that they were surprised, and I can point to many folks who were talking about this in advance. As good a starting place as any is E.M. Forster’s story “The Machine Stops.” Who here has read it? Okay, well, a few people. Terrifying, right? “The Machine Stops” was written, I believe, in 1907 (is that right? it might have been ’09), but, you know, a century and a decade ago or so, and it foresees a world remarkably like ours. This was written well before Turing, well before any of this stuff, I mean before there was computation, and it describes a world of people in front of their screens, interacting, social networking, doing search, and getting lost in a bunch of stupidity. Finally, when the machine experiences a crash, there’s this calamity on earth: people had become so dependent on it that the loss of the machine becomes a calamity in itself. At the very end of the book, people are crawling out from their screens, looking at the real world, and saying, “Oh my god, the sun!” It’s a really amazing piece, because it’s possibly the most prescient thing that’s ever been written at all. It was written in part as a response to the techie utopianism of the day; it was a response to writers like H.G. Wells, saying: wait a second, these are still going to be people; we have to think about what this will mean to people. It’s often the case that the first arriver on a scene has a clearer view and can have a kind of lucidity that later people find very difficult to achieve, and I think something like that happened very long ago.

But then, honestly, we could talk about Turing’s last writings, just before his suicide, where he was realizing that even though he had played as great a role as anyone in defeating fascism, he hadn’t defeated fascism at all, because here he was, being destroyed for his identity. You all know the story of Turing by now; it’s not obscure anymore; there was a movie and everything. For a long time I would speak to computer science classes and nobody knew about Turing’s death at all, which is a scandal, but at this point I think everyone knows. If you read his final writings, you read, in a way, the inner glow of somebody who does have some kind of a faith, some kind of a stronger center, but also this kind of sense of defeat. And by the way, it’s within that context that he invented artificial intelligence, that he invented the Turing test and this notion of a person, a non-person, who could transcend sexuality and be just this pristine, abstract, platonic being, and escape oppression, perhaps.

But anyway, so we have that. Then, in the immediate early generation of computer scientists, we had Norbert Wiener. Who here has read Norbert Wiener? I don’t see a single young person’s hand up. If you’re young, if you’re a student, and you haven’t read any of these people, would you please correct that and read them? Seriously, you’ll be so happy if you take this advice: actually read these people. Norbert Wiener was one of the very first computer scientists, first generation, and he wrote books that were incredibly prescient about this. He wrote a book called The Human Use of Human Beings, and he pointed out that if you could attach a computer to input and output devices interacting with a person, you could get algorithms that would enact adaptive behavior-modification technologies to take control of the person, and he viewed this as an extraordinary moral failure to be avoided. In a thought experiment at the end of the book, he says: well, you could imagine some kind of global system where everybody would have devices on them, attached to such algorithms, that would be manipulating them in ways they couldn’t quite follow, and this would bring humanity to a disastrous end. But of course this is only a thought experiment; no such thing is feasible, because there wouldn’t be enough bandwidth on the radio waves, and all this. You know, he then explained why it couldn’t be done. And of course we built exactly the thing he warned about.

I could give many other examples. I worked on it myself: in ’92 I wrote an essay describing how little AI bots could create fake social perception in order to confuse people and throw elections. Big deal. Lots of people were prescient about this; this wasn’t a surprise. We knew, and that’s the thing that’s so depressing. There was a lot of good cautionary science fiction, there were a lot of good cautionary essays, there were good cautionary technical writings, and we ignored all of it. We ignored it all. How could that have happened? I would rather tell the story about how everybody was surprised. And a lot of people who are entrepreneurs in Silicon Valley were surprised, but only because they don’t like reading. Don’t be like them.

So the social history of how everything screwed up is a reasonable way to talk about the particular way in which it screwed up, so I’m going to give it a try.
The first thing to say is that in the generation of media technologists and artists and viewers from immediately before computation went pop, in like the 60s into the 70s into the 80s, some of the personality dysfunctions and some of the craziness were already apparent. We started to see this notion that anybody could be a celebrity, and people became obsessed with the idea: maybe I could be one, and maybe there’s something wrong with me if I’m not; this kind of mass-media insecurity-obsession thing. It’s hard to trace the moment when this personality dysfunction really hit the mainstream and really started to darken the world. We were talking earlier, actually, about which moment to choose. I was thinking of the assassination of John Lennon, because here you had somebody who basically just wanted to be famous for being a random killer, and that was a little new. If you look at crappy, evil people earlier, sure, there were some who wanted to be famous (I don’t know, Bonnie and Clyde or something like that), but there were a few things different about them. One is that they were also stealing money; there was a kind of way in which they were, I don’t know, part of a system. They had peers; they weren’t typically total loners. The most typical profile of a really evil person before was actually a hyper-conformist: the typical Nazi was somebody who didn’t want to stand out, who was just going with the flow and had fully internalized the social milieu around them, because it felt normal. That’s been a much more typical way that people have behaved appallingly in history. This sort of weird loner celebrity-seeker thing, I’m sure it existed before, but it started to become prominent.

I want to say something I’ve never said publicly before; it’s just been gnawing at me for many years. I’m old enough to have had some contact, back in the day, with both Marshall McLuhan and Andy Warhol, who were two figures who had a kind of loose way of talking about this early on. But they didn’t condemn it. They just stood aloof and said: oh, we’re super smart and wise for being able to see this happening. And what they should have done is they should have said that this is wrong. It’s actually really been bothering me; I’ve never said that before, and I feel it should be said, because once again, the first people on the scene sometimes have a kind of vision, and they should be judgmental about it, the way E.M. Forster was. I feel like they maybe failed us morally at that point, because they saw it better than a lot of other people, maybe better than anybody, at that time. Anyway, that’s maybe not useful to say now, but at some point it has to be said.

Let’s fast-forward a little bit.
Computation starts to get cheap enough that it’s starting to creep out of the lab; this is the early 1980s. And here we hit another juncture. There was this thing that happened (oh man, I was right there for it): the birth of the open, free software idea. There was a friend of mine named Richard Stallman. Any chance Richard’s here? No, I guess not; anyway, you never know, I’ve seen odder things. Richard had this horrible experience: one day he just started saying, “Oh my god, my girlfriend’s been killed, my lover’s been killed.” I said, “Oh my god, that’s horrible.” But what it really was, was the software system he’d been working on for this kind of computer. What had happened is that it had gone into a commercial mode with the companies; it was, I think, all about the Lisp machine, which probably nobody remembers anymore, a sort of early attempt to make an AI-specialized computer. And he was upset. He sort of melded his anger about this with a kind of anti-capitalist feeling and said: no, software must be free; it must just be a thing that’s distributed; it can’t be property; property is theft. And it really spoke to a lot of people. It melded with these other ideas that were going on at the time, and so it became this kind of feeling, sort of a leftist feeling I would say, that was profound and remains to this day. A lot of the time, if somebody wants to do something useful with tech, they’ll have to put in the word “open-source.” Lately they also have to put in “blockchain.” So very typically it’s open source, it’s got blockchain, and then, you know, it’s good.

There was this other thing going on, which is this feeling that the purpose of computers was to hide, and that deserves a little bit of explanation. America has always had this divide, this red-blue divide or whatever; remember, it used to be a north-south divide, and we fought one of history’s horrible wars over it once, the Civil War. People on what we’d now call the red side of the divide were very upset. There was a Democratic president named Jimmy Carter, whom a few people other than me in the room might be old enough to remember, and there was a period when there was an Arab oil embargo and we had long lines at gas stations, and he imposed a 55-mile-an-hour speed limit on the freeways, which a lot of people really hated, because they wanted to drive fast. And so this thing sprang up called CB radio. CB radios were these little analog radios you’d install in your car; you’d create a false persona, a handle, and then you’d warn other people about where the police were hiding, so that you could all drive fast collectively by sharing information. And it was all anonymous; it could never be traced. This thing was huge; it had as high a profile at the time as Twitter does today, probably. There were songs celebrating it. It was a really big deal. But then on the left side of America, on the blue side, people also wanted to hide, and in that case there were two things going on. One is that the draft hadn’t quite died down. It was still the Vietnam era, and that was just terrifying, because people didn’t really believe in that war, and the idea of being drafted into this horribly violent war that appeared to have no good purpose just absolutely broke people’s hearts and terrified people. So they wanted to hide, and a lot of people did. And then there was marijuana and the drug laws, and a lot of people really were hiding from those as well. So you basically had both red and blue America feeling like the number one priority for freedom, for goodness, is to be able to hide from the government. So encryption and hiding and fake personas became this celebrated thing.

In this milieu there was this idea about online networking, which didn’t really exist yet (I mean, we had networks, but they were all very specialized and isolated; there wasn’t a broad internet yet): the idea that everything would be free and open, everything would be anonymous, and it would just be like this giant, black, weird place where you never knew anything, but you were also free and nobody could find you. Hmm. Okay, so that was this starting idea. There were a few other things that fed into it. Another thing was that there was a famous rock band called the Grateful Dead that encouraged people to tape their songs and didn’t care about privacy, and all this. There are all these different factors.

Now, all this was going on, and then simultaneously this other thing happened, which is that we started to have the figure of the glorified, practically superhuman tech entrepreneur. This was in the 80s, and these were figures like Steve Jobs and Bill Gates, people we still remember (of course, Bill is still with us), and they were just worshipped. They were the coolest people ever. Well, around here in California people hated Bill, but they loved Steve. And there was this kind of interesting problem, which is that we didn’t just like our tech entrepreneurs; we made them into sort of superhuman figures. The phrase “dent the universe” is associated with Jobs. It’s this notion that there’s this kind of messianic superpower to create the flow of reality, to direct the future, because you are the tech entrepreneur, and computation is reality, and the way we set these architectures will create future societies, and that’ll ultimately change the shape of the universe once we get even greater powers over physics. There was just no end to the fantastical thinking; we were at the birth point for every form of absolute, godlike power: you know, immortality and shape-shifting and every crazy thing. I was a little bit of that, I’m sorry to say; I kind of got a little off. I was pretty intense in the 80s myself. But anyway, there was this feeling that the entrepreneur just had more cosmic power than the average person.

Okay, so now here you have a dilemma that had been kind of sneaking up on us, and nobody had really faced it. On the one hand, everything’s supposed to be free, everything’s supposed to be anonymous, everything is supposed to be this completely open thing. But on the other hand, we love our entrepreneurs, we worship our entrepreneurs, the entrepreneurs are inventing reality. So it should be clear that there’s a bit of a potential conflict here: everything must be free, but we worship entrepreneurs. How do we do it? How do we do it? How do we do it? And so a set of compromises was created over the years that ended up giving us the worst of both sides of that, I would say. The story is long and interesting, but I’ll give you just a few highlights.

One thing that happened is that when we finally got around to actually creating the Internet, we decided it had to be super bare-bones. It would represent machines, because without having a number representing a machine you can’t have an Internet, but it wouldn’t represent people. It didn’t intrinsically have accounts built in for humans; it had no storage for humans built in; it had no transactions; it had no authentication; it had no persistence of information guaranteed; it had no historical function. It was super bare-bones: this thing connects with that thing, and that’s all it did. And the reason why was that we were supposed to leave room for future entrepreneurs, those whom we worshipped.

You know, if I were to say that the Internet was invented by Al Gore, some of you would laugh, and that’s because it was a laugh line for a while: he was a Democrat, he was a vice president, and before that a senator from Tennessee, and he was accused of over-claiming, on a TV show, that he’d invented the internet, which didn’t happen. However, I think he should claim it. I think he did invent it. He didn’t invent it technologically, not at all; all of the underlying stuff, which is called a packet-switched network, and a few other elements, existed in lots of instances from before. But he had this idea of throwing some government money into it to bribe everybody to become interoperable, so there’d just be one damn network and people could actually connect. That really was him, and he deserves credit for having done that, unless you think it was a terrible idea. But when that was happening, I remember having conversations about it, like: by creating this thing in such an incredibly bare-bones way, we are creating gifts of hundreds of billions for persons unknown, who will be required to fill in these missing things that everybody knows have to be filled in.

And then a little while later this other thing happened, which is that Tim Berners-Lee, who’s great, came up with the World Wide Web protocol. And here he did this thing: up to that point, all of the ideas for how to create shareable media experiences online (which are called hypertext, after Ted Nelson, who had come up with the first network design back in 1960; the “HT” in HTTP is from his “hypertext”) shared a core tenet: any time one thing on the internet pointed at something else, that other thing had to know it was being pointed at, so that there were two-way links. You always knew who was pointing at you. And the reason for that is that, that way, you could preserve context, provenance, history. You could create chains of payment, where if people mashed up stuff from somebody else, and that person had mashed up from somebody else, you could make payments that would propagate back to pay everybody who contributed. So if you wanted to have an economy of information, you could; the information wouldn’t be dropped. But Tim just had one-way links: you could point at somebody and they’d have no idea that they were being pointed at. And the reason for that is that actually doing two-way links is genuinely a pain in the butt; it’s just more work. If you do one-way links, the whole thing can spread a lot faster; anybody can do it; it’s just a much easier system. And that embedded in it this idea of virality, or meme-ness, where whatever can spread the fastest is what wins. It was a quantity-over-quality thing, in my view. That was another thing that happened.
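To make the one-way versus two-way distinction concrete: a two-way-link system has to maintain a reverse index of who points at whom, which is exactly the bookkeeping the web skipped. A minimal Python sketch of that idea (an illustration of the concept, not Nelson’s actual Xanadu design):

```python
from collections import defaultdict

# Forward links are all the web records: source -> targets.
forward = defaultdict(set)
# A two-way system also maintains the reverse index: target -> sources.
backlinks = defaultdict(set)

def add_link(source: str, target: str) -> None:
    forward[source].add(target)
    backlinks[target].add(source)  # the extra bookkeeping the web skipped

add_link("blog.example/post", "essay.example/original")
add_link("zine.example/mashup", "essay.example/original")

# With backlinks, a cited document knows its citers, so provenance
# (or royalty payments) could propagate back toward contributors.
print(backlinks["essay.example/original"])
# {'blog.example/post', 'zine.example/mashup'}
```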
Another thing that happened didn’t come from Silicon Valley. In the late 80s, people on Wall Street started to use automated trading; the first flash crash from out-of-control trading algorithms was in ’89. And they figured out something very basic (although E.M. Forster had described exactly this problem so much earlier), which is that if you had a bigger computer than everybody else, and it was more central, getting more information, you could calculate ahead of everybody and gain an information advantage. And in economics, information advantage is everything: if you had just a little bit more information than everybody else, you could just turn that into money. It wasn’t really a new insight, but it hadn’t actually been implemented before then. Shortly after that, a company called Walmart realized they could apply that not just to financial instruments, to investments, but to the real world. They created a software model of their supply chain and dominated it. They could go to anybody who was involved somewhere in giving them products and figure out what their bottom line was, so they could negotiate everybody down. They knew who everybody’s competitor was. They went into every negotiation with superior information, and they built this giant retail empire on information superiority. That all happened before anybody in Silicon Valley started doing it.

Okay, now fast-forward to the birth of Google. You have these super bright kids, Sergey and Larry (some of the students I talked to today on campus here remind me of what they were like at the same age): super bright, super optimistic, idealistic, actually focused. And they were backed into a corner, in my view. On the one hand, the whole hacker community, the whole tech community, would have just slammed them if they did anything other than everything being free. But on the other hand, everybody wanted them to be the next Steve Jobs, the next Bill Gates. That was practically a hunger: we want our next star. And the only way to combine the two things was the advertising model. The advertising model would say: you’ll get everything for free; as far as you’re concerned, your experience is that you just ask for what you want and we give it to you. Now, the problem with that is that because it’s an advertising thing, you’re actually being observed. Your information is being taken; you’re being watched. And there’s a true customer, this other person off to the side, whom at first you were always aware of, because you could see their little ads, you know, like your local dentist or whatever. It was cute at first; it was harmless at first.

Now, if they had come up with this thing after, I don’t know, Moore’s law had ended, and computers were as fast as they were ever going to get, and we’d established a whole regulatory and ethical substrate for computation and everything, maybe it could have worked. But instead they did it in a period when there was still a whole lot of Moore’s law to happen. So all the computers got faster and faster, cheaper and cheaper, more and more plentiful; more and more storage, more and more connection. The algorithms got better and better; machine learning kind of started to work a little better. A lot of these algorithms kind of figured it out; we had enough computation to do experiments and get all kinds of things working that hadn’t worked before, all kinds of little machine vision things (I sold them a machine vision company, actually). And the whole thing kind of accelerated, and what started out as an advertising model turned into something very different. And so here we get to a description of, at least, my perception of the state that we’re in right now.
I mentioned earlier that Norbert Wiener had described what he viewed as a potentially horrible outcome for the future of computation, where you’d have a computer in real time observing a person with sensors and providing stimulus to that person in some form, with displays or other effectors, implementing behavior-modification feedback loops in order to influence the person. And if that were done globally, it would detach humanity from reality and bring our species to an end. That was the fear, back in the 50s. Now, unfortunately, this innocent little advertising model, which was supposed to address both the desire to have everything be this Wild West open thing and the desire to have entrepreneurs despite everything being free, landed us right in that pocket. That’s exactly where we went.

Now, I should say a bit about behaviorism, because that’s another historical thread that led to where we are. Behaviorism is a discipline of reducing the number of variables in the training of an organism so that you can control them rigorously and reproduce effects. So, let’s say: if you’re whispering into your horse’s ear while you’re training your horse, that’s not behaviorism. If you’re whispering into your kid’s ear, even if you do offer some treats once in a while to encourage behavior, that’s not behaviorism; it has elements of it. But hardcore behaviorism reduces the variables and says: look, what we want to do is isolate. We want to say: here’s this organism, it’s in a box (sometimes they’re called Skinner boxes, after B.F. Skinner, one of the famous behaviorists), and if the creature (person, human, whatever) does a certain thing you want, you give it a treat; if it does something you don’t want, you give it a punishment. Typically, maybe, candy and electric shocks.
The timing and the occurrence of these things is guided by an algorithm; you refine the algorithm to discover how to change behavior patterns. This science of studying behavior, behaviorism, yielded surprises, really interesting surprises, very early on. The first celebrity behaviorist was probably Pavlov. You've all heard of Pavlov, I'm sure, and he demonstrated famously that he could get a dog to salivate upon hearing a bell, whereas previously the dogs had salivated upon being given food and hearing the bell. So he was able to create a purely symbolic-seeming stimulus to replace the original concrete one. That's quite important, because in many areas today where behavior is modified and addictions are created, there are only abstract stimuli. This is true, for instance, for gambling (modern gambling is based on this), and so are little games like Candy Crush, where there are pictures of candy instead of real candy. Now, I have no doubt that someday there'll be some Facebook or Google hovercraft, you know, a drone over your head, that drops real candy and electric shocks on your head, but for the moment we're in this symbolic realm that Pavlov uncovered.

Another amazing result is that you might think, naively, that simply providing punishment and reward as reliably and as immediately as possible would be the most effective way to change behavior patterns. But actually that's not true. It turns out that adding an element of randomness makes the algorithms more effective. Now, just to state the obvious, nobody really understands the brain as yet, but it appears that the brain is constantly in a natural state of seeking patterns, of trying to understand the world. So if you provide a slightly randomized feedback pattern, it doesn't confuse or repel the brain; instead it draws the brain in: there must be something more to understand, there must be something more. And gradually you're drawn in, more and more and more. And so this is why the randomness of when you win at gambling is actually part of the addiction algorithm; that's part of what makes it happen.
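To make that concrete, here's a minimal sketch (purely illustrative, not any real system) of why intermittent reward grips the pattern-seeking brain: a simple learner that tracks a reward expectation is never done being surprised when the reward is random, whereas a perfectly reliable reward quickly becomes boring.

```python
import random

def average_surprise(prob_reward, steps=10_000, lr=0.1):
    """Rescorla-Wagner-style learner: track an expected reward and
    measure the average absolute prediction error ('surprise')."""
    expectation = 0.0
    total_surprise = 0.0
    for _ in range(steps):
        reward = 1.0 if random.random() < prob_reward else 0.0
        error = reward - expectation     # how wrong the prediction was
        total_surprise += abs(error)
        expectation += lr * error        # learn from the error
    return total_surprise / steps

random.seed(0)
print("reliable reward (p=1.0):", round(average_surprise(1.0), 3))  # ~0: boring
print("random reward   (p=0.3):", round(average_surprise(0.3), 3))  # stays high
```

The randomized schedule keeps the prediction error, the "there must be something more to understand" signal, permanently alive; the reliable schedule extinguishes it.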
Now, in the case of social media, what happens is that the reward is when you get retweeted, or you go viral, something like that. The term of art in Silicon Valley companies is usually a "dopamine hit", which is not an entirely accurate description, but it's the one most commonly used for when you have a quick rise of a positive reward. But just as the gambler becomes addicted to the whole cycle, in which they're losing more often than they win, a Twitter addict gets addicted to the whole cycle in which they're most often being punished by other people who are tweeting, and they only get a win once in a while. It's the same algorithm.

And indeed, in the trade, the terminology we use is "engagement". We have algorithms that drive engagement, and we hire zillions of people with recent PhDs from psych departments. There's a whole program, a program called Persuasive Technology at Stanford, where you can go get a degree in this, and then you get hired by some tech company to drive engagement. But it's really just a sanitized word for addiction. So we drive addiction using a variety of these algorithms, and we can study them more than the classical behaviorists could, because we can study a hundred million instances at once. We can put out a hundred million variations on all kinds of people, correlate them with data about all those people, and then cycle and cycle and cycle. The algorithms can find new pockets of efficacy; they can tweak themselves until they work better, and we don't even know why. They're far ahead of any ability we have to keep up with them and interpret exactly why some things work better than other things.
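Mechanically, the "hundred million variations, cycle and cycle" loop he's describing works a lot like a multi-armed bandit. A minimal epsilon-greedy sketch (hypothetical content variants and numbers; not any company's actual code) shows how such a system converges on whatever gets a rise out of people, with no one ever understanding why it works:

```python
import random

# Hypothetical content variants being tested on users; the "reward"
# is whether a user engaged (clicked, replied, lingered).
variants = ["A", "B", "C"]
true_engagement = {"A": 0.02, "B": 0.05, "C": 0.03}  # unknown to the system

counts = {v: 0 for v in variants}
values = {v: 0.0 for v in variants}   # running mean engagement per variant

def choose(eps=0.1):
    # Mostly exploit the best-known variant, sometimes explore.
    if random.random() < eps:
        return random.choice(variants)
    return max(variants, key=lambda v: values[v])

random.seed(1)
for _ in range(100_000):   # each iteration = one impression shown to a user
    v = choose()
    reward = 1.0 if random.random() < true_engagement[v] else 0.0
    counts[v] += 1
    values[v] += (reward - values[v]) / counts[v]   # incremental mean

print(values)   # the estimates converge on B, the most "engaging" variant
```

The point is the last line: the system reliably discovers the most engaging variant while remaining a black box about why it engages.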
Now, even so, it's important to get this right: the effect is, in a way, not that dramatic. Facebook, for instance, has published research bragging that it can make people sad without their realizing that they were made sad by Facebook. Now, by the way, you might wonder why Facebook would publish that. Wouldn't they want to hide that fact? It sounds pretty bad. But you have to remember that you're not the customer of Facebook; the customer is the person off to the side. We've created a world in which any time two people connect online, it's financed by a third person who believes they can manipulate the first two. So to the degree Facebook can convince that third party, that mysterious other who's hoping to have influence, that they can have some mystical, magical, unbounded, sneaky form of influence, Facebook makes more money. That's why they published it. And I've been at events where this stuff is sold by the various tech companies, and there's no end to the brags and the exaggerations when it comes to telling the true customers what their powers are; it's very different from their public stance.

But at any rate, the darkness of this all is that when you use this technique to addict people (and we haven't even gotten to the final stage of influencing their behavior patterns; we're still just at the first stage of getting them addicted), you create personality dysfunctions associated with addiction, because it is a form of behavioral addiction. If any of you have ever dealt with somebody who's a gambling addict: the technical qualities of gambling addiction are similar to the technical qualities of social media addiction.

Now, I was just saying that we have to get this right and understand the degree of awfulness here, because it's actually kind of slight, but very consistent and distributed. A gambling addiction can be really ruinous; somebody can destroy their life and their family. A social media addiction can be ruinous too, as we've seen from unfortunate events in just the last few days. But more often there's a statistical distribution, where a percentage of people are slightly affected and have their personalities slightly changed. So what will happen is that some percentage (in some of the studies I've seen published, maybe it's like 5%) show something like a three percent change in personality. And this is over hundreds of millions of people, or even over billions. So it's a very slight, very distributed statistical effect, with just a few people who are really dramatically affected. But the problem is that it compounds, like compound interest: a slight effect that's persistent, consistent, repeated, starts to darken the whole society.
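To see why "slight but distributed" still matters, take the talk's own illustrative numbers at face value: if 5% of users show a 3% personality change, the average shift across the whole population is 0.05 × 0.03 = 0.0015, or 0.15%. That sounds negligible, but a persistent effect re-applied period after period compounds like interest: (1.0015)^20 ≈ 1.03, so after twenty such periods the population as a whole has drifted about as far as the affected subgroup did in one. (A back-of-the-envelope illustration, not a model from the talk.)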
So let's talk a little bit about the addictive personality that's brought out by these things. The way I characterize it is: it becomes paranoid, insecure, a little sadistic; it becomes cranky. Now, why those qualities? I have a hypothesis about this, and here I'm hypothesizing a little ahead of experimental results in science, so I want to make that clear: this is a conjecture, not something I can cite direct evidence for. But all the components of it are well studied, so it's just putting together things that are known, and I think it should therefore be worthy of public discussion.

You can very roughly bundle emotional responses from people into two kinds of bins, one we'll call positive and the other we'll call negative. The positive ones are things like affection, trust, optimism about a person, belief in a person, faith in a person, comfort with a person, relaxing around a person, all that kind of stuff: the qualities you want to feel in yourself when you're dating somebody, let's say. The negative ones are things like fear, anger, jealousy, rage, feeling aggrieved, feeling a need for revenge, all this stuff. Now, in the negative bin, a lot of these emotions are similar to another bin that's been described over many years, which is the startle responses, or the fight-or-flight responses. And the thing about these negative ones is that they rise quickly and take a while to fall. You can become scared really fast; you can become angry really fast. The related positive emotions tend to rise more slowly but can drop quickly; they have the reverse time profile. It takes a long time to build trust, but you can lose trust very quickly. It takes a long time to become relaxed, compared to how quickly you can become startled, scared, nervous, on edge. Now, this isn't universally true; there are some fast-rising positive emotions (I just talked about the dopamine hits earlier, so that's an exception), but overall, the fast-rising ones are more often the negative ones.

Now, these algorithms that are measuring you all the time, in order to adapt the customized feeds that you see and the designs of the ads that you see and just everything about your experience: they're watching you, watching you, watching you, in a zillion ways, expanding all the time. Now they're following your voice tone and trying to discern things about your emotions based on pure correlation, without necessarily much theory behind it. They're watching your emotions as you move; they're watching your eyes, your smile; and of course they're watching what you click on and what you type, all that. And the thing is, if you have an emotional response that's faster, the algorithms are going to pick up on it faster, because they're trying to get as much speed as possible; they're rather like high-frequency trading algorithms in that sense. We intrinsically, in Silicon Valley, try to make things that respond quickly and act quickly. And so if you have a system that's responding to the fast-rising emotions, you'll tend to catch more of the negative ones; you'll tend to catch more of the startle emotions.

Now here's the thing. If you look at the literature and ask the broad question, accepting this idea of binning emotions into positive and negative feedback emotions: as far as behavior change goes, is positive or negative more influential on human behavior? The answer you'll get is a really complex patchwork. Behaviorism has been around for a long time, so there are a lot of studies you can read; it's hard to know exactly how high quality all the research is, especially the older stuff. But in general you can find lots of examples of positive feedback working better than negative, or vice versa, and it's all very situational; a lot of it is very subtle, depending on how things are framed for people, all kinds of stuff. Overall, what I perceive from the literature is approximate parity between positive and negative. But if you ask which emotions the algorithms will pick up on when they're trying to get the fastest possible feedback, it's unquestionably true that the negative ones are faster. All right.
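A tiny simulation (under purely illustrative assumptions: two kinds of content with equal persuasive power but different reaction speeds, and a short engagement-measurement window) shows how a speed-hungry optimizer ends up favoring the startle emotions with no bias built in:

```python
import random

# Hypothetical content types with the response-time asymmetry described above:
# negative-arousal content gets reactions fast; positive content gets
# equally likely reactions, but slowly.
CONTENT = {
    "outrage":   {"p_react": 0.5, "latency": 2.0},    # seconds until reaction
    "uplifting": {"p_react": 0.5, "latency": 60.0},
}

def observed_value(kind, window=10.0, trials=100_000):
    """Engagement credited within a fixed measurement window.
    Reactions arriving after the window are never counted."""
    credited = 0
    for _ in range(trials):
        reacted = random.random() < CONTENT[kind]["p_react"]
        if reacted and CONTENT[kind]["latency"] <= window:
            credited += 1
    return credited / trials

random.seed(2)
for kind in CONTENT:
    print(kind, observed_value(kind))
# Both kinds are equally "influential" (p_react = 0.5), but only the fast,
# negative reactions land inside the optimizer's short horizon.
```

Nothing in the optimizer prefers negativity as such; the short measurement horizon does it.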
So what you see is the algorithm suddenly flagging: oh my god, I got a rise out of that person, let's do some more of that, because we're engaging that person. And that stuff tends to be the stuff that makes them angry, paranoid, revengeful, insecure, nervous, jealous, all these things. So what you see is a feedback cycle in which a certain kind of dysfunctional personality trait is brought out more and more, and people with similar dysfunctional personalities are introduced to each other by the systems.

So what does this personality look like, the addiction personality online? I'll name three people who have recently displayed it rather blatantly. One is the president, whom I'm just not going to bother to name because I'm sick of it. The second is Kanye. The third is Elon Musk. Three people all displaying somewhat overlapping, in my view, personality distortions. Now, I've had slight contact with two of the above three; I'll let you guess which two. Well, no, I'll say: one of them is Trump. I've met Trump a few times over a very long period of time. I've never known him well; I've never had a real conversation with him. But I will say that in the '80s and '90s he didn't seem like somebody who was desperate for you to like him. He didn't seem like somebody who was nervous about what you thought about him. He didn't seem like somebody who was itching for a fight. He didn't seem like somebody who was looking for trouble and thought it would help him. He really just didn't seem like that at all. I think he was still a con man, but he was kind of like a happy con man; it was like a different persona.
And remember how I said before that the gambling addict is addicted to the whole cycle, in which they lose a lot before they win? I think in the same way the Twitter addict is addicted to a cycle in which they bring a lot of wrath upon themselves and have to deal with a lot of negative feedback before they get positive feedback, or some mix; it's very much like the losing and winning in gambling. And so I think what's happened is that he's gotten himself into this state where he's like this really nervous narcissist, and this is kind of weird. This personality, the person who goes "they really like me, I think he likes me", this weird, nervous, narcissistic, insecure person, has not been a typical authoritarian personality in the past, and yet it's working now. And I suspect the reason why is that a lot of the followers who respond to it see themselves in that insecurity, which is really strange. I mean, if you think about it, in the past the celebrity figure or the leader typically wanted to display a personality that was kind of invulnerable, aloof, unneeding, self-sufficient, uncaring about whether they're liked or not. And yet that's not what's going on here; it's really strange.

And then there's this issue of lashing out. It's as if, because you know that a certain amount of punishment goes with the reward, you actually seek out some of the punishment, because that's actually a part of your addiction. If you're a gambling addict, you actually make some stupid bets; it's true, it's just what happens. So you have Elon Musk calling this guy who tried to rescue kids in a cave in Thailand a pedophile, out of nowhere. All right: same thing. Twitter addiction, dysfunctional personality. Kanye, I'm not even, you know... but basically you have people who are kind of degrading themselves and making themselves into fools, yet in a funny way, in the current environment, there's a whole world of addicted fans who actually relate to it, see themselves in it, and it works. It works for the first time in history, and it's really strange. It's a really weird moment.

Okay, so I started by talking about the problem of losing touch with reality.
Now, as you heard, I have a book called Ten Arguments for Deleting Your Social Media Accounts Right Now, and it goes through a lot of reasons to delete your social media, of which the closest to my heart is actually the final one, which is a spiritual one. It's about how I think Silicon Valley is kind of creating a new religion to replace old religions, and even atheism, with this new faith about AI and the superiority of tech and how we're creating the future and all this. And I feel that that religion is an inferior one, and people are being drawn into it through practice. So that tenth argument is the one I care most about, but what I want to focus on here is the existential argument, which is the loss of reality.

The problem we have here is that we've created so many addicts, so many people who are on edge, that they perceive, essentially, politics before they perceive nature. They perceive the world of human recriminations before they perceive actual physical reality. Now, I've presented a theory, it's in various of my books, called the pack switch, which I will recount to you now; it's a way of thinking about this. It goes like this. There are some species that are intrinsically social, like a lot of ants. There are some species that tend to be solitary, like a lot of octopuses, some of my favorite animals. And there are some species that can switch, that can be either solitary or social depending on circumstances.
A famous one that we refer to in mythology and in our storytelling is the wolf. You can have a wolf pack or you can have a lone wolf: same wolves, different social structures, different epistemology, I would say. When you're a lone wolf, you're responsible for your own survival. You have to pay attention to your environment. Where will you find water? Where will you find prey? How do you avoid being attacked? Where do you find shelter? How do you survive bad weather? You are attached to reality, like a scientist or like an artist. You are a naturalist. When you're in a wolf pack, different story. Now you have to worry about your peers; they're competing with you. You have to worry about those above you in the pack: will they trash you, or can you take their station? You have to piss on those below you, because you have to maintain your status. But you have to unify with all your fellow pack members to oppose those other packs over there, the other. So all of a sudden, social perception and politics has replaced naturalism. Politics versus naturalism: those are the epistemologies of the lone wolf and the wolf pack.

People are also variable in exactly this way. We can function as individuals, or we can function as members of a pack. Now, what happens is exactly what I am, at least, hypothesizing happens with wolves. (It's kind of interesting interacting with scientists who actually study wolves, because I haven't actually spent that much time with wolves, just a little bit. There are people who know a lot more about wolves, so let's just say my little portrayal is overly simplified; it's like a little cartoon, but I hope it functions to communicate.) When we are thinking as individuals, we have a chance to be naturalists; we have a chance to be scientists and artists. We have a chance to perceive reality uniquely, from our own unique perspective, a diverse perspective as compared to everyone else's, which we can then share. When we join into a pack mentality, we perceive politics.

So what happens on social media is that, because the algorithms are trying to get a rise out of you, to up your engagement and make you ripe for receiving behavior modification, you're constantly being pricked with little social anxieties, rage, irritations, all these little things, all these little status worries. Is my life as good as that person's life? Am I lonely relative to all these people? What do they think of me? Am I smart enough? Am I getting enough attention for this? Why didn't people care about the last thing I did online? Blah blah blah. It's not that any of these things by themselves is necessarily that serious, but cumulatively what they're doing is shifting your mindset, and suddenly you're thinking like a pack creature. The pack switch is set, and you're thinking politically, and when you think politically you lose naturalism.

You know, I think both modes of being have a place. I think if people exclusively, all the time, stayed in the lone setting, that would be bad for society, bad for relationships, bad for families, and so on. However, there needs to be a balance; there needs to be a healthy way of going back and forth between them, without getting lost in one or the other. And so the hypothesis I'd put forward is that we're giving people so many little anxiety-producing bits of feedback that we're getting them into this pack mentality, where they've become hyper-political without maybe even quite realizing it, and are losing touch with reality.

Now, when I say "losing touch with reality", that demands some evidence, because you might say: well, are we less in touch than in the past? Remember, at the start, after the music, I gave you what I consider to be sort of a positive framing and a lot of good news. Absolute poverty has been reduced, absolute levels of violence have been reduced, absolute levels of disease have been reduced, and so there are many ways in which we're bettering ourselves. But there's this other thing going on, which is bad enough that it might be the undoing of all of that, and that is this loss of reality.
Now, here's what I want to point out. I travel around a fair amount, and I've visited places that would appear on the surface to have very little in common. I'll mention some of them: Brazil, Sweden, Turkey, Hungary, the United States. What do they all have in common? What they have in common is the rise of what we sometimes characterize as right-wing populist politics. I don't think that's quite right. I think what we're actually seeing is the rise of cranky, paranoid, unreal politics; I think that's a better characterization. And it's really remarkable how it's all happened at about the same time, and it's happened in some poor parts of the world too. You could say, well, it's something about aging populations, all the cranky old people (you know, freshmen will tell me that, to get at me), but there are countries that are very young that have that problem: Turkey, Brazil. Or: it's diverse countries, we can't have democracies unless they're ethnically monolithic or something. Brazil's diverse. Oh, it's inequality. Or: the problem is that societies are just losing their social safety nets. Well, you know: Sweden, Germany? Not really; they might have anxiety about that, but it's not so. All these places are really different. They have different histories, and yet they've all had similar dysfunctions. So you have to ask what's in common between all of them, and you can say something vague: well, they all have anxiety about the future. That's true. But the obvious thing they have in common is that people have moved to this mode of connecting through manipulative systems that are designed for the benefit of third parties who hope to manipulate everybody sneakily. That seems like the clear thing they all have in common.

Brazil recently, I mean, all the same crap that we saw was happening on WhatsApp, which is the big connector down there. And Facebook, I think to their credit, tried to help a little bit, but they couldn't really do it, because the whole system is designed to be manipulative. You know, if you have a car that's designed to roll, it's very hard to say, well, we won't let it roll very much; whatever it does, it'll be rolling. If you have a manipulation system, and that's what it's designed for, you can try to get it to roll more slowly or something, but all it can really do is manipulate. That is what these things are optimized for; that's what they're built for; that's how they make money. Every penny of the many billions of dollars that some of these companies have taken in depends on this. (Of the big companies, the only ones really totally dependent are Google and Facebook, or almost totally dependent.) It all comes from people who believe they'll be able to sneakily influence somebody else by paying money via these places. That is what they do; there's just no other way to describe it.
And so the typical thing that happens is this. The algorithms: there isn't any information in them that comes from, like, angels or extraterrestrials; it all has to come from people. So people input some information, and often it's very positive at first. A lot of the starter information that goes into social networks ranges from extremely positive and constructive to just neutral, nothing much. So there might be people who are trying to better themselves; maybe they're trying to help each other with health information or something like that. Then the algorithms take all this information and say: we're going to forward some of it to this person and that person; we'll try it ten million times, and we'll see if we get a rise from anybody, because that ups their engagement. Now, the people who will be "engaged", quote-unquote, are the ones who dislike that information. So all of a sudden you're getting juice from finding exactly the horrible people who hate whatever the positive people started off with. And this is why you see this phenomenon over and over again, where whenever somebody finds a great way to use a social network, they have an initial success, and then it's echoed later on by horrible people getting even more mileage out of the same stuff. You start with an Arab Spring, and then you get ISIS getting even more mileage out of the same tools. You start with Black Lives Matter; you get these horrible racists, these horrible people who are just blackening America, getting even more mileage out of the same tools. It just keeps on happening. And by the way, you start with Me Too, and then you get incels and Proud Boys and whatever the next stupid thing's going to be. Because the algorithms are finding these people as a matter of course, introducing them to each other, and then putting them in feedback loops where they get more and more incited, without anybody planning it.
There's no evil person sitting in a cubicle intending this. Or at least I would be very surprised to find somebody like that; I know a lot of the people in the different places, and I just don't believe it. I believe that we backed ourselves into this weird corner and we're just not able to admit it, and so we're stuck in this stupid thing where we keep on doing this to ourselves.

So what you end up with is electorates that are driven... you have enough of a percentage of people who are driven to be a little cranky and paranoid and a little irritated. And they might have legitimate reasons; I'm not saying they're totally disconnected from real-life complaints. But their way of framing it is based on whatever the algorithms found could be forwarded to them that would irritate them the most, which is a totally different criterion than reality. So, whatever it is: in the case of the synagogue shooter it was one set of things, in the case of the pipe-bomber guy it was another thing, in the case of the guy who set up the... but it's all similar; it's all part of the same brew of stuff that algorithms forward. Now, in some cases the algorithms might have tweaked the messages a bit, because the algorithms can do things like play with fonts and colors and timing and all kinds of parameters, to see whether those have a slight effect on how much of a rise they can get. But typically the messages come from people who are also just trying to get as much impact as possible. And I think what's happened is that we've created a whole world of people who think it's honorable to be a terribly socially insecure nitwit who feels that the world is against them and is desperate to get attention in any way, and if they can get that attention, that's the ultimate good. The president acts that way; a lot of people act that way; that's what Musk was doing, and I could name many other figures. And I think these people become both the source of new data that furthers the cycle, and, of course, it drives them too.

So there are multiple levels of evil that result from this. The obvious one is these horrible people who make our world unsafe, who make our world violent and break our hearts and just keep on doing it over and over again, and there's just this sense that random people are self-radicalized, turning themselves into the most awful version of a human imaginable. But there aren't that many of them in absolute numbers, and as I said earlier, in terms of absolute amounts of violence there's actually an overall decrease in the world despite all this horrible stuff, with some notable exceptions, like with ISIS in the Middle East and so forth; but overall, actually, that's true. However, the second evil is the one that I think actually threatens our overall survival, and that is the one of gradually making it impossible to have a conversation about reality. It's really become impossible to have a conversation about climate. It's become impossible to have a conversation about health. It's become impossible to have a conversation about poverty. It's become impossible to have a conversation about refugees. It's become impossible to have a conversation about anything real. It's only become possible to have conversations about what the algorithms have found upsets people, and on the terms of the upsetting, because that's the only thing that's allowed to matter.
And that is terribly dark. That is terribly dark and terribly threatening. And the scenario I worry about... I mean, it's conceivable that some sort of repeat of what happened (it's hard for me to even say this), some sort of repeat of what happened in the late '30s in Germany, could come about here. I can imagine that scenario; I can imagine it vividly, because my own grandfather waited too late in Vienna, and my mother was taken as a child and survived the concentration camp. So I feel it very keenly, having a daughter myself. And yet I don't think that's the most likely bad scenario here. I think the more likely bad scenario is that we just put up with more and more shootings, more and more absolutely useless, horrible people becoming successful in one theater or another, whether politicians or company heads or entertainers or whatever. And gradually we don't address the climate. Gradually we don't address where we're going to get our fresh water from. Gradually we don't address where we're going to get new antibiotics from. Gradually we don't wonder how we're going to stop the spread of viruses; vaccine paranoia is another one of these stupid things that spreads through these channels. Gradually we see more and more young men everywhere turning themselves into the most jerky version of a young man, these various weenie supremacy movements under different names, from, you know, Gamergate to incels to Proud Boys to whatever this is going to be next, endlessly. And then gradually, one day, it's too late, and we haven't faced reality, and we no longer have agriculture, we no longer have our coastal cities, we no longer have a world that we can survive in. What I worry about is a terribly stupid, cranky undoing, not a big dramatic one. It's neither a whimper nor a bang, but just sort of a cranky rant that could be our end. Is that a laugh line? I don't know; you guys are pretty dark anyway.
So, what to do about it? Here my characterization of the problem overlaps strongly with a lot of other people's characterizations. Mine is perhaps not identical to the problem as described by many others, but there's enough overlap that I think we (we meaning the many people who hope to change this) have a shared sense of what's gone wrong. Now, the first thing I want to say, in terms of optimism, is: wow, is that better than things used to be. If I had been giving this talk even a few years ago, not long ago at all, I would have been giving it as a really radical fringe figure, saying things that almost nobody accepted, who had lost friends over these ideas, and who was really kind of surviving on the basis of my technical abilities in my past, rather than on what I was saying presently, because it was so unpopular. In the last while, especially since, I would say, Brexit and Trump, but also just in general, with studies showing the horrible increase in suicides among teen girls that scales with their social media use, all these horrible things that have come out, something's really different. In Silicon Valley there are genuinely substantial movements among the people in the companies to try to change their act. Regulators, at least in Europe, are starting to get teeth and really look at this seriously. The tech companies are trying to find a way to get out of the manipulation game; they haven't necessarily succeeded, and not all of them are trying, but some of them are.
And it's a different world; it's a world with a lot of people who are engaged. So now, having presented the problem as I see it, it's possible to talk about the solution. A lot of folks feel the solution should be privacy rights; the European regulators are really into that. We had a major conference on that in Brussels last week, where Tim Cook, who runs Apple, gave a fire-breathing talk that kind of sounded like a talk I might have given at some point in the past. I gave a talk there too, and I was like: wow, I'm not the radical anymore; it's very straight, in a way. In a way I kind of mourn the loss of radicalness, because some part of me likes being the person at the outer edge, and I'm not, and it's kind of like, oh god, I'm supposed to be the radical. But anyway, I think it's great that the Europeans are pushing for privacy. The theory behind that is that the harder it is for the manipulation machine to get at your data, the less it can manipulate, and the more maybe there's a chance for sanity. There's a peculiar race going on, because the societies in Europe that support regulation and have regulators with teeth (which we really don't have much of in the U.S. right now) are themselves under siege by the cranky political parties, the ones that are sometimes called right-wing populist but I think should just be called the crank parties. And the cranky parties might bring these societies down. So there's a race: can the regulators influence the technology in time to preserve themselves, or will the technology destroy their politics before they have a chance? That's a race going on right now; it's quite dramatic, and I wouldn't know how to handicap it.
Now, the privacy approach is hard, because these systems are complicated. Like, if I say: okay, click on this button to consent to using your data for this. I obviously can't even read those things. And even if there's some kind of better regulation supporting it, nobody understands it. Even the companies themselves don't understand their own data; they don't understand their own security. I mean, this whole thing is beyond all of us; nobody's really doing it that well. Everybody's having data breaches and discovering, suddenly, that they were using data they didn't think they were using; that's happened repeatedly at Google and Facebook in particular. So I've advocated a different approach, which is: instead of using regulators to talk about privacy, get lawyers and accountants to talk about the lost value from your data being taken. Now, I have several reasons for that. One is that I don't think we'll ever lose our accountants and our lawyers; I think they're more persistent than our regulators. That's one reason. And I'm not going to do lawyer jokes, because it's about the health of society; it's become so mean-spirited that I don't like to make jokes about classes of people, even lawyers, anymore. And some of my best friends really are... you know. Anyway, let me give you an example that I like to use to explain the economic approach here.
There's a tool online that I happen to use frequently, and that I really like, which is automatic translation between languages. If you want to look at a website in another language, or send something to somebody, you can go online, and there are at least two companies that do this pretty well now, Microsoft and Google. You enter your text in one language, and a usable translation comes out on the other side. Convenient, free: great, modernity. However, here's an interesting thing. It turns out that languages are alive. Every single day there's a whole world of public events. All of a sudden, today, I have to be able to talk about the Tree of Life shooter, and you have to know what I mean. All of a sudden, today, I have to be able to talk about the MAGA bomber, and you need to know what I mean. So every single day there are all of these new reference points that come out, lately often horrible ones, sometimes nicer ones, maybe a new music video or a new meme that people like, whatever. So every single day, those of us who help maintain such systems have to scrape (meaning steal) tens of millions of example phrase translations from people who don't know it's being done to them. There are tens of millions of people who are kind of tricked into somehow translating this phrase or that phrase, and Google and Microsoft have to grab these things and incorporate them to update their systems, to make them work. But at the same time, the people who are good at translating are losing their jobs.
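A toy sketch of the dynamic he's describing (a phrase-table stand-in for a real translation model; every name and string here is illustrative): the system only "knows" what some human has already translated, which is why it has to keep harvesting fresh human work every day.

```python
# Toy "model": it can only translate phrases people have already translated.
model = {"good morning": "buenos días"}

def translate(phrase):
    return model.get(phrase, "??? (no human example yet)")

print(translate("tree of life shooting"))   # new event -> the model fails

# Overnight: collect the day's new human translations (the part that is
# scraped from people "who don't know it's being done to them").
todays_pairs = {"tree of life shooting": "tiroteo de Tree of Life"}
model.update(todays_pairs)

print(translate("tree of life shooting"))   # now it "works", thanks to
                                            # uncredited human translators
```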
The career prospects for a typical language translator have been decimated, meaning they're a tenth of what they were, following exactly the pattern of other information-based work that's been destroyed by the everything-must-be-free movement: recording musicians, investigative journalists (crucially), photographers. All of these people are looking at about a tenth of the career prospects they used to have. That's not to say everything's bleak for all of them; there are examples in each case of a few people who find their way, and this gets to a very interesting technical discussion, which I won't go into: you get a Zipf curve, where there are a few successful people and then it falls off to nothing, whereas before you had a bell curve. If anybody wants to know more about that, I can explain. But anyway, you have a tiny number of successful people, and almost everybody has lost their careers.
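The Zipf-versus-bell-curve point can be made concrete with a quick simulation (illustrative figures only: the same total income, 10,000 earners in each world):

```python
import random

N = 10_000
TOTAL = 40_000 * N   # same total income in both worlds (hypothetical figure)

random.seed(3)
# Old world: a bell curve of middle-class incomes.
bell = [max(0.0, random.gauss(40_000, 10_000)) for _ in range(N)]

# New world: the same total income, Zipf-distributed by rank.
H = sum(1.0 / (r + 1) for r in range(N))             # harmonic normalizer
zipf = [TOTAL / ((rank + 1) * H) for rank in range(N)]

def summarize(name, incomes):
    ranked = sorted(incomes, reverse=True)
    top1_share = sum(ranked[:N // 100]) / sum(ranked)
    median = sorted(incomes)[N // 2]
    print(f"{name}: median ≈ {median:,.0f}, top-1% share ≈ {top1_share:.0%}")

summarize("bell curve", bell)   # median near 40,000; top 1% takes a few percent
summarize("zipf curve", zipf)   # median collapses; the head takes about half
```

Under the bell curve the median earner does fine; under Zipf, a tiny head thrives and the median collapses: a few successful people, then it falls to nothing.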
Now, wouldn't it make more sense if, instead of making money by providing free translations in order to get other people, called advertisers, to manipulate the people who need the translations in some sneaky way they don't understand, making the whole world more cranky and less reality-oriented... instead of that, what if we went to the people providing the phrase translations and just told them: you know, if you could give us the phrase translations we really need, our system would work better, and we'd pay you, because then we'd have a better system. And then what if we went to the people who need translations and said: free isn't really quite working, because that way we have to get these other people to manipulate you in order to have a customer; but we'll make it really cheap. What about a dime a translation, or something like that? We'd work out some kind of system where the people who provide the translations meet each other (because it's a network, we can introduce them); they form a union; they collectively bargain with us for a reasonable rate, so they can all live and put their kids through school; and then we get better, working translators. And yeah, you pay a dime; you can afford a dime. Everybody's happier.
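Here's a toy sketch of the payment flow being proposed (all numbers and names are hypothetical, and the genuinely hard part, attributing which human examples mattered to a given translation, is simply assumed):

```python
# A sketch of the data-dignity idea for translation (illustrative only):
# consumers pay a small per-query fee, and the fee is split between the
# platform and the people whose example translations informed the answer.
PRICE_PER_QUERY = 0.10          # "you pay a dime"
PLATFORM_CUT = 0.30             # a hypothetical collectively-bargained rate

balances = {}                   # contributor -> earnings

def settle(query_contributors):
    """Credit the contributors whose phrase pairs informed one translation."""
    pool = PRICE_PER_QUERY * (1 - PLATFORM_CUT)
    share = pool / len(query_contributors)
    for person in query_contributors:
        balances[person] = balances.get(person, 0.0) + share

# One day's queries; attribution is just given here.
settle(["alice", "bob"])
settle(["bob", "carol", "dave"])
print(balances)
# e.g. {'alice': 0.035, 'bob': 0.058..., 'carol': 0.023..., 'dave': 0.023...}
```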
Now, there are a few things about this that are really good, in my point of view. One: we no longer have these people off to the side paying to manipulate people; everything's become clear. Two: we have a whole class of people making a living, instead of needing to go on the dole, instead of saying, oh, we need this basic income because everybody is worthless. Three: we're being honest instead of lying, which is a really big deal. Right now we have to lie, because we're not telling people that we're taking their data; we're telling them, oh, you're buggy whips, you're worthless, but in secret we need you. That's a lie. And four: there's kind of a spiritual thing here, where we're telling people honestly that they're still needed. To tell people, oh, actually you're obsolete, the robots have taken over your job, when it's not true, when we still need their data: there's something very cruel about that. It cuts to some sort of issue of dignity and human worth, and it really bothers me. So for all these reasons, this seems like a better system to me. And sure, we'd have to make accommodations for those who can't afford whatever the rate would be for the language translation, but we can do that; we've almost figured out ways to do that, if we're a decent society. And we'd be a more decent society, because we wouldn't have an economy that's strictly run on making people into assholes.
So that's why I advocate the economic approach. I know it's bad form, but can I refer you to a paper to read? Go look up something called "A Blueprint for a Better Digital Society". (I'm sorry about the title; I didn't make it up. It's an editor's fault. Adi Ignatius, it's your fault, Adi.) It's in the Harvard Business Review, recently; you can find it online very easily. It's the latest version of how to make this thing work, and a little bit about how to transition to it. So that's the solution I've been exploring and promoting, and I think there's room for a lot of solutions. Another idea: people like the Center for Humane Technology, which is Tristan Harris, and another group called Common Sense Media, are trying to educate individuals about how to be more aware of how they are manipulated, and how to make slight adjustments so as to be manipulated a little less. Worth trying. But remember, it's a sneaky machine; the whole industry is based on fooling you, so staying ahead of it is going to be work. You can't just do it once and think you're done; it'll be a lifetime effort. That's why I think you should just quit the things.
When I say, can you please delete all your social media accounts, surely one of the first thoughts in all your minds is: well, that's ridiculous, you're not going to get billions of people to suddenly drop these things. There are two reasons why you're correct, if you have that thought. One is that you're addicted. This is an actual addiction. You can't just go to somebody with a gambling addiction and say, oh, just stop, any more than you can do that if they have a heroin addiction; that's not how addiction works. You can't just say no; it's hard. Addiction is hard. All of us have addictions; none of us is perfect. But this particular one is destroying our future. It's really bad, and it's not just personal: we hurt each other with this one, in an exceptional way. The other reason is the network effect. Everybody already has all their pictures and all their past and all their stuff on these properties that belong to companies like Facebook, and for everybody to get off it all at once, in a way that lets them continue to have connections with each other, is a coordination problem that's essentially impossible at scale. So that's a network-effect problem. So why am I asking people to do something that can only happen a little? Because even if it only happens a little, it's incredibly important. Let me draw a metaphor to some things that have happened in the past.
We have, in the past, had mass addictions that were tied to corrupt commercial motives at a large scale. One example is the cigarette industry. Another example is big alcohol, alcoholic beverages. I could mention others; lead paint is one I bring up in the book. Now, in these cases... well, actually, lead paint wasn't an addiction thing, so I'll leave out lead paint. Let's just talk about cigarettes. In the case of cigarettes: when I was growing up, it was almost impossible to challenge cigarettes. You know, cigarettes were manly; they were cool. If you were on the red side of America, they were the cowboy thing; if you were on the blue side, they were the cool beatnik thing. Everybody had a cigarette, and you just couldn't be cool without your cigarette. But enough people finally realized that they could get out from under it that, at the least, it allowed a conversation. The addict will defend it: if you talk to somebody who's really addicted to cigarettes, it's very hard for them to get a clear view of what the cigarette means to society, what it means to have cigarettes in public spaces. There was a time when this room would have been filled with cigarette smoke, and we would have been gradually killing the students who were attending. (I think I'm coughing in sympathy with remembering what that was like, because it was really horrible.)

Alcohol: Mothers Against Drunk Driving (or Drunk Drivers, I forget which) has been one of the most effective political organizations. They changed laws, they changed awareness, they changed outcomes, and they saved an enormous number of lives, despite the fact that, once again, alcohol is cool. It's supposed to be cool to drink at a frat party; it's supposed to be cool to drink at your fancy restaurant. Everybody loves drinking, and there's this whole world of advertising liquor. We found a reasonable compromise in both cases. We don't throw people who drink or smoke cigarettes in jail, like we've done for marijuana for years; instead we came up with a reasonable policy: don't do it in public, don't do it behind the wheel. It worked. And that was only possible because we had enough people who were outside of the addiction system to have a conversation.
In this case, we don't have that. In this case, all the journalists who should be helping us are addicted to Twitter and making fools of themselves. If you're a journalist in this room, you know I'm telling the truth. The same for politicians; the same for public figures and celebrities who might be helpful. We need to create just a space to have a conversation outside of the addiction system. Now, you might be thinking: oh my god, I'll destroy my life if I'm not on these things. I don't think that's true. I think if you actually drop these things, you suddenly discover you can have any life you want. I'm not claiming I'm the most successful writer or public speaker, but I'm pretty successful. I have best-selling books; I get around; you know, you hired me to come talk to you. And I've never had an account on any of these things. And you could say, oh, but you're an exception. Well, how much of an exception can I be? If anything, the points are against me: I'm this weirdo, and I still can do it, seriously. If I can do it, probably other people can do it too. I think there's this illusion that your whole life will just be erased if you're not on these things, but that illusion is exactly part of the problem. That's exactly part of this weird, existential, insecure need for attention at any cost, this bizarre personality dysfunction that's destroying us. Just give it a rest.
Now, here's what I would say. There was a time when, if you were young especially, one of the priorities you felt in your life was to know yourself. And the only way to know yourself is to test yourself, and the way you test yourself is, maybe, you'd go trekking in the Himalayas or something. I used to hitchhike into central Mexico when I was really young, just a really young teenager, and that's how I tested myself. These days, I think the similar idea would be quitting your social media, and really deleting it. Like, you can't quit Facebook and keep Instagram; you have to actually delete the whole thing. And it doesn't mean you're doing it for your whole life. Delete everything, and then stay off the stuff for six months, okay? If you're young, you can afford it; it will not kill you. And after six months you will have learned, and then you make a decision. In my opinion, you should not harm your life for the sake of the ideas I've talked about today. If it's really true that your career will be better, or whatever, through using these things, then you need to follow your truth and do what makes you succeed. And if it's really true that being a serf of some stupid Silicon Valley giant is the thing that helps your career, okay. But you have to be the one making that decision, and if you haven't tested yourself, you don't have standing to even know. So I'm not telling you what's right for you, but I demand that you discover what's right for you. That, I think, is a fair demand, given the stakes. And with that cheerful closing, I will call it.
[Applause]

[Music]
So, we have... is the mic on? Do we have the questions? Well, we were going to have cards; I don't know if any cards have made their way up. Here's a card. Cards, okay. I'm actually unclear on how this whole thing works. Okay, well, this is it. All right, normally I would get a bunch of cards, but that hasn't happened yet. Okay: "Lately I've noticed that I was getting progressively more cranky" (that's now a technical term, I think you've introduced it, along with virtual reality) "cranky from a lack of sleep, because of the excessive blue light given off by screens. Have you factored this effect into your theory?"
Oh yeah. Well, there's the timing and stuff like that; there's more. As soon as I put blue filters on my screens, I got a lot less cranky, okay? The problem of blue light keeping you up: those are all real problems, and in fact you might want to just turn color off on your computer, and definitely turn color off on your phone. In all seriousness, you don't need it for most things. I use a phone, but I definitely turn color off, and I make those changes if I notice something like that. You can turn off the blue light on your computer; you can go into a setting. And the best way to do it is to go to the visual accessibility settings, because they have these high-contrast settings for people who have trouble focusing, and they get rid of color, so you can have stark contrast, as an example. (Oh, for God's sakes, I'd have to enter my passcode. I was going to show you what it looks like, but I'm not going to bother.) Anyway, you can do it; every major platform has this ability, and you should do it. Go to Common Sense Media's site or to the Center for Humane Technology's website; both of them have advice on how to do things like this. And another thing: I'm pretty sure both Windows and Mac, if it's a computer, have ways to make the blue light go away as the evening approaches. This kind of stuff is real, and you should pay attention to it, and the technology should serve you and not drive you crazy.
94:36
but I do have to say this is not an
94:38
existential threat this is this is at
94:41
the level of too much sugar and
94:42
breakfast cereals or something like that
94:43
it is actually a real issue it’s it’s it
94:46
does have an effect on the health of the
94:48
population but it’s not going to destroy
94:49
us this other stuff I’m talking about is
94:51
at another level okay
94:53
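He's right that every major platform has this ability, and on a Linux desktop the same adjustment can even be scripted. Here is a minimal sketch, assuming the open-source redshift utility is installed (macOS's Night Shift and Windows's Night Light expose the equivalent through system settings rather than code):

    import subprocess

    # One-shot: warm the display to a 3500 K color temperature using
    # redshift's manual mode. (Assumes a Linux desktop with redshift
    # installed; this is an illustration, not anything Lanier showed.)
    subprocess.run(["redshift", "-O", "3500"], check=True)

    # Later, reset the display to its normal color temperature.
    subprocess.run(["redshift", "-x"], check=True)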
So rather than the cards... do we have cards? We do? Okay, great, let's have some cards. Is that your card? Oh, okay, great. Okay, here's my card: "Isn't there a design problem for publishing online if you know who's pointing at you? How is that related to the problem Alan Turing faced?" (It says "turning.") "He hatched the concept of a machine-like personality. Isn't that to software what listening and compassion are to human communication?"

Yeah, it's a kind of interesting question to me. When I read Turing's final notes, the Turing test comes up twice: it comes up in a little monograph he wrote, and it comes up in a sort of little note. There are two statements of it, and in both of them, to me, reading them, there's just this profound sadness. I feel like this is a person who's just screaming out. Some of you might know this; there's a whole history to the Turing test. He created a metaphor. Oh boy, let me try to do this as fast as I can.

Turing did as much as anybody to defeat the Nazis in World War Two, by using one of the first computers that ever existed to break a Nazi secret code called Enigma, and he was considered a great war hero. However, he lived an identity that was illegal at that time, which is that he was gay, and he was forced by the British government after the war to accept a bizarre quack treatment for being gay, which was to overdose on female sex hormones, on the bizarre theory that female hormones would balance out his oversexedness, which was supposed to be the gayness. It's so stupid it's hard to even repeat. He started developing female physiological characteristics as a result of that treatment, and he committed suicide, in a sort of weird, poetic gesture, by lacing an apple with cyanide and eating it next to the first computer, a sort of anti-Eve or something. He was a very brilliant and poetic man, and in the final couple of weeks of his life he came up with this idea of repurposing an old Victorian parlor game. It used to be this thing where we'd have a man and a woman behind a curtain or a screen of some kind, and all they could do was pass little messages to a judge, and the judge would have to tell who's the man and who's the woman, and each of them might be trying to fool the judge, which is kind of weird if you think about it; the Victorians were pretty kinky and bizarre. And so what you're doing, as with behaviorism and as with the internet, is slicing away all of these factors and turning it into this limited stream of information, so it's kind of like tweeting or something. What Turing said is: what if you got rid of the woman and you had a man and a computer, and the judge couldn't tell them apart? Wouldn't you then finally have to admit that the computer should be given rights and given stature and be treated accordingly?

When you read it, the way I read it is that it's this person saying: oh my god, I figured out how to save the world from these people who wanted to destroy everybody for being of the wrong identity, these people who wanted to kill not only gays but of course Jews and Gypsies and black people, these horrible people; I came up with this way of defeating them, and now you're destroying me for who I am. I feel like there's this kind of astonishing sadness in it. And that was the birth of the idea of artificial intelligence, and I feel like the way it's remembered is completely unlike what it's like to read the original. Have you ever read the original Turing? Because if you read the original Turing, it's intense. Here's this person who's being tortured to death. It's not some kind of nerdy thing at all; it's difficult to read the documents. I think he knew he was about to die, and I think he was reaching out for some sort of fantasy of: what would it take for people to not be cruel? What would it take? And I think in this very dark moment he thought: maybe give up humanity entirely; maybe if we're just machines, we won't do this to ourselves. And the thing about that, of course, is that we have turned ourselves sort of into machines, because we've all been acting like machines to be able to use this stuff. You're all sitting there all day entering your little codes to get online; you're sort of turning into machines in practice. And yet we've just become more and more cruel. That's the ultimate irony: it didn't help. So that's my take on it. And this idea that AI could be some form of compassion? I think it's really just a way of stealing data from people who should be paid, to translate. AI is theft, to paraphrase.
Anyway. Look, all we can ever do with computers... to be a good technologist, you have to believe that people are sort of mystically better than machines; otherwise you end up with gobbledygook and nonsense, because you can't design for machines. So AI has to be understood as a channel for taking data from one person to help another. I take the data from the translators, and I apply it through a machine-learning scheme, or some kind of scheme, and I can get translations that help people in a better way than I could without that scheme in between, which is wonderful. So it's technology to help people connect in a way that's more helpful. If you understand AI that way, you elevate people, and you don't confuse yourself.
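What Lanier is describing is quite literal. A modern machine-translation model is distilled from millions of sentence pairs written by human translators, and the "scheme in between" can be a few lines of code. Here is a minimal sketch, assuming the open-source Hugging Face transformers library (my illustration, not Lanier's):

    from transformers import pipeline

    # The model behind this pipeline was fit to sentence pairs produced by
    # human translators. In Lanier's framing, it is a channel carrying their
    # work to a new reader, not a free-standing intelligence.
    translator = pipeline("translation_en_to_fr")

    result = translator("Technology should serve you, not drive you crazy.")
    print(result[0]["translation_text"])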
Okay. We don't have a lot of time and we have a lot of great questions, so some questions are not going to be able to be answered now, although I want to mention that there will be a book signing and book purchasing outside after the event is over; there will be two tables.

Please, if you want to ask me a long question: you can't come up to me and ask some open-ended giant question while I'm signing your book. You really can't do that. Buy a book! The people who are selling the books have asked that you buy a book first, before you have it signed.

And on that note, I'll segue into a couple of questions that are connected to this. How about your market solution? Arguably the mess we're in now comes from the monopolistic and manipulative tendencies inherent in markets. Given that the world has never known pure markets, what would keep this one pure?

Oh, it's not going to be pure. It's going to be annoying and unfair and horrible, but the thing about it is, it won't be existentially horrible. What I believe about economic philosophies is that there's never been one that's worked out in practice. Instead, just as with moral philosophies, and theories of how we learn, and many other areas where we're trying to deal with very complex systems, it's not so much that we can seek the perfect answer; we have to trade off between partial answers. So to me there's never been a pure market; there never has been, and I don't think there ever could be. But I think what you can do is get a balance. This was the Keynesian approach to economics, which I think is very wise: you get a balance between reasonable oversight and a reasonably unfettered market, and they'll go through cycles where the market will need help, and you trade off. I think that's the only path we have. I think being an ideologue about any solution to a highly complicated problem is always wrong.
Okay, just two more then, and there's a couple like this as well. What about the connective force of social media, e.g., for the feminist movement, like #MeToo? These online communities raise awareness and create supportive communities, and many people rely on social media for community because of the demands of capitalist jobs.

Yeah, that's all true, except that it backfires, and the backfire is worse than the original. It just keeps on happening. Before #MeToo, there was a problem of diversity in the gaming world, and a few women in gaming just wanted to be able to say one or two things and not be totally invisible, and the result of that was this ferocious thing called Gamergate, this total never-shut-up, wipe-everybody-else-out, make-everything-horrible movement. And then #MeToo has spawned this other thing that's still rising, which is the incels and the Proud Boys and all this stuff. The problem is that in these open systems, at first, your experience of finding mutual support and creating social change is authentic; it's real. It's just that there's this machine behind the scenes you're not thinking about, which is using the fuel you're providing, in the form of data, to irritate these other people, because it gets even more of a rise out of them, and you're creating this other thing that's even more powerful, and horrible, even though it wasn't your intent. That's the thing that keeps on happening over and over again. It doesn't invalidate the validity of the good stuff that happens first; it's just that it always backfires (well, not always, but typically), and you end up being slammed. One of the things that's really bad about it is that it seems like it's just the fault of the creeps who show up, where it's actually kind of more the fault of the algorithms that introduced the creeps to each other and then got them excited, in this endless cycle of using your good intentions to irritate the worst people. Black Lives Matter was great; I think it's wonderful. And yet the reaction to it was horrible, and of a higher magnitude. Unfortunately, I just think that until we can get rid of the advertising model and the giant manipulation machine, every time you use the big platforms for any kind of positive social effect, it'll backfire and destroy you. It's a fool's game: even though it's valid at first, in the long term it's a fool's game. I don't like saying that; I hate saying that; it breaks my heart.
This is the last question, and it's existential. I'll combine the two questions here: "Seeing how pernicious social media has become by being hijacked toward BUMMER..." And BUMMER, you know, is another technical term of yours. [BUMMER is the book's acronym for "Behaviors of Users Modified, and Made into an Empire for Rent."]

Yeah. There's a wonderful writer on cyber things, Sherry Turkle, and she read my book and said, "Oh, I love this book, but there's just too much tush in it," because there's BUMMER, and there's a cat's behind on the cover, and stuff. The problem is I married a woman who likes butt jokes, and I just can't... some of them just come out; I don't know. Anyway.

Okay. So, how do we guard against an immersive technology like virtual reality becoming even more insidiously BUMMER? And then: how do you know what is real?
Oh, well, all right, those are small questions. So, the first one. Virtual reality could be super hyper creepy. I wrote a book about virtual reality that we haven't mentioned; it's called Dawn of the New Everything. I don't know if they'll have it up front or not, but I talked a lot about that issue there. So virtual reality could potentially be creepy. I think the way to tell whether something's getting creepy is whether there's a business model for creepiness. If the way it's making money is that there's somebody off to the side who thinks they can sneakily alter you or manipulate you, that's the creepy engine. If there isn't that person, and there isn't that business going on, it's less likely to be creepy. That's actually a pretty simple question to answer: I think it boils down to incentives. I think incentives run the world as much as, or more than, anything else.
As far as the question of how you know what's real: the answer is imperfect. What you do is you struggle for it. You struggle to do scientific experiments, to publish. You have to always recognize that you can fool yourself; you have to recognize that whole communities of people can fool themselves. And you just struggle and struggle and struggle, and you gradually start to form a little island in a sea of mystery, in which you never have total confidence, but you start to have a little confidence. So there are some things we can be confident of now: the earth is round, not flat. But do we know it in an absolute, absolute sense? No. You can never know reality absolutely, but you can know it pretty well. So in order to talk about reality, you have to get used to near-perfection that is never actual perfection, and if you're not comfortable with that concept, you have no hope of getting to reality, because that's the nature of reality. Reality is not something you ever know absolutely. In fact, just to be clear, in one of my books I defined reality as the thing that can never be measured exactly, the thing that can never be simulated accurately, the thing that can never be described to perfection. That is reality. A simulation, by contrast, can be described to perfection: I can describe a video game world or a virtual world to you to perfection; I can't do that with reality. The thing is, though, that we can't demand absolute knowledge in order to have any knowledge at all, or else we make ourselves into genuine fools. We have to be able to accept that we can have better knowledge and worse knowledge; it's all an incremental, eternal improvement project. So the people who demand absolute proof of climate change are fools, but they're interesting. Some of you might have read the good history published this week about the reading wars, about how we learn to read: there's a community of people who've been absolutely unable to accept a load of scientific evidence about how to teach kids to read effectively, because of an ideology, and they're sincere. It's really, really hard. Accepting reality as your life's work is really, really, really hard. It doesn't necessarily come naturally; it's a discipline. Thank you.

All right!

[Applause]

Laura DeNardis, “The Internet in Everything”

Once primarily a communication network, the Internet can now link a variety of physical devices and everyday objects, from cars and appliances to crucial medical equipment. Known as “the Internet of Things,” this system blurs distinctions between the virtual and the real, and, as DeNardis argues in this groundbreaking study, confers something tantamount to political power on whoever controls it. Showing how countries can use this cyber infrastructure to reach across physical boundaries, DeNardis, a professor in American University’s School of Communication and one of Slate’s 2016 Most Influential People in the Internet, lays out the threats it poses and offers policy prescriptions to protect our future. DeNardis is in conversation with Shane Harris, national security reporter for The Washington Post.


I thought I'd say... okay, I saw Senator Pressler getting up. Did you have a question, Senator?

Thank you for a very wonderful book. A lot of this, in local politics, is much more serious. For example, someone is running for the school board in a little town in Missouri, or in South Dakota, or someplace, and something appears on the internet, or Facebook, or especially YouTube, that's a half-truth about them, and then it's magnified, and there's no local media to go to to get a correction. There's no way to get it taken down, and nobody pays much attention to it except the people who are targeted with it. What can we do about getting people to run for the school board, or to run for county commissioner, in this atmosphere?
Well, first let me say thank you for authoring the Telecom Act of 1996. [Applause] That was really major for the way the Internet has unfolded, you know, intermediary liability and things like that. But it's really interesting that you are asking a question that is local: how do we get people interested in running for school boards around this particular issue? I think the way this is happening is in the most personal realms. I'll answer it this way: people ask me, what are the potential harms of the Internet of Things, especially around young people? What is the big deal? Well, you have probably heard of the hacker who screamed at a baby through a baby monitor, and the parents were horrified to come in and find that someone was screaming at and monitoring their baby. Or the discriminatory practices around insurance, around employment, around racial issues, in how data from the Internet of Things is used. These personal things that happen, even though no catastrophic event has yet happened, are, I think, with the help of the media exposing them, what will motivate people. I feel that just the educational awareness, not only of the future risks but of the situation we find ourselves in now, will motivate people to get involved, and starting local is definitely part of that. But whether it's local, whether it's a big sweeping U.S. measure like the Telecom Act of 1996, whether it's action in transnational organizations like the Internet Corporation for Assigned Names and Numbers or in standard-setting organizations, or whether it's at the UN level, I think the big takeaway is that we have to, as a society, view cybersecurity as the great human rights issue of our time.

And frankly, a big part of educating people is writing books, and everyone here is enlightened by this conversation with you. We're grateful; you're doing a public service by explaining these important things to people in a way they can understand, and making these issues not so intimidating and overwhelming. I think everyone can agree why Laura is so highly regarded as someone who is shaping the Internet, and I think probably for good. So please, let's give her a round of applause. [Applause] Thank you all for being such a great audience. Laura will be signing books if you'd like to talk to her more. Thanks for coming.


The Internet’s Mid-Life Crisis: The Agenda with Steve Paikin

In the 1990s the internet was thought of as democratic and anarchic. Then came social media giants such as Facebook, Twitter, and Instagram and political movements such as the Arab Spring, while Amazon and Google galvanized the attention of millions of users. The Agenda looks at the internet's original promise, its milestones, and the future of the hyper-connected world.