Democratic ad makers think they’ve discovered Trump’s soft spot

After more than a year of polling, focus groups and message testing against the president, there’s a growing consensus about what damages Trump — and what doesn’t.

Donald Trump wasn’t halfway through his speech in Tulsa, Okla., and Democratic ad makers in Washington and New York were already cutting footage for an air raid on the slumping president.

They didn’t focus on the president’s curious monologue about his difficulties descending a ramp or drinking water at West Point, the small crowd size of the Tulsa event or even his use of the racist term “kung flu.” Instead, the ads zeroed in on Trump’s admission that he urged officials to “slow the [coronavirus] testing down.”

It’s a reflection of a growing consensus among Democrats about what kind of hits on Trump are most likely to persuade swing voters, and which ones won’t. As in 2016, ad makers are focusing on Trump’s character. But unlike four years ago, they are no longer focusing on his character in isolation — rather they are pouring tens of millions of dollars into ads yoking his behavior to substantive policy issues surrounding the coronavirus, the economy and the civil unrest since the death of George Floyd.

“You can’t chase the Trump clown car,” said Bradley Beychok, president of the progressive group American Bridge. “Him drinking water and throwing a glass is goofy and may make for a good meme, but it doesn’t matter in the scheme of things … What people care about is this outbreak.”

Until recently, it wasn’t entirely clear what, if anything, worked against Trump. From the moment he announced his presidential campaign five years ago, not even the most incendiary material seemed to cause significant damage. Not calling Mexican immigrants “rapists,” not “blood coming out of her wherever,” not “grab them by the p—y” — all of which were featured by Democrats in character-based ads attacking Trump.

By Election Day, most voters didn’t find Trump honest or trustworthy, according to exit polls. But they voted for him anyway. And throughout much of his first term, including his impeachment, Democrats struggled to find an anti-Trump message that gained traction.

In their preparations for 2020, outside Democratic groups spent more than a year surveying voters in swing states by phone and online. They convened in-person focus groups and enlisted voters in swing states to keep diaries of their media consumption.

Multiple outside groups said they began to test their ads more rigorously than in 2016, using online panels to determine how likely an ad was to either change a viewer’s impression of Trump or to change how he or she planned to vote. Priorities USA, a major Democratic super PAC, alone expects to test more than 500 ads this cycle. Priorities, American Bridge and other outside groups, including organized labor, have been meeting regularly to share internal research and media plans.

“One thing we saw in polling a lot before the coronavirus outbreak is that people didn’t think he was a strong leader or a good leader, they complained about his Twitter,” said Nick Ahamed, analytics director at Priorities USA. “But they had a hard time connecting those character flaws they saw in him with their day-to-day experience.”

Trump’s response to the coronavirus pandemic and recent protests, he said, “really made concrete for people the ways in which his leadership has direct consequences on them and their loved ones … It’s easier to make ads that talk about his leadership than before the outbreak.”

The advertising elements that appear to work, according to interviews with more than a dozen Democrats involved in message research, vary from ad to ad. Using Trump’s own words against him often tests well, as do charts and other graphics, which serve to highlight Trump’s distaste for science. Voters who swung from President Barack Obama to Trump in 2016 — and who regret it — are good messengers. And so is Joe Biden, whose voice is widely considered preferable to that of a professional narrator. Not only does he convey empathy, according to Democrats inside and outside Biden’s campaign, but using Biden’s voice “helps people think about him as president,” said Patrick Bonsignore, Biden’s director of paid media.

But the ad makers’ overarching takeaway from their research was this: While Trump may not be vulnerable on issues of character alone, as he demonstrated in 2016, he is vulnerable when character is tied to his policy record on the economy and health care.

“What we’ve learned from a lot of previous experience … is that quite honestly, people who work in politics can be bad prognosticators in terms of which ad will work,” said Patrick McHugh, Priorities’ executive director. “You see a lot of times the videos that go viral on Twitter … you test those ads, and more often than not they backlash … they can move voters toward Trump.”

For the negative ad industry, the coronavirus has been a bonanza because it inextricably linked the economy and health care. On the evening of his Tulsa rally, American Bridge, which had already been working on an ad pummeling Trump for his response to the coronavirus, bookended its material with Trump’s acknowledgment that he urged officials to “slow the testing down.”

Biden’s campaign rushed a video onto social media skewering Trump for the admission. And Priorities USA, the Biden campaign’s preferred big-money vehicle, was on TV within days with Trump’s testing remarks in the swing states of Wisconsin, Pennsylvania, Arizona and Michigan.

Trump complained on Twitter that “the Democrats are doing totally false advertising.” But after the Democratic National Committee posted its first TV ads since 2016 — one asserting that Trump had “brought America down with him” and the other a more focused critique of his handling of China and trade — even the president acknowledged the effectiveness of the assault.

“On the campaign they’ll say such horrible things about me. It’s a very unfair business,” he said on Fox News. “But the ad [Democrats] did this morning, it’s a great ad for them.”

In one obvious way, assailing Trump is less complicated for Democrats than it was four years ago. Trump is the incumbent now, and for the first time he has a record of governance. Pointing out historic economic and public health crises in ads is not rocket science.

Trump’s approval ratings, both overall and on his handling of the coronavirus, have tracked downward since March, when outside Democratic groups began running advertisements against him on the issue. A Reuters/Ipsos poll last week put public approval for his response to the coronavirus pandemic at 37 percent, the lowest mark on record.

“There are more voters on the table now than there have been in a long time,” Becca Siegel, Biden’s chief analytics officer, told POLITICO. “Many, many voters who are persuadable and open to hearing these messages.”

And Trump keeps providing fodder. As outside groups began running ads featuring Trump’s “slow the testing down” remark last week, one Democratic strategist said, “Everybody is going to put this into their ads. This is something people are going to see on their TVs … for the rest of the cycle.”

For Biden, it is difficult to argue anything isn’t working at the moment. He is flattening Trump in national polls and running ahead of him in most swing states.

Yet voters still know less about Biden than Trump, according to internal polling from both parties, and there is an undercurrent of tension within the Democratic Party about how much effort to spend attacking Trump versus building Biden up.

In a study based on data from tens of thousands of survey participants — and cited frequently by Democrats — researchers at the University of California, Berkeley and Yale University found earlier this month that messages about the lesser-known candidate, Biden, were more effective at persuading voters than messages about Trump.

Echoing the study’s findings, David Doak, a retired longtime Democratic strategist and ad maker, said that while “the race is being decided right now by the negativity towards Trump … what I would do if I were the Biden [campaign] is to try and fill in that favorability, to strengthen what he’s getting there and move his favorability rating up.”

Jimmy Siegel, an ad maker who worked on Hillary Clinton’s 2008 campaign and for Michael Bloomberg this cycle, said, “You need more positive Biden stuff” — what another strategist called “more Biden cowbell.”

“I think Democrats have had a theory of the case against President Trump for a while, but it really hasn’t been until the last few months when it started finally getting traction,” said Mark Putnam, the famed Democratic ad maker who worked for Obama and also for Biden before parting ways with the campaign last year. “He almost seemed to have some kind of anti-gravity secret that allowed him to consistently screw things up and yet never pay a political price for it. And with just the way he’s handled one crisis after another in really the worst possible way, it’s finally sinking in.”

However, Putnam said, “That’s only half the battle … We also have to offer an alternative.”

Unite the Country, the super PAC that Putnam is working with, has released several TV and digital ads highlighting Biden’s biography and record on the economy, including a spot featuring Biden’s childhood home in Scranton, Pa. — complete with the bed Biden slept in as a child that Putnam’s team found stored in the attic when they arrived.

And Biden’s campaign itself began working this month to define the former vice president — and Trump — for a general election audience, releasing two ads as part of a $15 million buy, his first major advertising offensive of the general election campaign.

Just as the outside Democratic groups did, Biden’s campaign tested those ads with online panels, finding versions that used Biden’s own voice performed “dramatically stronger” than those using a professional narrator, the Biden campaign’s Bonsignore said.

In one ad, Biden talks about the economy, offering only an implicit contrast with Trump.

But Biden’s other ad cuts a much sharper contrast — staying with Democrats’ relentless criticism of the incumbent. It includes footage of Trump posing with a Bible outside St. John’s Episcopal Church near the White House after officials forced protesters from the area, as well as an image of Trump’s “both sides” reaction to the deadly violence at a white supremacist rally in Charlottesville, Va. — an episode that has gained new resonance amid the racial unrest surrounding Floyd’s killing.

The ad recalled Hillary Clinton’s first ad of the 2016 general election, when Clinton used footage of Trump encouraging violence at a campaign rally and mocking a reporter’s disability to make a call for unity.

But there was one significant difference from the 2016 attack on Trump. Four years ago, said Tad Devine, who was a senior strategist to Bernie Sanders’ 2016 presidential campaign, issues of character proved irrelevant in general election advertising “because people weren’t voting on it” — there was no connection to draw between Trump’s character and a record of governance that did not yet exist.

This year, he said, “That is absolutely the weakest front for Trump … Things have changed so dramatically, and the connection between the character of the president and that president’s ability to protect people, whether it’s from economic collapse or pandemic, is really important.”

The contrast works, Devine said, because “people are so desperate to turn the page from what’s happening in America today.”

Jaron Lanier: How the Internet Failed and How to Recreate It

Transcript

[Music]

Welcome, everybody. I’m Nathaniel Deutsch, the director of the Humanities Institute here at UC Santa Cruz, and I want to welcome everyone here to see Jaron Lanier; he’ll be talking tonight. I also want to welcome everybody who is watching this on the live stream that we have running at the same time. We’re very thankful for the support from the Peggy Downes Baskin humanities endowment for interdisciplinary ethics for supporting this lecture, and we’re also very thankful to the Andrew W. Mellon Foundation for supporting the year-long series of events that we’ll be hosting at the Humanities Institute on data and democracy; Jaron Lanier’s talk is going to be launching that series. In addition to the event tonight, we will be hosting in the coming year a series of other events, including Questions That Matter at the Kuumbwa Jazz Center on January 29th, which we invite all of you to; I know some of you have been to some of our past Questions That Matter events. And also an event that we have been planning for a while, actually, and that has become even more necessary because of the events of recent days: a conversation on anti-Semitism and the internet, which we will be hosting on a date to be announced.

I want to give thanks; there are many people I could thank, but I’ll leave out the names (I’ve already cleared it with them) and just thank their units: the Humanities Institute staff, which is, as always, amazing, and the staff of the humanities division’s development office, which is also always amazing. So thank you, everyone, for all the work that went into this event.

Tonight’s program will include a lecture, followed by (again, I think) some music, followed by a question-and-answer session and a book signing, and I’ll be talking a little bit more about the book signing later. Questions and answers will be facilitated by note cards, and we have some ushers who are moving around the room. If you would like to ask a question, please raise your hand now and they will hand you cards; you can write out the question, they’ll pick it up and give it to me, and I will be facilitating the question and answer that way.

I’ve had the pleasure of spending the day with Jaron, and I can tell you that he is a fascinating person, and a very generous person as well with his time: he met with some students earlier today and had a conversation with them, which was wonderful for him to do, and tonight he will be giving a lecture. We’re lucky to have him here. He’s a pathbreaking computer scientist and a virtual reality pioneer (if I’m not mistaken, you coined the phrase “virtual reality”). He’s a composer and artist, and an author who writes on numerous topics, including technology, the social impact of technology, the philosophy of consciousness and information, internet politics, and the future of humanism. One of the things that we believe in so strongly at the Humanities Institute is that conversations about technology cannot simply be left to computer scientists (no offense to any computer scientists in the room; we love you too), but that it is critical to have people who work in the humanities involved in those conversations, and this is part of why we are doing this tonight. He is the author of best-selling and award-winning books, including “You Are Not a Gadget: A Manifesto” and “Who Owns the Future?” Most recently, he’s the author of “Ten Arguments for Deleting Your Social Media Accounts Right Now.” His lecture tonight is entitled “How the Internet Failed and How to Recreate It.” Please join me in welcoming Jaron Lanier.

[Applause]
Hey, how are you all? Any students here? Is this all... this is the adult crowd? OK, good, good. Ah, good, excellent. I’m going to start with some music, because some of what I have to talk about is not the most cheerful stuff, because our times aren’t universally cheerful lately, and music is how I survive. Anyway, any of you heard me play this thing? OK.

[Music]

[Applause]
You all know what this is? You all know what that is, right? Yeah. It’s called a khaen. It’s from Laos. It’s arguably the origin of digital information: if you look at it, it’s got a parallel set of objects that are either off or on, and there are sixteen of them in this one. A 16-bit number.
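(To make the 16-bit analogy concrete, here is a tiny sketch, my own illustration rather than anything from the talk, of reading sixteen on/off pipe states as one 16-bit number.)

```python
# Illustrative only: the khaen analogy of sixteen on/off pipes read
# as a single 16-bit number. Not from the lecture itself.

def pipes_to_number(pipes):
    """Pack a list of 16 booleans (pipe sounding or silent) into one integer."""
    assert len(pipes) == 16
    value = 0
    for i, sounding in enumerate(pipes):
        if sounding:
            value |= 1 << i  # set bit i when pipe i is sounding
    return value

# Example: only the first and last pipes sounding -> bits 0 and 15 set.
states = [True] + [False] * 14 + [True]
print(pipes_to_number(states))                  # 32769
print(format(pipes_to_number(states), "016b"))  # 1000000000000001
```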
They go back many thousands of years; they appear to be older than the abacus. In ancient times they were traded across the Silk Route from Asia and were known to the ancient Greeks and Romans. The Romans made their own copy, which was called a hydraulis, and it was a giant, egotistical Roman version that was so big it had to be run on steam. It was operated by teams of slave boys, because, despite Hephaestus’s best efforts, they didn’t have computer AI yet, and the slave boys couldn’t quite operate all the planks that opened and closed the holes, and so they developed this crossbar system. And we know about it because there’s a surviving hydraulis, believe it or not. That automation evolved along with the hydraulis in two directions. It turned into the medieval pipe organ, and there were player mechanisms on the earliest pipe organs, experimentally. It also turned into a family of string instruments that had various assists, like the early pre-clavichord instruments that eventually evolved into the piano. The notion of automating these things was always present, so there were always attempts to make player pianos. Around Mozart’s time somebody made a non-deterministic player piano, which meant it didn’t play exactly the same thing twice. Mozart was inspired by that and made some music that included dice rolls. But another person who was inspired was a guy named Jacquard, who used a similar mechanism to make a programmable loom, which in turn inspired somebody named Charles Babbage to make a programmable calculator, and his collaborator Ada to articulate a lot of ideas about software for the first time, and what it meant to be a programmer. And then, in turn, that all inspired a doomed fellow named Alan Turing to formalize the whole thing and invent the modern computer. So there’s a direct line. This is it; this is the origin of digital information. Now, of course, it’s not the only line, and if I were paid to be a historian I wouldn’t have told you that story with such authority. And yet I’m not, so.
This is a charming tale. It’s a happy place to begin. It’s a reminder that inventions can bring delight and joy, and it’s part of why I’m a technologist. But unfortunately we have some matters to discuss here that are not quite so happy. We live in a world that has been darkening lately. It’s not just a historical lensing effect where it feels worse than ever; it’s bad in a new way. There’s something weird going on, and I want to begin by trying to distinguish what’s going on with our present moment of darkness as compared to earlier times, because this is tricky. It’s almost impossible, I think, not to be embedded in one’s moment in time; it’s almost impossible not to have illusions due to where you’re situated, right? And so I don’t claim to have perfected the art of absolute objectivity at all. I’m struggling, and I’m sure that I don’t have it quite right, but I want to share with you my attempts up to this point.
Now, the first thing to say is that by many extremely crucial measures we’re living in spectacularly good times. We’re the beneficiaries of a steady improvement in the average standard of living in the world. We’ve seen a lowering of most kinds of violence; we’ve seen an improvement in health in most ways and for most people. It’s actually kind of remarkable: in many ways these are really good times, and those trend lines go way back over centuries. We’ve seen steady improvement; society’s kind of gotten its act together, and we’ve been able to hold on to a few memories about things that didn’t work, so we’ve tried new things. We’ve developed relatively more humane societies and relatively better science and better public health, and it’s amazing, it’s wonderful. It’s something that’s a precious gift to us from earlier generations that we should be unendingly grateful for, and I always keep that in mind. I always keep in mind that, just in our modern human-made world, just the fact that you can walk into a building and it doesn’t collapse on us is a tribute to the people who made it, and the people who funded them and regulated them, and the people that taught them. There’s like this whole edifice of love that’s apparent all the time that we can forget about, and during times that feel dark, one of the antidotes is gratitude. Just in these simple things I feel extraordinary gratitude, and it reminds me of how, overall, there’s been so much success in the project of science and technology. It’s so easy to lose sight of that. And yet there is something really screwy going on that seems to me to be fairly distinct from previous problems. It’s a new, sneaky problem we’ve brought upon ourselves, and we have yet to fully invent our way out of it.
So what exactly is going on? I think, at the most fundamental level, we’ve created a way of managing information among ourselves that detaches us from reality. I think that is the most serious problem. If the only problem was that our technology makes us at times more batty, more irritable, more paranoid, more mean-spirited, more separated, more lonely; if that kind of problem was what we were talking about, that would be important. It would be serious; it would be important to address it. But what really scares me about the present moment is that I fear we’ve lost the ability to have a societal conversation about actual reality, about things like climate change, the need to have adequate food and water for peak population, which is coming, the need for dealing with changes in the profile of diseases that are coming. There are so many issues that are real. They’re not just fantasy issues; they’re existential, real issues, climate above all. And the question is: are we still able to have a conversation about reality or not? That becomes the existential question of the moment. And so far, the way we’ve been running things has been pulling us away from reality. That scares me, and I think that’s the core darkness that we have to address. We can survive everything else, but we cannot survive if we fail to address that.

Now, in the title of this lecture I promised a little bit of history, how the internet got screwed up or something like that, so I’ll tell you a bit about that. But I want to focus more on trying to characterize this issue a little more tightly, and trying to explain at least my thoughts on how to remedy it, and maybe some other people’s thoughts, to try to give you a bit of a sense of it.
Now, to begin with, one of the infuriating aspects of our current problem is that it was well foreseen in advance. That’s the thing about it: nobody can claim that they were surprised, and I can point to many folks who were talking about this in advance. As good a starting place as any is to talk about E.M. Forster’s story “The Machine Stops.” Who here has read it? OK, well, a few people. Terrifying, right? All right. “The Machine Stops” was written, I believe, in 1907. Is that right? It might have been ’09, but, you know, a century and a decade ago or so. And it foresees a world remarkably like ours. It’s a world (and this was written well before Turing, well before any of this stuff; I mean, before there was computation) of people in front of their screens, interacting, social networking, doing search, and getting lost in a bunch of stupidity. Finally, when the machine experiences a crash, there’s this calamity on Earth: people had become so dependent on it that the loss of this machine becomes a calamity in itself. And at the very end of the book people are crawling out from their screens and looking at the real world and saying, oh my god, the sun! It’s a really amazing piece, because it’s possibly the most prescient thing that’s ever been written at all. It was written in part as a response to the techie utopianism of the day; it was a response to writers like H.G. Wells, saying: wait a second, these are still going to be people; we have to think about what this will mean to people. It’s often the case that the first arriver on a scene has a clearer view and can have this kind of lucidity that later people find very difficult to achieve, and I think something like that happened very long ago.
But then, honestly, we could talk about Turing’s last writings, just before his suicide, where he was realizing that even though he’d played as great a role as anyone in defeating fascism, he hadn’t defeated fascism at all, because here he was being destroyed for his identity. You all know the story of Turing by now; it’s not obscure anymore. There was a movie and everything. For a long time I would speak to computer science classes and nobody knew about Turing’s death at all, which is a scandal, but at this point I think everyone knows. And if you read his final writings, you read, in a way, the inner glow of somebody who does have some kind of a faith and some kind of a stronger center, but also this kind of sense of defeat. And by the way, it’s within that context that he invented artificial intelligence, that he invented the Turing test and this notion of a person, this non-person, who could transcend sexuality and be just this pristine, abstract, platonic being and escape oppression, perhaps. But anyway, so we have that.

In the immediate early generation of computer scientists we had Norbert Wiener. Who here has read Norbert Wiener? I don’t see a single young person’s hand up. If you’re young, if you’re a student and you haven’t read any of these people, would you please correct that and read them? Seriously, you’ll be so happy if you take this advice and actually read these people. So, Norbert Wiener was one of the very first computer scientists, first generation, and he wrote books that were incredibly prescient about this. He wrote a book called “The Human Use of Human Beings,” and he pointed out that if you could attach a computer to input and output devices interacting with a person, you could get algorithms that would enact adaptive behaviors, technologies to take control of the person. And he viewed this as an extraordinary moral failure that was to be avoided. As a thought experiment at the end of the book, he says: well, you could imagine some kind of global system where everybody would have devices on them, attached to such algorithms, that would be manipulating them in ways they couldn’t quite follow, and this would bring humanity to a disastrous end. But of course this is only a thought experiment; no such thing is feasible, because there wouldn’t be enough bandwidth on the radio waves, and all this. You know, he then explained why it couldn’t be done. And of course we built exactly the thing he warned about.
I could give many other examples. I worked on it myself: in ’92 I wrote an essay describing how little AI bots could create fake social perception in order to confuse people and throw elections. Big deal. Lots of people were prescient about this. This wasn’t a surprise. We knew, and that’s the thing that’s so depressing. There was a lot of good cautionary science fiction, there were a lot of good cautionary essays, there were good cautionary technical writings, and we ignored all of it. We ignored it all. How could that have happened? I would rather tell the story about how everybody was surprised, and a lot of people who are entrepreneurs in Silicon Valley were surprised, but only because they don’t like reading. Don’t be like them.
So, the social history of how everything got screwed up is a reasonable way to talk about the particular way in which it’s screwed up, so I’m going to give it a try. The first thing to say is that in the generation of media technologists and artists and viewers from immediately before computation went pop, in like the ’60s into the ’70s into the ’80s, some of the personality dysfunctions and some of the craziness were already apparent. We started to see this notion that anybody could be a celebrity, and people became obsessed with this idea: maybe I could be one, and maybe there’s something wrong with me if I’m not. This kind of mass-media insecurity-obsession thing. It’s hard to trace the moment when this personality dysfunction really hit the mainstream and really started to darken the world. We were talking earlier, actually, about what moment to choose. I was thinking actually of the assassination of John Lennon, because here you had somebody who basically just wanted to be famous for being a killer, a random killer, and that was a little new. If you look at crappy, evil people earlier, sure, there were some who wanted to be famous, I don’t know, Bonnie and Clyde or something like that, but there are a few different things about them. One thing is that they were also stealing money; there was a kind of a way in which they were, I don’t know, some kind of a part of a system. They had peers; they weren’t typically total loners. The most typical profile of a really evil person before was actually a hyper-conformist. The typical Nazi was actually somebody who didn’t want to stand out, who was just going with the flow and fully internalized the social milieu around them, because it felt normal, and that’s been a much more typical way that people behaved appallingly in history. This sort of weird loner celebrity-seeker thing, I’m sure it existed before, but it started to become prominent.

I want to say something I’ve never said publicly before, but it’s just been gnawing at me for many years. I’m old enough to have had some contact, back in the day, with both Marshall McLuhan and Andy Warhol, who were two figures who had a kind of a loose way of talking about this early, but they didn’t condemn it. They just stood aloof and said: oh, we’re super smart and wise for being able to see this happening. And what they should have done is they should have said: this is wrong. It’s actually really been bothering me; I’ve never said that before, and I feel it should be said, because once again the first people on the scene sometimes have a kind of a vision, and they should be judgmental about it the way E.M. Forster was. I feel like they maybe failed us morally at that point, because they saw it better than a lot of other people, maybe better than anybody at that time. Anyway, that’s maybe not useful to say now, but at some point it has to be said.
Let’s fast-forward a little bit. Computation starts to get cheap enough that it’s starting to creep out of the lab. This is the early 1980s, and here we hit another juncture. There was this thing that happened, oh man, I was right there for it: it was the birth of the open, free software idea. There was a friend of mine named Richard Stallman. Any chance Richard’s here? No, I guess not. Anyway, you never know. Richard had this horrible thing happen: one day he just started saying, oh my god, my girlfriend’s been killed, my lover’s been killed. I said, oh my god, that’s horrible. But what it really was, was the software system he’d been working on for this kind of computer. What had happened is it had gone into a commercial mode with the companies, and it was, I think, on the Lisp machine, which probably nobody remembers anymore, a sort of early attempt to make an AI-specialized computer. And he was upset, and he sort of melded his anger about this with a kind of an anti-capitalist feeling, and said: no, software must be free. It must be just this thing that’s distributed; it can’t be property. Property is theft. And it really spoke to a lot of people. It melded with these other ideas that were going on at the time, and so it became this kind of feeling, I would say sort of a leftist feeling, that was profound and remains to this day. A lot of times, if somebody wants to do something useful with tech, they’ll have to put in the word “open-source.” Lately they also have to put in “blockchain.” So very typically it’s open-source, it’s got blockchain, and then, you know, it’s good.
So there was this other thing going on, which is this feeling that the purpose of computers was to hide, and that deserves a little bit of explanation. America has always had this divide, this red-blue divide or whatever; remember, it used to be a north-south divide, and we fought one of history’s horrible wars over it once, the Civil War. And so people on what we’d now call the red side of the divide were very upset. There was a Democratic president named Jimmy Carter, whom a few people other than me in the room might be old enough to remember, and there was a period when there was an Arab oil embargo, and we had long lines at gas stations, and he imposed a 55-mile-an-hour speed limit on the freeways, which a lot of people really hated because they wanted to drive fast. And so this thing sprang up called CB radio. CB radios were these little analog radios you’d install in your car, and you’d create a false persona, a handle, and then you’d warn other people about where the police were hiding, so that you could all drive fast collectively by sharing information. And it was all anonymous; you could never trace it. And this thing was huge. This had as high a profile at the time as Twitter does today, probably. There were songs celebrating it. It was a really big deal.

But then on the left side of America, on the blue side, people also wanted to hide, and in that case there were two things going on. One is the draft hadn’t quite died down; it was still the Vietnam era, and that was just terrifying, because people didn’t really believe in that war, and the idea of being drafted into this horribly violent war that appeared to have no good purpose just absolutely broke people’s hearts and terrified people. So they wanted to hide, and a lot of people did. And then there was marijuana and the drug laws, and a lot of people really were hiding from those as well. So you basically had both red and blue America feeling like the number one priority for freedom, for goodness, is to be able to hide from the government. So encryption and hiding and fake personas became this celebrated thing. In this milieu there was this idea that online networking, which didn’t really exist yet (I mean, we had networks, but they were all very specialized and isolated; there wasn’t a broad internet yet), would be this thing where everything would be free and open, everything would be anonymous, and it’d just be like this giant, black, weird place where you never knew anything, but you were also free and nobody could find you. Hmm. OK, so that was this starting idea. There were a few other things that fed into it. Another thing was that there was a famous rock band called the Grateful Dead that encouraged people to tape their songs and didn’t care about privacy and all this. There are all these different factors.
28:19
now oh this was going on and then
28:21
simultaneously this other thing happened
28:23
which is we started to have the figure
28:26
of the glorified practically superhuman
28:31
tech entrepreneur and these were in the
28:34
80s they but these were figures like
28:36
Steve Jobs Bill Gates people we still
28:39
remember of course bill still with us
28:41
and they were just worshipped they were
28:45
the coolest people ever well around
28:47
around here in California people hated
28:49
Bill but they loved Steve and there was
28:55
this kind of interesting problem which
28:58
is we not we didn’t just like our tech
29:02
entrepreneurs we made them into sort of
29:04
superhuman figures the the phrase dent
29:08
the universe is associated with jobs
29:09
it’s this notion that there’s this this
29:12
kind of michi and super power to create
29:16
the flow of reality to direct the future
29:18
because you are the tech entrepreneur
29:19
and computation is reality and the way
29:22
we set these architectures will create
29:24
future societies and that’ll ultimately
29:25
change the shape of the universe once we
29:27
get even greater powers over physics and
29:30
there was just like this no end to the
29:32
fantastical thinking we were at the
29:34
birth point for every form of absolute
29:36
God like you know immortality and
29:39
shape-shifting and every crazy thing I
29:40
was a little bit of that I’m sorry to
29:43
say I was I kind of got a little off
29:44
I was pretty intense in the 80s myself
29:47
but anyway there was this feeling that
29:50
the entrepreneur could just just like
29:55
was had more cosmic power than the
29:58
OK, so now here you have a dilemma that had been kind of sneaking up, and nobody had really faced it. On the one hand, everything’s supposed to be free, everything’s supposed to be anonymous, everything is supposed to be this completely open thing. But on the other hand, we love our entrepreneurs, we worship our entrepreneurs; the entrepreneurs are inventing reality. So it should be clear that there’s a bit of a potential conflict here. Everything must be free, but we worship entrepreneurs. How do we do it? How do we do it? How do we do it? And so a set of compromises was created over the years that ended up giving us the worst of both sides of that, I would say. The story is long and interesting, but I’ll give you just a few highlights.

One thing that happened is, when we finally got around to actually creating the internet, we decided it had to be super bare-bones. It would represent machines, because without having a number representing a machine you can’t have an internet, but it wouldn’t represent people. It didn’t intrinsically have accounts built in for humans; it had no storage for humans built in; it had no transactions; it had no authentication; it had no persistence of information guaranteed; it had no historical function. It was, like, super bare-bones: this thing connects with that thing, and that’s all it did. And the reason why was that we were supposed to leave room for future entrepreneurs, those whom we worshipped, you know.
So, if I was about to say the internet, as you know, was invented by Al Gore, some of you would laugh, and that’s because it was a laugh line for a while. He was a Democrat; he was a vice president, and before that a senator from Tennessee, and he was accused of overclaiming that he’d invented the internet on a TV show, which didn’t happen. However, I think he should claim it. I think he did invent it. He didn’t invent it technologically, not at all: all of the underlying stuff, which is called a packet-switched network, and a few other elements, existed in lots of instances from before. He had this idea of throwing some government money into it to bribe everybody to become interoperable, so there’d just be one damn network and people could actually connect. That really was him, and he deserves credit for having done that, unless you think it was a terrible idea. But when that was happening, I remember having conversations about it, like: by creating this thing in such an incredibly bare-bones way, we are creating gifts of hundreds of billions of dollars for persons unknown, who will be required to fill in these missing things that everybody knows have to be filled in.
And then a little while later this other thing happened, which is Tim Berners-Lee, who’s great, came up with the World Wide Web protocol. And here he did this thing. Up to that point, all of the ideas for how to create shareable media experiences online, which are called hypertext (Ted Nelson had come up with the first network design back in 1960; the HT in HTTP is from him, for hypertext), had a core tenet: any time one thing on the internet pointed at something else, that other thing had to know it was being pointed at, so that there were two-way links. You always knew who was pointing at you, and the reason for that is that way you could preserve context, provenance, history. You could create chains of payment, where if people mashed up stuff from somebody else, and that person had mashed up from somebody else, payments would propagate back to pay everybody who contributed. So if you wanted to have an economy of information, you could; the information wouldn’t be dropped. But Tim just had one-way links: you could point at somebody and they’d have no idea they were being pointed at. And the reason for that is that actually doing the two-way links is genuinely a pain in the butt; it’s just more work. If you do one-way links, the whole thing can spread a lot faster; anybody can do it; it’s just a much easier system. And that embedded in it this idea of virality, or meme-ness, where whatever can spread the fastest is what wins. It was a quantity-over-quality thing, in my view. That was another thing that happened.
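(To make the one-way versus two-way distinction concrete, here is a minimal sketch, my own illustration rather than the talk’s or any real protocol’s, of the backlink bookkeeping that Nelson-style two-way links require and that the web’s one-way links skip.)

```python
# Minimal sketch of two-way links: a link index that records backlinks.
# The web's one-way model keeps only `outgoing`; a Nelson-style model
# also maintains `incoming`, so a target always knows who points at it.

from collections import defaultdict

class LinkIndex:
    def __init__(self):
        self.outgoing = defaultdict(set)  # page -> pages it links to
        self.incoming = defaultdict(set)  # page -> pages that link to it

    def add_link(self, source, target):
        self.outgoing[source].add(target)
        self.incoming[target].add(source)  # the extra work two-way links require

    def backlinks(self, page):
        """Who points at `page`? Unanswerable with one-way links alone."""
        return self.incoming[page]

index = LinkIndex()
index.add_link("essay.html", "source-document.html")
index.add_link("mashup.html", "source-document.html")
print(index.backlinks("source-document.html"))
# {'essay.html', 'mashup.html'} (set order may vary): the provenance
# that could, in principle, drive chained payments back to contributors.
```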
So, another thing that happened didn’t come from Silicon Valley. In the late ’80s, people on Wall Street started to use automated trading, and the first flash crash from out-of-control trading algorithms was in ’89. And they figured out something very basic (although E.M. Forster had described exactly this problem so much earlier), which is that if you had a bigger computer than everybody else, and it was more central, getting more information, you could calculate ahead of everybody and gain an information advantage. And in economics, information advantage is everything. So if you had just a little bit more information than everybody else, you could just turn that into money. It wasn’t really a new insight, but it hadn’t actually been implemented before then. Shortly after that, a company called Walmart realized they could apply that not just to financial instruments, to investments, but to the real world. They created a software model of their supply chain and dominated it. They could go to anybody who was involved somewhere in giving them products and figure out what their bottom line was, so they could negotiate everybody down. They knew who everybody’s competitor was. They went into every negotiation with superior information, and they built this giant retail empire on information superiority. That all happened before anybody in Silicon Valley started doing it.
OK, now fast-forward to the birth of Google. So you have these super-bright kids, Sergey and Larry; some of the students I talked to today on campus here remind me of what they were like at the same age: super bright, super optimistic, idealistic, actually focused. And they were backed into a corner, in my view. On the one hand, the whole hacker community, the whole tech community, would have just slammed them if they did anything other than everything being free. But on the other hand, everybody wanted them to be the next Steve Jobs, the next Bill Gates. That was, like, practically a hunger: we want our next star. And the only way to combine the two things was the advertising model. The advertising model would say: you’ll get everything for free; as far as you’re concerned, your experience is, you just ask for what you want and we give it to you. Now, the problem with that is that because it’s an advertising thing, you’re actually being observed. Your information is being taken; you’re being watched. And there’s a true customer, this other person off to the side, whom at first you were always aware of, because you could see their little ads, you know, like your local dentist or whatever. It was cute at first; it was harmless at first.

And unfortunately, if they’d come up with this thing after, I don’t know, Moore’s law had ended and computers were as fast as they were ever going to get, and we’d established a whole regulatory and ethical substrate for computation and everything, maybe it could have worked. But instead they did it in a period where there was still a whole lot of Moore’s law to happen. So all the computers got faster and faster, cheaper and cheaper, more and more plentiful, more and more storage, more connection. The algorithms got better and better; machine learning kind of started to work a little better. A lot of these algorithms kind of figured it out; we had enough computation to do experiments and get all kinds of things working that hadn’t worked before, all kinds of little machine vision things. I sold them a machine vision company, actually. And the whole thing kind of accelerated, and what started out as an advertising model turned into something very different. And so here we get into a description of, at least, my perception of the state that we’re in right now.
So, I mentioned earlier that Norbert Wiener had described what he viewed as a potentially horrible outcome for the future of computation, where you’d have a computer in real time observing a person with sensors and providing stimulus to that person in some form, with displays or other effectors, and implementing behavior-modification feedback loops in order to influence the person. And if that was done globally, it would detach humanity from reality and bring our species to an end. That was the fear back in the ’50s. Now, unfortunately, this innocent little advertising model, which was supposed to address both the desire to have everything be this Wild West open thing and the desire to have entrepreneurs despite everything being free, landed us right in that pocket. That’s exactly where we went.
Now, I should say a bit about behaviorism, because that’s another historical thread that led to where we are. Behaviorism is a discipline of reducing the number of variables in the training of an organism, so that you can control them rigorously and reproduce effects. So let’s say, if you’re whispering into your horse’s ear while you’re training your horse, that’s not behaviorism. If you’re whispering into your kid’s ear, even if you do offer some treats once in a while to reward behavior, that’s not behaviorism; it has elements of it, but hardcore behaviorism reduces the variables and says: look, what we want to do is isolate. We want to say: here’s this organism, it’s in a box (sometimes they’re called Skinner boxes, remembering B.F. Skinner, one of the famous behaviorists), and if the creature, person, human, whatever, does a certain thing you want, you give the person a treat; if it does something you don’t want, you give them a punishment. Typically, maybe, candy and electric shocks.
The timing and the occurrence of these things is guided by an algorithm, and you refine the algorithm; you need to discover how to change behavior patterns. This science of studying behavior, behaviorism, yielded surprises, really interesting surprises, very early on. The first celebrity behaviorist was probably Pavlov. You’ve all heard of Pavlov, I’m sure, and he demonstrated famously that he could get a dog to salivate upon hearing a bell, whereas previously the dogs had salivated upon being given food along with hearing the bell. So he was able to create a purely symbolic-seeming stimulus to replace the original concrete one. That’s quite important, because in many areas today where behavior is modified and addictions are created, there are only abstract stimuli. This is true, for instance, for gambling; modern gambling is based on this. So are little games like Candy Crush, where there are pictures of candy instead of real candy. Now, I have no doubt someday there’ll be some Facebook or Google hovercraft, you know, a drone over your head that drops real candy and electric shocks on your head, but for the moment we’re in this symbolic realm that Pavlov uncovered.

Another amazing result is that you might think, naively, that simply providing punishment and reward as reliably and as immediately as possible would be the most effective way to change behavior patterns. But actually that’s not true. It turns out that adding an element of randomness makes the algorithms more effective. So, just to state the obvious, nobody really understands the brain as yet, but it appears that the brain is constantly in a natural state of seeking patterns, of trying to understand the world. So if you provide a slightly randomized feedback pattern, it doesn’t confuse or repel the brain; instead it draws the brain in. The brain decides there must be something more to understand, there must be something more, and gradually you’re drawn in more and more and more. And so this is why the randomness of when you win at gambling is actually part of the addiction algorithm. That’s part of what makes it happen.
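(A toy illustration of the randomness point, with made-up numbers rather than a model from the lecture: a variable-ratio reward schedule pays out about as often as a fixed one, but unpredictably, which is the pattern that hooks the brain.)

```python
# Toy sketch of reinforcement schedules. All parameters are invented
# for illustration; this is not a model from the lecture.

import random

def fixed_schedule(action_count, every=5):
    """Reward deterministically on every 5th action."""
    return [(i + 1) % every == 0 for i in range(action_count)]

def variable_ratio_schedule(action_count, mean_ratio=5, seed=42):
    """Reward each action with probability 1/mean_ratio: the same average
    payout rate as the fixed schedule, but with unpredictable timing."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(action_count)]

fixed = fixed_schedule(20)
variable = variable_ratio_schedule(20)
print("fixed:   ", "".join("X" if r else "." for r in fixed))
print("variable:", "".join("X" if r else "." for r in variable))
# Same long-run payout rate, but only the variable schedule keeps the
# pattern-seeking brain guessing about when the next hit will come.
```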
Now, in the case of social media, what happens is the reward is when you get retweeted, or you go viral, something like that. The term of art in Silicon Valley companies is usually “a dopamine hit,” which is not an entirely accurate description, but it’s the one that’s most commonly used for when you have a quick rise of a positive reward. But just as the gambler becomes addicted to the whole cycle, where they’re losing more often than they win, a Twitter addict gets addicted to the whole cycle where they’re most often being punished by other people who are tweeting, and they only get a win once in a while, right? It’s the same algorithm. In the trade, the terminology we use is “engagement.” We have algorithms that drive engagement, and we hire zillions of people with recent PhDs from psych departments; there’s a whole program, a program called Persuasive Technology at Stanford, where you can go get a PhD in this, and then you get hired by some tech company to drive engagement. But it’s really just a sanitized word for addiction. So we drive addiction using a variety of these algorithms, and we can study them more than the classical behaviorists ever did, because we can study a hundred million instances at once. We can put out a hundred million variations on all kinds of people, correlate them with data for all those people, and then cycle and cycle and cycle. The algorithms can find new pockets of efficacy; they can tweak themselves until they work better, and we don’t even know why. They’re far ahead of any ability we have to really keep up with them and try to interpret exactly why some things work better than other things.
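(The cycle described here, putting out variations, measuring engagement, and keeping what works, is essentially bandit-style optimization at scale. A hypothetical sketch of the idea, with no claim of resembling any company’s actual system:)

```python
# Hypothetical sketch of engagement-driven optimization: an epsilon-greedy
# bandit that keeps whichever feed variant gets the most reactions.
# Purely illustrative; no resemblance to any real system is claimed.

import random

variants = ["A", "B", "C"]          # competing feed-ranking tweaks
shows = {v: 0 for v in variants}    # times each variant was served
wins = {v: 0 for v in variants}     # engagement events observed

def pick_variant(epsilon=0.1):
    """Mostly exploit the best-performing variant, sometimes explore."""
    if random.random() < epsilon or not any(shows.values()):
        return random.choice(variants)
    return max(variants, key=lambda v: wins[v] / shows[v] if shows[v] else 0)

def record(variant, engaged):
    shows[variant] += 1
    if engaged:
        wins[variant] += 1

# Simulated users: variant "C" happens to be slightly more habit-forming.
true_rate = {"A": 0.10, "B": 0.11, "C": 0.13}
for _ in range(100_000):
    v = pick_variant()
    record(v, random.random() < true_rate[v])

print({v: round(wins[v] / shows[v], 3) for v in variants if shows[v]})
# The loop converges on "C" without anyone understanding *why* it works,
# which is the opacity being described in the passage above.
```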
Now, even so, it’s important to get this right: the effect is, in a way, not that dramatic. Facebook, for instance, has published research bragging that it can make people sad without them realizing that they were made sad by Facebook. Now, by the way, you might wonder: why would Facebook publish that? Wouldn’t they want to hide that fact? It sounds pretty bad. But you have to remember that you’re not the customer of Facebook. The customer is the person off to the side. We’ve created a world in which any time two people connect online, it’s financed by a third person who believes they can manipulate the first two. So, to the degree Facebook can convince that third party, that mysterious other who’s hoping to have influence, that they can have some mystical, magical, unbounded, sneaky form of influence, then Facebook makes more money. That’s why they published it. And I’ve been at events where this stuff is sold by the various tech companies, and there’s no end to the brags and the exaggerations when it comes to telling the true customers what their powers are; very different from their public stance.
But at any rate, the darkness of all this is that when you use this technique to addict people (and we haven't even gotten to the final stage of influencing their behavior patterns; we're still just at the first stage of getting them addicted), you create personality dysfunctions associated with addiction, because it is a form of behavioral addiction. If any of you have ever dealt with somebody who's a gambling addict: the technical qualities of gambling addiction are similar to the technical qualities of social media addiction. Now, I was just saying that we have to get this right and understand the degree of awfulness here, because it's actually kind of slight, but very consistent and distributed. A gambling addiction can be really ruinous; somebody can destroy their life and their family. A social media addiction can be ruinous too, as we've seen from unfortunate events in just the last few days. But more often there's a statistical distribution where a percentage of people are slightly affected and have their personality slightly changed. So what will happen is, in some of the studies I've seen published, maybe something like 5% of people show something like a 3% change in personality. And this is over hundreds of millions of people, or even over billions. So it's a very slight, very distributed statistical effect on people, with just a few who are really dramatically affected. But the problem with that is that it compounds, like compound interest: a slight effect that's persistent, consistent, repeated starts to darken the whole society.
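The compound-interest analogy is just arithmetic, and a toy version makes the shape visible. The 0.5% monthly drift below is an invented number, chosen only to show how a tiny persistent effect accumulates.

    # Toy arithmetic for "slight but compounding". The 0.5% monthly
    # drift is an invented figure, not a measured one.
    drift_per_month = 0.005
    level = 1.0
    for month in range(1, 61):
        level *= 1 + drift_per_month
        if month % 12 == 0:
            print(f"year {month // 12}: {level - 1:+.1%} cumulative shift")

A shift nobody would notice in a month adds up to a roughly 35% shift over five years; that is the sense in which "slight but consistent and distributed" is not reassuring.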
So let's talk a little bit about the addictive personality that's brought out by these things. The way I characterize it is: it becomes paranoid, insecure, a little sadistic; it becomes cranky. Now, why those qualities? I have a hypothesis about this, and here I'm hypothesizing a little ahead of experimental results in science, so I want to make that clear: this is a conjecture, not something I can cite direct evidence for. But all the components of it are well studied, so it's just putting together things that are known, and I think it should therefore be worthy of public discussion.

You can very roughly bundle emotional responses from people into two kinds of bins, one we'll call positive and the other negative. The positive ones are things like affection, trust, optimism about a person, belief in a person, faith in a person, comfort with a person, relaxing around a person, all that kind of stuff: the qualities you want to feel in yourself when you're dating somebody, let's say. The negative ones are things like fear, anger, jealousy, rage, feeling aggrieved, feeling a need for revenge, all this stuff. Now, in the negative bin, a lot of these emotions are similar to another bin that's been described over many years, which is the startle responses, or the fight-or-flight responses. And the thing about these negative ones is that they rise quickly and take a while to fall. You can become scared really fast; you can become angry really fast. The related positive emotions tend to rise more slowly but can drop quickly; they have the reverse time profile. It takes a long time to build trust, but you can lose trust very quickly. It takes a long time to become relaxed, compared to how quickly you can become startled, scared, nervous, on edge. Now, this isn't universally true; there are some fast-rising positive emotions. I just talked about the dopamine hits earlier, so that's an exception. But overall, the fast-rising ones are more often the negative ones.
Now, these algorithms that are measuring you all the time, in order to adapt the customized feeds that you see, and the designs of the ads that you see, and just everything about your experience: they're watching you, watching you, watching you, in a zillion ways, expanding all the time. Now they're following your voice tone and trying to discern things about your emotions based on pure correlation, without necessarily much theory behind it. They're watching your emotions as you move; they're watching your eyes, your smile; and of course they're watching what you click on, what you type, all that. And the thing is, if you have an emotional response that's faster, the algorithms are going to pick up on it faster, because they're trying to get as much speed as possible. They're rather like high-frequency trading algorithms in that sense; we intrinsically, in Silicon Valley, try to make things that respond quickly and act quickly. And so if you have a system that's responding to the fast-rising emotions, you'll tend to catch more of the negative ones; you'll tend to catch more of the startled emotions.
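To make the asymmetry concrete, here is a toy model, not anything from an actual feed: emotional responses with invented time constants, where the negative channel rises fast and fades slowly, scored at different delays after a stimulus.

    import math

    def response(t, rise, decay):
        """Toy emotional response to a stimulus at t=0: a saturating
        rise with time constant `rise`, fading with constant `decay`."""
        return (1 - math.exp(-t / rise)) * math.exp(-t / decay)

    # Invented time constants, in seconds: negative emotions rise fast
    # and fade slowly; positive ones build slowly.
    NEGATIVE = dict(rise=0.5, decay=30.0)
    POSITIVE = dict(rise=20.0, decay=40.0)

    for t in (1, 5, 60):
        print(f"t={t:>2}s  negative={response(t, **NEGATIVE):.2f}  "
              f"positive={response(t, **POSITIVE):.2f}")
    # A meter that scores reactions a second or two after a post sees
    # almost nothing but the negative channel; only a much more patient
    # measurement would see the positive one at comparable size.

Under these made-up constants, the negative signal dominates by an order of magnitude in the first seconds, even though the two channels end up at similar sizes after a minute; a speed-hungry optimizer therefore sees mostly the negative one.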
Now here's the thing. If you look at the literature and ask the broad question, accepting this idea of binning emotions into positive and negative feedback: as far as behavior change goes, is positive or negative feedback more influential on human behavior? The answer you'll get is a really complex patchwork. Behaviorism has been around for a long time, so there are a lot of studies you can read, and it's hard to know exactly how high-quality all the research is, especially the older stuff. But in general you can find lots of examples of positive feedback working better than negative, or vice versa, and it's all very situational; a lot of it is very subtle, in how things are framed for people, all kinds of stuff. Overall, what I perceive from the literature is approximate parity between positive and negative. But if you ask which emotions the algorithms will pick up on when they're trying to get the fastest possible feedback, it's unquestionably true that the negative ones are faster. All right.
So what you see is the algorithm suddenly flagging: oh my god, I got a rise out of that person, let's do some more of that, because we're engaging that person. And that stuff tends to be the stuff that makes them angry, paranoid, vengeful, insecure, nervous, jealous, all these things. And so what you see is this feedback cycle where a certain kind of dysfunctional personality trait is brought out more and more, and people with similar dysfunctional personalities are introduced to each other by the systems.
So what does this personality look like? Well, the addiction personality online: I'll name three people who have recently displayed it rather blatantly. One is the president, whom I'm just not going to bother to name because I'm sick of it. The second is Kanye. The third is Elon Musk. Three people all displaying somewhat overlapping, in my view, personality distortions. Now, I've had slight contact with two of the above three; I'll let you guess which two they are. Well, no, I'll say: one of them's Trump. I've met Trump a few times over a very long period of time. I've never known him well; I've never had a real conversation with him. But I will say that in the '80s and '90s he didn't seem like somebody who was desperate for you to like him. He didn't seem like somebody who was nervous about what you thought about him. He didn't seem like somebody who was itching for a fight. He didn't seem like somebody who was looking for trouble and thought it would help him. He really just didn't seem like that at all. I think he was still a con man, but he was kind of like a happy con man; it was like a different persona.
And remember how I said before that the gambling addict is addicted to the whole cycle, where they lose a lot before they win? I think in the same way the Twitter addict is addicted to a cycle where they bring a lot of wrath upon themselves and have to deal with a lot of negative feedback before they get positive feedback, or some mix; it's very much like the losing and winning in gambling. And so I think what's happened is he's gotten himself into this state where he's like this really nervous narcissist, and this is kind of weird: this personality of "does this person really like me? I think he likes me." This kind of weird, nervous, narcissistic, insecure person has not been a typical authoritarian personality in the past, and yet it's working now. And I suspect the reason why is that a lot of the followers who respond to it see themselves in that insecurity, which is really strange. I mean, if you think about it, in the past the celebrity figure or the leader typically wanted to display a personality that was invulnerable and aloof: self-sufficient, uncaring about whether they're liked or not. And yet that's not what's going on here; it's really strange.

And then there's this issue of lashing out. It's as if, because you know there's a certain amount of punishment that goes with the reward, you actually seek out some of the punishment, because that's actually a part of your addiction. If you're a gambling addict, you actually make some stupid bets. It's true; it's just what happens. So you have Elon Musk calling this guy who tried to rescue kids in a cave in Thailand a pedophile, out of nowhere. Same thing: Twitter addiction, dysfunctional personality. Kanye, I'm not even going to... you know. But basically you have people who are kind of degrading themselves and making themselves into fools, and yet, in a funny way, in the current environment, there's a whole world of addicted fans who actually relate to it, see themselves in it, and it works. It works for the first time in history, and it's really strange. It's a really weird moment.
Okay, so I started by talking about the problem of losing touch with reality. Now, as you heard, I have a book called Ten Arguments for Deleting Your Social Media Accounts Right Now, and it goes through a lot of reasons to delete your social media, of which the closest to my heart is actually the final one, which is a spiritual one. It's about how I think Silicon Valley is kind of creating a new religion to replace old religions, and even atheism, with this new faith about AI and the superiority of tech and how we're creating the future and all this. And I feel that that religion is an inferior one, and people are being drawn into it through practice. So that tenth argument is the one I care most about, but what I want to focus on here is the existential argument, which is the loss of reality. The problem we have here is that we've created so many addicts, so many people who are on edge, that they perceive essentially politics before they perceive nature; they perceive the world of human recriminations before they perceive actual physical reality.
Now, I've presented a theory, it's in various of my books, called the pack switch, which I will recount to you now; it's a way of thinking about this. It goes like this. There are some species that are intrinsically social, like a lot of ants. There are some species that tend to be solitary, like a lot of octopuses, some of my favorite animals. And there are some species that can switch: that can be either solitary or social depending on circumstances. A famous one that we refer to in mythology and in our storytelling is the wolf. You can have a wolf pack, or you can have a lone wolf: same wolves, different social structures, different epistemologies, I would say.

When you're a lone wolf, you're responsible for your own survival. You have to pay attention to your environment. Where will you find water? Where will you find prey? How do you avoid being attacked? Where do you find shelter? How do you survive bad weather? You are attached to reality, like a scientist or like an artist. You are a naturalist. When you're in a wolf pack, different story. Now you have to worry about your peers, who are competing with you. You have to worry about those above you in the pack: will they trash you? Will you get their station? You have to piss on those below you, because you have to maintain your status. But you have to unify with all your fellow pack members to oppose those other packs over there, the other. So all of a sudden, social perception and politics have replaced naturalism. Politics versus naturalism: those are the epistemologies of the wolf pack and the lone wolf. People are also variable in exactly this way; we can function as individuals or we can function as members of a pack.

Now, what happens is exactly what I am at least hypothesizing happens with wolves. It's kind of interesting interacting with scientists who actually study wolves, because I haven't actually spent that much time with wolves, just a little bit. There are people who know a lot more about wolves, and let's just say my little portrayal is overly simplified; it's like a little cartoon, but I hope it functions to communicate.
So when we are thinking as individuals, we have a chance to be naturalists; we have a chance to be scientists and artists; we have a chance to perceive reality uniquely, from our own unique perspective, a perspective diverse from everyone else's, that we can then share. When we join into a pack mentality, we perceive politics. So what happens on social media is that, because the algorithms are trying to get a rise out of you, to up your engagement and make you ripe for receiving behavior modification, you're constantly being pricked with little social anxieties, rages, irritations, all these little status worries. Is my life as good as that person's life? Am I lonely relative to all these people? What do they think of me? Am I smart enough? Am I getting enough attention for this? Why didn't people care about the last thing I did online? Blah blah blah. It's not that any of these things by themselves is necessarily that serious, but cumulatively what they're doing is shifting your mindset, and suddenly you're thinking like a pack creature. The pack switch is set, and you're thinking politically, and when you think politically you lose naturalism.

Now, I think both modes of being have a place. I think if people exclusively, all the time, stayed in the lone setting, that would be bad for society, bad for relationships, bad for families, and so on. However, there needs to be a balance; there needs to be a healthy way of going back and forth between them and not getting lost in one or the other. And so the hypothesis I'd put forward is that we're giving people so many little anxiety-producing bits of feedback that we're getting them into this pack mentality, where they've become hyper-political without maybe even quite realizing it, and are losing touch with reality.
Now, when I say "losing touch with reality," that demands some evidence, because you might say: well, are we less in touch than in the past? So remember, at the start, after the music, I gave you what I consider to be a positive framing and a lot of good news: absolute poverty has been reduced, absolute levels of violence have been reduced, absolute levels of disease have been reduced, and so on. There are many ways in which we're bettering ourselves. But there's this other thing going on which is bad enough that it might be the undoing of all of that, and that is this loss of reality.

Now, here's what I want to point out. I travel around a fair amount, and I've visited places that would appear on the surface to have very little in common. I'll mention some of them: Brazil, Sweden, Turkey, Hungary, the United States. What do they all have in common? What they have in common is the rise of, well, we sometimes characterize it as right-wing populist politics. I don't think that's quite right. I think what we're actually seeing is the rise of cranky, paranoid, unreal politics. I think that's a better characterization, and it's really remarkable how it's all happened at about the same time, and it's happened in some poor parts of the world too. You could say, well, it's something about aging populations, all the cranky old people; but there are countries that are very young that have the problem: Turkey, Brazil. You could say it's about diversity, that we can't have democracies unless they're ethnically monolithic or something: Brazil's diverse. You could say it's inequality, or that the problem is societies losing their social safety net: well, Sweden, Germany, not really; they might have anxiety about that, but no. So all these places are really different, with different histories, and yet they've all had similar dysfunctions. So you have to ask: what's in common between all of them? You can say something vague, that they all have anxiety about the future, and that's true. But the obvious thing they have in common is that people have moved to this mode of connecting through manipulative systems that are designed for the benefit of third parties who hope to manipulate everybody sneakily. That seems like the clear thing they all have in common.
In Brazil recently, all the same crap that we saw here was happening on WhatsApp, which is the big connector down there. And Facebook, I think to their credit, tried to help a little bit, but they couldn't really do it, because the whole system is designed to be manipulative. If you have a car that's designed to roll, it's very hard to say, well, we won't let it roll very much; whatever it does, it'll be rolling. If you have a manipulation system, and that's what it's designed for, you can try to get it to roll more slowly or something, but all it can really do is manipulate. That is what these things are optimized for; that's what they're built for; that's how they make money. Every penny of the many billions of dollars that some of these companies have taken in depends on this; and of the big companies, the only ones really totally dependent, or almost totally dependent, are Google and Facebook. It all comes from people who believe they'll be able to sneakily influence somebody else by paying money via these places. That is what they do; there's just no other way to describe it.
And so the typical thing that happens is this. The algorithms: there isn't any information in them that comes from angels or extraterrestrials; it all has to come from people. So people input some information, and often it's very positive at first. A lot of the starter information that goes into social networks ranges from extremely positive and constructive to just neutral and nothing much. So there might be people who are trying to better themselves; maybe they're trying to help each other with health information or something like that. Then the algorithms will say: we're going to forward some of this information to this person and that person; we'll try it ten million times and see if we get a rise out of anybody, because that ups their engagement. Now, the people who will be, quote-unquote, "engaged" are the ones who dislike that information. So all of a sudden you're getting juice from finding exactly the horrible people who hate whatever the positive people started off with. And this is why you see this phenomenon over and over again, where whenever somebody finds a great way to use a social network, they have this initial success, and then it's echoed later on by horrible people getting even more mileage out of the same stuff. You start with an Arab Spring, and then you get ISIS getting even more mileage out of the same tools. You start with Black Lives Matter, and you get these horrible racists, these horrible people, getting even more mileage out of the same tools. It just keeps on happening. And by the way, you start with #MeToo, and then you get incels and Proud Boys and whatever the next stupid thing is going to be, because the algorithms are finding these people as a matter of course, introducing them to each other, and then putting them in feedback loops where they get more and more incited, without anybody planning it.
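Here is a toy version of that "forward it ten million times and see who reacts" dynamic, with every rate invented for illustration: a small minority who hate the content, but react far more readily, can dominate the engagement signal.

    import random

    def amplification_round(audience=10_000, hater_fraction=0.05,
                            hater_react_rate=0.6, fan_react_rate=0.01,
                            seed=1):
        """Forward a 'positive' post to a big audience and count
        reactions. All rates are invented; the point is the shape:
        haters are rare but react far more often, so they dominate
        the engagement signal the system optimizes for."""
        rng = random.Random(seed)
        reactions = {"fans": 0, "haters": 0}
        for _ in range(audience):
            if rng.random() < hater_fraction:
                if rng.random() < hater_react_rate:
                    reactions["haters"] += 1
            elif rng.random() < fan_react_rate:
                reactions["fans"] += 1
        return reactions

    print(amplification_round())
    # Roughly 5% of the audience supplies about three-quarters of the
    # "engagement", and they are the ones who hate the original message.

Because the next round of distribution follows the reactions, the system ends up working for the haters even though the starting material was positive.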
There's no evil person sitting in a cubicle intending this, or at least I would be very surprised to find somebody like that; I know a lot of the people in the different places, and I just don't believe it. I believe that we backed ourselves into this weird corner and we're just not able to admit it, and so we're just kind of stuck in this stupid thing where we keep on doing it to ourselves. So what you end up with is electorates where enough of a percentage of people are driven to be a little cranky and paranoid and a little irritated. And they might have legitimate reasons; I'm not saying they're totally disconnected from real-life complaints. But their way of framing it is based on whatever the algorithms found could be forwarded to them that would irritate them the most, which is a totally different criterion than reality.
So whatever it is: in the case of the synagogue shooter it was one set of things, in the case of the pipe bomber guy it was another thing, in the case of the guy who set up the... but it's all similar. It's all part of the same brew of stuff that algorithms forward. Now, in some cases the algorithms might have tweaked the messages a bit, because the algorithms can do things like play with fonts and colors and timing and all kinds of parameters, to see if those have a slight effect on how much of a rise they can get. But typically the messages come from people who are also just trying to get as much impact as possible. And I think what's happened is we've created a whole world of people who think it's honorable to be a terribly socially insecure nitwit who feels that the world is against them and is desperate to get attention in any way, and that if they can get that attention, that's the ultimate good. The president acts that way; a lot of people act that way.
That's what Musk was doing, and I could name many other figures. And I think what happens is these people become both the source of new data that furthers the cycle, and of course it drives them too. So there are multiple levels of evil that result from this. The obvious one is these horrible people who make our world unsafe, who make our world violent and break our hearts and just keep on doing it over and over again, this awful sense that random people are self-radicalizing and turning themselves into the most awful version of a human imaginable. But there aren't that many of them in absolute numbers, and as I said earlier, in terms of absolute amounts of violence there's actually an overall decrease in the world despite all this horrible stuff, with some notable exceptions, like with ISIS in the Middle East and so forth. Overall, that's actually true. However, the second evil is the one that I think actually threatens our overall survival, and that is the evil of gradually making it impossible to have a conversation about reality. It's really become impossible to have a conversation about climate. It's become impossible to have a conversation about health. It's become impossible to have a conversation about poverty. It's become impossible to have a conversation about refugees. It's become impossible to have a conversation about anything real. It's only become possible to have conversations about what the algorithms have found upsets people, and on the terms of the upset, because that's the only thing that's allowed to matter.
And that is terribly dark. That is terribly dark and terribly threatening. And the scenario I worry about... I mean, it's conceivable that some sort of repeat of what happened, it's hard for me to even say this, but some sort of repeat of what happened in the late '30s in Germany could come about here. I can imagine that scenario; I can imagine it vividly, because my own grandfather waited too late in Vienna, and my mother was taken as a child and survived the concentration camp. So I feel it very keenly, having a daughter myself. And yet I don't think that's the most likely bad scenario here. I think the more likely bad scenario is that we just put up with more and more shootings, more and more absolutely useless, horrible people becoming successful in one theater or another, whether politicians or company heads or entertainers or whatever. And gradually we don't address the climate; gradually we don't address where we're going to get our fresh water from; gradually we don't address where we're going to get new antibiotics from; gradually we don't wonder how we're going to stop the spread of viruses (vaccine paranoia is another one of these stupid things that spread through these channels). Gradually we see more and more young men everywhere turning themselves into the most jerky version of a young man, these various weenie supremacy movements under different names, from Gamergate to incels to Proud Boys to whatever the next one is going to be, like this, endlessly. And then gradually, one day, it's too late, and we haven't faced reality, and we no longer have agriculture, we no longer have our coastal cities, we no longer have a world that we can survive in. What I worry about is a terribly stupid, cranky undoing for us, not a big dramatic one. It's neither a whimper nor a bang, but just sort of a cranky rant that could be our end. And is that a laugh line? I don't know; you guys are pretty dark anyway.
So, what to do about it? Here my characterization of the problem overlaps strongly with a lot of other people's characterizations of the problem. Mine is perhaps not identical to the problem as described by many others, but there's enough overlap that I think we, meaning the many people who hope to change this, have a shared sense of what's gone wrong. Now, the first thing I want to say, in terms of optimism, is: wow, is that better than things used to be. If I had been giving this talk even a few years ago, not long ago at all, I would have been giving it as a really radical fringe figure who was saying things that almost nobody accepted, who had lost friends over these ideas, and who was really kind of surviving on the basis of my technical abilities and my past rather than what I was saying presently, because it was so unpopular. In the last few years, especially since, I would say, Brexit and Trump, but also just in general, with studies showing the horrible increase in suicides among teen girls that scales with their social media use, all these horrible things that have come out, something's really different. In Silicon Valley there are genuinely substantial movements among the people in the companies to try to change their act. Regulators, at least in Europe, are starting to get teeth and really look at this seriously. The tech companies are trying to find a way to get out of the manipulation game. They haven't necessarily succeeded, and not all of them are trying, but some of them are.
And it's a different world; it's a world with a lot of people who are engaged. So now, having presented the problem as I see it, it's possible to talk about the solution. A lot of folks feel the solution should be privacy rights. The European regulators are really into that; we had a major conference on that in Brussels last week, where Tim Cook, who runs Apple, gave a fire-breathing talk that kind of sounded like a talk I might have given at some point in the past. I gave a talk there too, and I was like, wow, I'm not the radical anymore; it's very strange. In a way I kind of mourn the loss of radicalness, because some part of me likes being the person at the outer edge, and I'm not, and it's kind of like, oh god, I'm supposed to be the radical. But anyway, I think it's great that the Europeans are pushing for privacy. The theory behind that is that the harder it is for the manipulation machine to get at your data, the less it can manipulate, and maybe the more there's a chance for sanity. There's a peculiar race going on, because the societies in Europe that support regulation and have regulators with teeth, which we really don't have much of in the U.S. right now, are themselves under siege by these cranky political parties, the ones that are sometimes called right-wing populist but I think should just be called the crank parties. And the cranky parties might bring these societies down. So there's a race: can the regulators influence the technology in time to preserve themselves, or will the technology destroy their politics before they have a chance? That's a race going on right now; it's quite dramatic, and I wouldn't know how to handicap it.
Now, the privacy approach is hard, because these systems are complicated. If I say, okay, click on this button to consent to using your data for this, well, obviously nobody can even read these things, and even if there's some kind of better regulation supporting it, nobody understands it. Even the companies themselves don't understand their own data; they don't understand their own security. This whole thing is beyond all of us; nobody's really doing it that well. Everybody's having data breaches and discovering suddenly that they were using data they didn't think they were using; that's happened repeatedly at Google and Facebook in particular. So I've advocated a different approach, which is: instead of using regulators to talk about privacy, get lawyers and accountants to talk about lost value from your data being stolen.
Now, I have several reasons for that. One is that I don't think we'll ever lose our accountants and our lawyers; I think they're more persistent than our regulators. That's one reason. And I'm not going to do lawyer jokes, because society's become so mean-spirited that I don't like to make jokes about classes of people, even lawyers, anymore. And some of my best friends really are, you know... but anyway, let me give you an example that I like to use to explain the economic approach here. There's a tool online that I happen to use frequently, that I really like, which is automatic translation between languages, if you want to look at a website in another language or send somebody something. You can go online, and there are at least two companies that do this pretty well now, Microsoft and Google: you enter your text in one language, and a usable translation comes out on the other side. Convenient, free, great: modernity.
However, here's an interesting thing. It turns out that languages are alive. Every single day there's a whole world of public events. All of a sudden, today, I have to be able to talk about the Tree of Life shooter, and you have to know what I mean. All of a sudden, today, I have to be able to talk about the MAGA bomber, and you need to know what I mean. So every single day there are all of these new reference points that come out, lately often horrible ones, sometimes nice ones, maybe a new music video or a new meme that people like, whatever. So every single day, those of us who help maintain such systems have to scrape, meaning steal, tens of millions of example phrase translations from people who don't know it's being done to them. There are tens of millions of people who are kind of tricked into somehow translating this phrase or that phrase, and Google and Microsoft have to grab these things and incorporate them to update their systems, to make them work. But at the same time, the people who are good at translating are losing their jobs. The career prospects for a typical language translator have been decimated, meaning they're a tenth of what they were, following exactly the pattern of other information-based work that's been destroyed by the everything-must-be-free movement: recording musicians, investigative journalists crucially, photographers. All of these people are looking at about a tenth of the career prospects they used to have. That's not to say that everything's bleak; there are examples in each case of a few people who find their way, and this gets to a very interesting technical discussion, which I won't go into fully: you get a Zipf curve, where there are a few successful people and then it falls to nothing, whereas before you had a bell curve (there's a sketch of the two shapes just below, and if anybody wants to know more about that, I can say more). But anyway, you have a tiny number of successful people, but almost everybody has lost their careers.
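As a quick illustration of that contrast, here is a toy comparison of the two income shapes; all of the dollar figures and distribution parameters are invented, and only the shapes matter.

    import random
    import statistics

    rng = random.Random(1)
    N = 10_000

    # Bell-curve world: most practitioners earn somewhere near the middle.
    bell = [max(0.0, rng.gauss(40_000, 12_000)) for _ in range(N)]

    # Zipf-like world: income proportional to 1/rank, so a few stars and
    # a long tail falling toward nothing, paid out of the same total pot.
    pot = sum(bell)
    weights = [1 / rank for rank in range(1, N + 1)]
    scale = pot / sum(weights)
    zipf = [w * scale for w in weights]

    for name, incomes in (("bell", bell), ("zipf", zipf)):
        print(f"{name}: top earner={max(incomes):,.0f}  "
              f"median={statistics.median(incomes):,.0f}")

Same total money in both worlds; in the Zipf-shaped one the top earner captures a huge share while the median collapses, which is the "a few find their way, almost everybody loses their career" pattern.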
Now, wouldn't it make more sense if, instead of making money by providing free translations in order to get other people, the ones called advertisers, to manipulate the people who need the translations in some sneaky way they don't understand, making the whole world more cranky and less reality-oriented, instead of that, what if we went to the people providing the phrase translations and just told them: you know, if you could just give us the phrase translations we really need, then our system would work better, and we'd pay you, because then we'd have a better system. And then we went to the people who need translations and said: free isn't really quite working, because that way we have to get these other people to manipulate you in order to have a customer, but we'll make it really cheap, what about a dime a translation or something like that? We'd work out some kind of system where the people who provide the translations meet each other, because it's a network and we can introduce them; they form a union; they collectively bargain with us for a reasonable rate, so that they can all live and put their kids through school; and then we get better, working translators; and yeah, you pay a dime, you can afford a dime, and everybody's happier.
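A minimal sketch of the accounting this proposal implies, with the fee, the split, the contributor names, and the usage log all invented for illustration: each paying translation records whose contributed examples it leaned on, and the fee is divided among them.

    from collections import Counter

    # Hypothetical usage log: which contributors' example translations
    # the system leaned on for each paying request (a dime each).
    FEE_PER_TRANSLATION = 0.10
    CONTRIBUTOR_SHARE = 0.5  # half to contributors, half runs the service

    usage_log = [
        ("request-1", ["alice", "bob"]),
        ("request-2", ["alice"]),
        ("request-3", ["carol", "alice", "bob"]),
    ]

    royalties = Counter()
    for _, contributors in usage_log:
        payout = FEE_PER_TRANSLATION * CONTRIBUTOR_SHARE / len(contributors)
        for person in contributors:
            royalties[person] += payout

    for person, amount in royalties.most_common():
        print(f"{person}: ${amount:.2f}")

The amounts per request are tiny, but the design point is the provenance: the system knows whose data made each answer possible, so it can pay them instead of pretending they don't exist.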
Now, there are a few things about this that are really good, in my point of view. One: we no longer have these people from the side paying to manipulate people; everything's become clear. Two: we have a whole class of people making a living, instead of needing to go on the dole, instead of saying, oh, we need this basic income because everybody is worthless. Three: we're being honest instead of lying, which is a really big deal. Right now we have to lie, because we're not telling the people that we're taking their data; we're telling them, oh, you're buggy whips, you're worthless, but in secret we need you. That's a lie. And four: there's kind of a spiritual thing here, where we're telling people honestly that they're still needed. To tell people, oh, actually you're obsolete, the robots have taken over your job, when it's not true, when we still need their data: there's something very cruel about that. It cuts to some sort of issue of dignity and human worth, and it really bothers me. So for all these reasons this seems like a better system to me. And sure, we'd have to make accommodations for those who can't afford whatever the rate would be for the language translation, but we can do that; we've almost figured out ways to do that, if we're a decent society, and we'd be a more decent society, because we wouldn't have an economy that's strictly run on making people into assholes. So that's why I advocate the economic approach.
So, I know it's bad form, but can I refer you to a paper to read? Go look up something called "A Blueprint for a Better Digital Society." I'm sorry about the title, I didn't make it up; it's an editor's fault. Adi Ignatius, it's your fault, Adi. It's in the Harvard Business Review recently; you can find it online very easily: "A Blueprint for a Better Digital Society." It's the latest version of how to make this thing work, and a little bit about how to transition to it.
So that's the solution I've been exploring and promoting. I think there's room for a lot of solutions. Another idea: people like the Center for Humane Technology, which is Tristan Harris, and another group called Common Sense Media, are trying to educate individuals about how to be more aware of how they are manipulated, and how to make slight adjustments to be manipulated a little less. Worth trying. But remember, it's a sneaky machine; the whole industry is based on fooling you, so staying ahead of it is going to be work. You can't just do it once and think you're done; it'll be a lifetime effort.
should just quit the things yeah when I
84:53
say can you please delete all your
84:55
social media accounts surely one of the
84:58
first thoughts and all your minds is
85:00
well that’s ridiculous I mean you’re not
85:02
going to get billions of people to
85:04
suddenly drop these things there’s
85:06
there’s two reasons why you’re correct
85:09
if you have that that that thought one
85:14
is that you’re addicted this is an
85:16
actual addiction you can’t just go to
85:18
somebody with a gambling addiction and
85:19
say oh just so you know any more than
85:21
you can do that if they have a heroin
85:23
addiction that’s not how addiction works
85:25
you can’t just say no it’s a prop it’s
85:27
hard
85:27
addiction is hard all of us have
85:29
addictions none of us are perfect but
85:31
this particular ones destroying our
85:33
future it’s really bad it’s not just
85:34
personal we hurt each other with this
85:36
one in an exceptional way so another
85:41
reason is network effect and that means
85:43
everybody already has like all their
85:45
pictures and all their past and all
85:47
their stuff on these properties that
85:49
belong to companies like Facebook and
85:50
for everybody to get off it all at once
85:53
they can continue to have connections
85:54
with each other is a coordination
85:56
problem that’s essentially impossible at
85:57
So that's a network-effect problem. So why am I asking people to do something that can only happen a little? The reason is that even if it only happens a little, it's incredibly important. Let me draw a metaphor to some things that have happened in the past. We have, in the past, had mass addictions that were tied to corrupt commercial motives at a large scale. One example is the cigarette industry; another example is big alcohol, alcoholic beverages; I could mention others. Lead paint is one I bring up in the book... well, actually, lead paint wasn't an addiction thing, so I'll leave out lead paint. So let's just talk about cigarettes. In the case of cigarettes, when I was growing up, it was almost impossible to challenge cigarettes. Cigarettes were manly; they were cool. If you were on the red side of America, they were the cowboy thing; if you were on the blue side, they were the cool beatnik thing. Everybody had a cigarette, and you just couldn't be cool without your cigarette. But enough people finally realized that they could get out from under it that it at least allowed a conversation. The addict will defend it: if you talk to somebody who's really addicted to cigarettes, it's very hard for them to get a clear view of what the cigarette means to society, what it means to have cigarettes in public spaces. There was a time when this room would have been filled with cigarette smoke, and we would have been gradually killing the students who were attending. I think I'm coughing in sympathy with remembering what that was like, because it was really horrible.
Alcohol: Mothers Against Drunk Driving, or drunk drivers, I forget which it is, has been one of the most effective political organizations. They changed laws, they changed awareness, they changed outcomes, and they saved an enormous number of lives, despite the fact that, once again, alcohol is cool. It's supposed to be cool to drink at a frat party; it's supposed to be cool to drink at your fancy restaurant. Everybody loves drinking, and there's this whole world of advertising liquor. We found a reasonable compromise in both cases. We don't throw people who drink or smoke cigarettes in jail, like we've done for marijuana for years; instead we came up with a reasonable policy: don't do it in public, don't do it behind the wheel. It worked. That was only possible because we had enough people who were outside of the addiction system to have a conversation.
In this case, we don't have that. In this case, all the journalists who should be helping us are addicted to Twitter and making fools of themselves. If you're a journalist in this room, you know I'm telling the truth. The same for politicians, the same for public figures and celebrities who might be helpful. We need to create just a space to have a conversation outside of the addiction system. Now, you might be thinking: oh my god, I'll destroy my life if I'm not on these things. I don't think that's true. I think if you actually drop these things, you suddenly discover you can have any life you want. I'm not claiming that I'm the most successful writer or public speaker, but I'm pretty successful. I have best-selling books, I get around, you hired me to come talk to you, and I've never had an account on any of these things. And you could say, oh, but you're an exception. Well, I mean, how much of an exception can I be? Play any points against me: I'm this weirdo, and I still, seriously, I still can do it. If I can do it, probably other people can do it too. I think there's this illusion that your whole life will just be erased if you're not on these things, but that illusion is exactly part of the problem. That's exactly part of this weird, existential, insecure need for attention at any cost, this bizarre personality dysfunction that's destroying us. Just give it a rest.
Now, here's what I would say. There was a time when, if you were young especially, one of the priorities you felt in your life was to know yourself, and the only way to know yourself is to test yourself. And the way you test yourself is maybe you'd go trekking in the Himalayas or something. I used to hitchhike into central Mexico when I was really young, just a really young teenager, and that's how I tested myself. These days, I think the similar idea would be quitting your social media, and really deleting it. You can't quit Facebook and keep Instagram; you have to actually delete the whole thing. And it doesn't mean you're doing it for your whole life: delete everything, and then stay off the stuff for six months. Okay? If you're young, you can afford it; it will not kill you. And after six months you will have learned, and then you make a decision. In my opinion, you should not harm your life for the sake of the ideas I've talked about today. If it's really true that your career will be better or whatever through using these things, then you need to follow your truth and do what makes you succeed. And if it's really true that being a serf to some stupid Silicon Valley giant is the thing that helps your career, okay. But you have to be the one making that decision, and if you haven't tested yourself, you don't have standing to even know. So I'm not telling you what's right for you, but I demand that you discover what's right for you. That, I think, is a fair demand, given the stakes. And with that cheerful closing, I will call it.
[Applause]

[Music]
So, we have... is that mic on? So do we have the question cards? Well, we were going to have cards; I don't know if any cards have made their way up. Here's a card. Cards, okay. I'm actually unclear on how this whole thing works. Okay, well, this is it. All right, so normally I would get a bunch of cards, but that hasn't happened yet. Okay: "Lately I've noticed that I was getting progressively more cranky," that's now a technical term I think you've introduced, along with virtual reality, "cranky from a lack of sleep, because of the excessive blue light given off by screens. Have you factored this effect into your theory?" Oh yeah.
Well, there's the timing and stuff like that; there's more: "As soon as I put blue filters on my screens, I got a lot less cranky." Okay. The problem of blue light keeping you up: those are all real problems, and in fact you might want to just turn color off on your computer, and definitely turn color off on your phone. In all seriousness, you don't need it for most things. I use a phone, but I definitely turn color off, and I make those changes if I notice something like that. You can turn off the blue light on your computer; you can go into a setting. The best way to do it is to go to the visual accessibility settings, because they have these high-contrast settings for people who have trouble focusing, and they can get rid of color or give you stark contrast, as an example. Oh, for God's sakes, I'd have to enter my passcode; I was going to show you what it looks like, but I'm not going to bother with the code. Anyway, you can do it; every major platform has this ability, it's really there, and you should do it. Go to commonsensemedia.org or to the Center for Humane Technology's website, and both of them have advice on how to do things like this. Another thing: I'm pretty sure both Windows and Mac, if it's a computer, have ways to make the blue light go away as the evening approaches. This kind of stuff is real, and you should pay attention to it, and the technology should serve you and not drive you crazy. But I do have to say, this is not an existential threat. This is at the level of too much sugar in breakfast cereals or something like that. It is actually a real issue; it does have an effect on the health of the population. But it's not going to destroy us; this other stuff I'm talking about is at another level. Okay.
at another level okay
94:53
so rather than the cards do we have
94:57
cards we do okay great let’s give some
95:02
cards is that your card oh okay great
95:07
okay here’s my card isn’t there a design
95:09
problem for publishing online if you
95:11
know who’s pointing at you how is that
95:14
related to the problem Allen turning
95:16
faced touring it says turning he hatched
95:20
the concept of a machine like
95:22
personality isn’t that too software what
95:25
listening and compassion is to human
95:27
Yeah. It's kind of an interesting question to me. When I read Turing's final notes, the Turing test comes up twice: it comes up in a little monograph he wrote, and it comes up in a sort of little note. There are two statements of it, and in both of them, to me, reading them, there's just this profound sadness. I feel like this is a person who's just screaming out. So, some of you might... I don't know; there's a whole history to this thing. What Turing did is he created a metaphor. Oh boy, let me try to do this as fast as I can.
let me try to do this as fast as I can
96:04
Turing did as much as anybody to defeat
96:07
the Nazis in World War two by braking
96:10
using one of the first computers that
96:11
ever existed to break a Nazi secret code
96:13
called enigma and he he was considered a
96:18
great war hero however he lived an
96:22
identity that was illegal at that time
96:24
which is that he was gay
96:25
and he was forced by the British
96:27
government after the war to accept a
96:30
bizarre crack treatment for being gay
96:32
which was to overdose on female sexual
96:35
hormones with this bizarre idea that
96:37
female hormones would balance his over
96:39
sexiness which was supposed to be the
96:41
gay it’s like so stupid it’s hard to
96:43
even repeat it and he started developing
96:46
female physiological characteristics as
96:48
a result of that treatment and it he
96:53
committed suicide by a sort of a weird
96:56
political thing where he laced an apple
96:58
with cyanide and ate it next to the
96:59
first computer sort of anti Eve or
97:03
something and he was a very brilliant
97:05
and poetic man and in the final couple
97:08
of weeks of his life he came up with
97:10
this idea of repurposing an old
97:14
Victorian parlor game that used to be
97:18
this thing we’d have a man and a woman
97:21
behind a curtain or a screen of some
97:26
kind and all they could do is pass
97:30
little messages to a judge and the judge
97:32
would have to tell who’s the man and
97:33
who’s the woman and each of them might
97:35
be trying to fool the judge which is
97:36
kind of a weird if you think about it
97:38
The Victorians were pretty kinky and bizarre. And so what you're doing is, as with behaviorism and as with the internet, you're slicing away all of these factors and turning it into this limited stream of information, so it's kind of like tweeting or something. And what Turing said is: what if you got rid of the woman and you had a man and a computer, and the judge couldn't tell them apart? Wouldn't you then finally have to admit that the computer should be given rights and given stature and be treated... And when you read it, the way I read it is, it's this person saying: oh my god, I figured out how to save the world from these people who wanted to destroy everybody based on being of the wrong identity, these people who wanted to kill not only gays but of course Jews and Gypsies and black people, these horrible people, and I came up with this way of defeating them, and now you're destroying me for who I am. And I feel like there's this kind of astonishing sadness in it.

And so that was the birth of the idea of artificial intelligence, and I feel like the way it's remembered is completely unlike what it's like to read the original. Have you ever read the original Turing? Because if you read the original Turing, it's intense. Here's this person who's being tortured to death. It's not some kind of nerdy thing at all; it's difficult to read the documents. And I think he knew he was about to die, and I think he was reaching out for some sort of a fantasy of what it would take for people to not be cruel. What would it take? And I think in this very dark moment he thought: maybe giving up humanity entirely; maybe if we're just machines, we won't do this to ourselves. And the thing about that, of course, is we've turned ourselves sort of into machines; we've all been kind of acting like machines to be able to use this stuff. You're all sitting there all day entering your little codes to get online, so you're sort of turning into machines in practice. And yet we've just become more and more cruel. That's the ultimate irony: it didn't help. So that's my take on it. And this idea that AI could be some form of compassion: I think it's really just a way of stealing data from people who should be paid to translate.
to paraphrase anyway okay we we a I is
100:23
just a way look all all we can do with
100:27
computers ever look to be a good
100:30
technologist you have to believe that
100:33
people are sort of mystically better
100:35
than machines otherwise you end up with
100:38
gobbledygook and nonsense you can’t
100:40
design for machines so AI has to be
100:43
understood as a channel for taking data
100:45
from one person to help another
100:47
I take I take the data from the
100:50
translators and I apply it through a
100:52
machine learning scheme or some kind of
100:54
scheme and I can get translations that
100:56
help people in a better way than I could
100:59
without that scheme in between which is
101:01
wonderful
101:02
so it’s technology to help people
101:04
connect in a way that’s more helpful if
101:05
you understand AI that way you elevate
101:08
people and you don’t confuse yourself
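To make that "channel" framing concrete, here is a minimal sketch, invented entirely for illustration: a toy translation memory in which every output is retrieved from phrases contributed by named human translators, and a ledger credits whoever's data was used. The corpus, the names, and the matching scheme are assumptions, not anything Lanier actually describes.

```python
# Toy illustration of "AI as a channel between people" (a sketch, not a real
# machine-translation system): outputs come from a corpus contributed by
# named human translators, and every reuse credits its author.
from collections import Counter
from difflib import SequenceMatcher

# Hypothetical contributed data: (source phrase, translation, translator).
CORPUS = [
    ("good morning", "bonjour", "alice"),
    ("thank you very much", "merci beaucoup", "bob"),
    ("where is the station", "ou est la gare", "alice"),
]

royalties = Counter()  # ledger: translator -> times their data was reused

def translate(phrase: str) -> str:
    """Return the closest contributed translation and credit its author."""
    best = max(CORPUS, key=lambda row: SequenceMatcher(None, phrase, row[0]).ratio())
    source, target, translator = best
    royalties[translator] += 1  # the attribution step: value traces back to a person
    return target

print(translate("good morning"))      # bonjour
print(translate("thanks very much"))  # merci beaucoup (nearest contributed match)
print(dict(royalties))                # {'alice': 1, 'bob': 1}
```

The point of the sketch is only the shape of the data flow: the "AI" contributes routing and matching, but the value originates with, and remains traceable to, people.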
Okay, yeah, we don't have a lot of time and we have a lot of great questions, so some questions are not going to be able to be answered now, although I want to mention that there will be a book signing and book purchasing outside after the event is over; there'll be two tables. Please, if you want to ask me a long question: you can't come up to me while I'm signing your book and ask some open-ended giant question; you really can't do that. And buy a book; the people who are selling the books have asked that you buy a book first before you have it signed. On that note, I'll segue into a couple of questions that are connected to this. How about your market solution? Arguably the mess we're in now comes from the monopolistic and manipulative tendencies inherent in markets. Given that the world has never known pure markets, what would keep this one pure?

Oh, it's not going to be pure. It's going to be annoying and unfair and horrible, but the thing about it is that it won't be existentially horrible. What I believe about economic philosophies is that there's never been one that's worked out in practice. Instead, just as with moral philosophies and theories of how we learn and many, many other areas where we're trying to deal with very complex systems, it's not so much that we can seek the perfect answer; we have to trade off between partial answers. So to me there's never been a pure market, and I don't think there ever could be, but I think what you can do is get a balance. This was the Keynesian approach to economics, which I think is very wise: you get a balance between reasonable oversight and a reasonably unfettered market, and they'll go through cycles, and the market will need help, and you just trade off. I think that's the only path we have. I think being an ideologue for any solution to a highly complicated problem is always wrong.

Okay, just two more then, and there's a couple like this as well: what about the connective force of social media, e.g., for the feminist movement, like MeToo? These online communities raise awareness and create supportive communities, and many people rely on social media for community because of the demands of capitalist jobs.

Yeah, that's all true, except that it backfires, and the backfire is worse than the original. It just keeps on happening. Before MeToo there was a problem of diversity in the gaming world, and a few women in gaming just wanted to be able to say one or two things and not be totally invisible, and the result of that was this ferocious thing called Gamergate, which was this total never-shut-up, wipe-everybody-else-out, make-everything-horrible movement. And then MeToo has spawned this other thing that's still rising, which is the incels and the Proud Boys and all this stuff. The problem is that in these open systems, at first your experience of finding mutual support and creating social change is authentic; it's real. It's just that there's this machine you're not thinking about behind the scenes that's using the fuel you're providing, in the form of the data, to irritate these other people, because it gets even more of a rise from them, and you're creating this other thing that's even more powerful, that's horrible, even though it wasn't your intent. And that's the thing that keeps on happening over and over again. It doesn't invalidate the validity of the good stuff that happens first; it's just that it always backfires. Well, not always, but typically, and you end up being slammed. One of the things that's really bad about it is that it seems like it's just the fault of the creeps who come up, when it's actually kind of more the fault of the algorithms that introduced the creeps to each other and then got them excited, in this endless cycle of using your good intentions to irritate the worst people. I mean, Black Lives Matter was great; I think it's wonderful. And yet the reaction to it was horrible and of a higher magnitude. I just think, unfortunately, that until we can get rid of the advertising model and the giant manipulation machine, every time you use the big platforms for any kind of positive social effect, it'll backfire and destroy you. It's a fool's game; even though it's valid at first, in the long term it's a fool's game. I don't like saying that. I hate saying that; it breaks my heart.

Okay, this is the last question, and it's existential. I'll combine the two questions here. Seeing how pernicious social media has become by being hijacked toward BUMMER... and, you know, BUMMER is another technical term you're using?

Yeah. There's a wonderful writer on cyber things, Sherry Turkle, and she read my book and she said, oh, I love this book, but there's just too much tush in it. Because there's, like, BUMMER, and there's a cat's behind on the cover, and stuff. The problem is I married a woman who likes butt jokes, and I just can't... I don't know, they just come out. I don't know. Anyway, okay. So: how to guard against an immersive technology like virtual reality becoming even more insidiously BUMMER, and then, how do you know what is real? Okay. Oh, well, all right, those are small questions.

So, the first one. Virtual reality could be super hyper creepy. I wrote a book about virtual reality that we haven't mentioned; it's called Dawn of the New Everything. I don't know if they'll have it up front or not, but I talked a lot about that issue there. So virtual reality could potentially be creepy. I think the way to tell whether something's getting creepy is whether there's a business model for creepiness. If the way it's making money is that there's somebody to the side who thinks they can sneakily alter you or manipulate you, that's the creepy engine. If there isn't that person, and there isn't that business going on, it's less likely to be creepy. I think that's actually a pretty simple question to answer: I think it boils down to incentives. I think incentives run the world as much as or more than anything else.

As far as this question of how you know what's real: the answer is imperfect. What you do is you struggle for it. You struggle to do scientific experiments, to publish. You have to always recognize that you can fool yourself; you have to recognize that whole communities of people can fool themselves. And you just struggle and struggle and struggle, and you gradually start to form a little island in a sea of mystery, in which you never have total confidence, but you start to have a little confidence. So there are some things that we can be confident of now. The earth is round (well, not online, but still). Do we know it in an absolute, absolute sense? No. You can never know reality absolutely, but you can know it pretty well. And so in order to talk about reality you have to get used to near-perfection that is never actual perfection, and if you're not comfortable with that concept you have no hope of getting to reality, because that's the nature of reality. Reality is not something you ever know absolutely. In fact, just to be clear, in one of my books I defined reality as the thing that can never be measured exactly; it's the thing that can never be simulated accurately; it's the thing that can never be described to perfection. That is reality. A simulation, by contrast, can be described to perfection: I can describe a video game world or a virtual world to you to perfection; I can't do that with reality. The thing is, though, that we can't demand absolute knowledge as a condition of having any knowledge at all, or else we make ourselves into genuine fools. We have to be able to accept that we can have better knowledge than other knowledge. It's all an incremental, sort of eternal improvement project. So the people who demand absolute proof of climate change are fools, but they're interesting. I mean, some of you might have read it: there was a good history published this week about the reading wars, about how we learn to read, and there's this community of people who've just been absolutely unable to accept a load of scientific evidence about how to teach kids to read effectively, because of an ideology, and they're sincere. It's really, really hard, accepting reality as your life's work. It's really, really, really hard. It doesn't come naturally, necessarily. It's a discipline. Thank you.

All right.

[Applause]

Rana Foroohar, “Don’t Be Evil”

Very excited to introduce Rana. Rana Foroohar is the global business columnist at the Financial Times and CNN's global economic analyst. Previously she was the assistant managing editor in charge of business and economics at Time, as well as the magazine's economics columnist, and she spent 13 years at Newsweek as an economics and foreign affairs editor and correspondent. In her new book, Don't Be Evil, which I think is a great title, Rana chronicles how far big tech has fallen from its original vision of free information and digital democracy. Drawing on nearly 30 years of experience reporting on the technology sector, Rana traces the evolution of companies such as Google, Facebook, Apple, and Amazon into behemoths that monetize people's data, spread misinformation and hate speech, and threaten citizens' privacy. She also shows how we can fight back by creating a framework that both fosters innovation and protects us from the threats posed by digital technology. Her book is already garnering widespread praise, with the Guardian calling it a masterly critique of the internet pioneers who now dominate our world. So without further ado, please help me in welcoming Rana Foroohar to Politics and Prose.

Thank you. I am so honored to be here; it's really a pleasure. This is one of my favorite bookstores, probably my favorite bookstore in Washington, so it's just a huge pleasure. I thought I would start by talking a little bit about how I got the idea to write this book. It's actually my second book; my first book, Makers and Takers, was a look at the financial sector and how it no longer serves business. So I like to take on these big, industry-wide, maybe takedowns, if that's the word, but really to look at an ecosystem, an economic ecosystem, and see how it's working or not working.

I got the idea for this book probably two months into my new job at the Financial Times. I was hired in 2017 to be the chief business commentary writer, so my job was to look at the world's top business and economic stories and try to make sense of them in commentary. When I do that, I tend to try and follow the money in order to narrow the funnel of where to put my focus, and I had come across a really, really interesting statistic: 80% of the world's corporate wealth was living in 10% of companies, and these were the companies that had the most data, personal data and intellectual property. The biggest of those were the big tech platforms that my book tries to make icons of, using all the candy colors here: the FAANGs, Facebook, Amazon, Apple, Netflix, Google. That was a pretty stunning statistic, and it was interesting because I was thinking about how wealth since 2008 had transferred from the financial sector into the big tech sector, and that had happened really quietly, without a whole lot of commentary in the press.

Now, at the same time I was starting to dig into this story, something else happened, a much more personal episode. I came home one day and there was a credit card bill waiting for me, and I opened it up and started looking through it, and there were all these tiny charges in the amounts of $1.99, $3, $5, whatever, and I noticed that they were all from the App Store, and I thought, oh my gosh, I must have been hacked. And then I thought: who else has my password? My ten-year-old son, Alex. I see nods from parents and others. So I go downstairs and I find Alex on the couch with his phone, which is his usual after-school position, and I say, you know, what's up, do you know anything about this? And he's sort of stunned: oh, yes, oh, that, yeah. It turns out Alex had gotten very fond of a game called FIFA Mobile, which is an online soccer game, and it's one of these games that you can download for free, but once you get into the game and start playing, you have to buy stuff: in-app purchases, they're called, or loot boxes is another name. So if you want to move up the rankings and do well in the game, you have to buy a virtual Ronaldo or some new shoes for your player. And $900 and one month later, Alex was at the top of the rankings.
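A quick aside on the mechanics: a loot box is effectively a gambling draw, so the expected spend needed to land a coveted item follows directly from the draw odds. A minimal sketch; the price and drop rate below are invented for illustration, not FIFA Mobile's actual numbers.

```python
# Expected cost of pulling a "star player" from loot boxes, modeled as
# independent draws with a fixed drop rate (a geometric distribution).
# Both numbers below are hypothetical, purely for illustration.
box_price = 1.99      # dollars per loot box
drop_rate = 0.005     # chance any given box contains the coveted item

expected_boxes = 1 / drop_rate           # mean of a geometric distribution
expected_cost = expected_boxes * box_price
print(f"expected boxes: {expected_boxes:.0f}")   # 200
print(f"expected cost:  ${expected_cost:.2f}")   # $398.00
```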
But I was horrified. I was actually horrified and fascinated, in fact. I mean, as a mother I was horrified: his phone was immediately confiscated, passwords were changed, limitations were put into place. By the way, he now officially is allowed only one hour a day on his phone. He's 13 years old; the national average for that age is seven hours a day. Now, he sneaks in extra; I think he probably gets about 90 minutes, because I can't police him all the time on the way to school. But to me that is a stunning fact, that the average American 13-year-old spends seven hours a day on their phone. Anyway, so I was horrified as a parent, but I was fascinated as a business writer, because I thought: this is the most amazing business model I have ever seen, and I have to learn everything about it.

Right about that time, someone had come to see me, a man named Tristan Harris, who's one of the characters in my book. Tristan is a really interesting guy. He was formerly the chief ethics officer at Google, and he was trying to bring goodness and not evil to the company and make sure that all the products and services were functioning in sort of the human interest. Then he realized he was not having any luck doing that within the company, so he decided to go outside and start something called the Center for Humane Technology. Tristan had become really, really worried about the core business model. It's particularly relevant for Google and Facebook, but it's also a big part of Amazon's model, and it's really the model that another author, Shoshana Zuboff, who recently wrote a wonderful book on this topic, would call surveillance capitalism. It's the idea of companies coming in and tracking everything you are doing online and, increasingly, offline. You know, if you have an Android phone, it might know where you are in the grocery store; if you're in a car with smart technology, your location coordinates can be tracked. All of this serves to build a picture of you that is then sold to advertisers, and then you can be targeted with what's called hyper-targeted advertising, which is essentially why, for example, if I go online to look for a hotel in California, I might get a certain price, but someone else might get a different price. This is a really important thing: we are looking at different internets. The differences are subtle, but they're there, and this data profile that is being built up is splitting us as individual consumers. But I would argue that it's also splitting us as citizens; I'll flesh that out a bit more when I get to the readings.
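A minimal sketch of what that kind of per-profile price splitting could look like mechanically. The profile fields, multipliers, and prices here are invented for illustration; no real platform's logic is being quoted.

```python
# Toy illustration of hyper-targeted pricing: the same hotel query returns
# different prices depending on the shopper's data profile. All fields and
# coefficients are hypothetical.
BASE_PRICE = 150.0  # dollars per night

def quoted_price(profile: dict) -> float:
    price = BASE_PRICE
    if profile.get("device") == "high_end_phone":
        price *= 1.15          # inferred higher willingness to pay
    if profile.get("frequent_traveler"):
        price *= 0.95          # retention discount
    if profile.get("searched_competitors"):
        price *= 0.90          # flagged as a price-sensitive shopper
    return round(price, 2)

alice = {"device": "high_end_phone", "frequent_traveler": False}
bob   = {"device": "old_laptop", "searched_competitors": True}
print(quoted_price(alice))  # 172.5 -- same room,
print(quoted_price(bob))    # 135.0 -- different internet
```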
Tristan kind of turned me on to this business model, and he also helped me connect the dots between this business model and what had happened to my son. Because it turns out that these technologies, these sorts of nudges that take you down a game, or that bring you to certain places on Amazon, or that give you a certain kind of search result or purchasing option on Google, are part of an entire field called captology, which is kind of an Orwellian word, and these technologies actually come largely out of something called the Stanford Persuasive Technology Lab. So there is an entire industry that is designed to track your behavior, pull in things like behavioral psychology and casino gaming techniques, and then layer those onto apps that will push you towards making purchasing decisions, or perhaps even other kinds of decisions, political decisions, that might be good for certain actors.
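The casino technique most often borrowed here is the variable-ratio reward schedule: payouts arrive on an unpredictable schedule, which is the reinforcement pattern that most strongly drives compulsive checking. A small illustrative sketch, with an invented payout probability:

```python
# Variable-ratio reinforcement, the slot-machine pattern that persuasive
# apps borrow: rewards arrive on a random, unpredictable schedule.
# The probability and session length are hypothetical.
import random

random.seed(42)
REWARD_PROBABILITY = 0.15  # chance any "pull" (refresh, box, spin) pays out

def session(pulls: int) -> list[int]:
    """Return the pull numbers on which a reward landed."""
    return [i for i in range(1, pulls + 1) if random.random() < REWARD_PROBABILITY]

hits = session(30)
print(f"rewards on pulls {hits}")
# The gaps between hits vary unpredictably; that irregularity, not the
# reward itself, is what keeps users pulling.
```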
It's interesting, because when I started to think about all this, one of the things I really wanted to do in this book was to try and create a single narrative arc to take folks through the 20-year evolution of this industry, from the mid-1990s, which is really when the consumer internet was born, till now. At the time I was writing, and probably still today, you could argue that Facebook was the company getting the most negative attention for a lot of the economic and political ramifications of its business model. But if you go back to the very beginning, Google is the most interesting way to track this, because Google really invented the targeted advertising business model; they really invented surveillance capitalism. One of the things that is fascinating, and sometimes I'm asked what's the most surprising thing I found when writing this book, is that really the most surprising thing is that it was all hiding in plain sight. If you go back to the original paper that Larry Page and Sergey Brin, the founders of Google, wrote in 1998 while at Stanford as graduate students, they actually lay out what a giant search engine would look like, how it would function, and then how you might pay for it. And if you go down to page 33, there is a section in the appendix called "advertising and its discontents," and it essentially says that if you monetize a search engine in this way, with hyper-targeted advertising, the interests of the users and the interests of the advertisers, be they companies or, who knows, public entities, are eventually going to come into conflict. And so they actually recommend that there be some kind of academic search engine, an open search engine in the public interest. This, to me, is first of all fascinating because it was just there all along, and fascinating because very few people have read that entire paper, even those that write about it, which in some ways goes to the point that in the last 20 years we all do a lot less reading. Not folks here, but in general we do less reading. There was actually a fascinating study that came out recently from Common Sense Media, which is Jim Steyer's group in California that tracks children's behaviors online: of teenagers, only one-third read for pleasure more than once a month. Long-form articles, and it doesn't matter whether you're reading on an e-book or a device, long-form articles and books: only once a month, for pleasure. So our entire world has been changed. Economically, these companies have huge monopoly power; politically, we're all living with the ramifications of this new world of social media, disinformation, and fake news; and cognitively, our brains are changing, our behaviors are changing. Connecting all of those things was really what I was trying to get at in this book. So I'm going to read two or three short excerpts, and then we can leave a lot of time for questions, so that people can dive into as much of this as they want.

I'll start, perhaps, with my very first meeting with the Googlers, Larry Page and Sergey Brin, whom I met not in Silicon Valley but in Davos, the Swiss gathering spot of the global power elite, where they had taken over a small chalet to meet with a select group of media. The year was 2007. The company had just purchased YouTube a few months back, and it seemed eager to convince skeptical journalists that this acquisition wasn't yet another death blow to copyright, paid content creation, and the viability of the news publications for which we worked. Unlike the buttoned-up consulting types or the suited executives from the old-guard multinational corporations that roamed the promenades of Davos, their tasseled loafers slipping on the icy paths, the Googlers were a cool bunch. They wore fashionable sneakers, and their chalet was sleek, white, and stark, with giant cubes masquerading as chairs, in a space that looked as though it had been repurposed that morning by designers flown in from the Valley. In fact, it may have been, and if so, Google would not have been alone in such excess; I remember attending a party once in Davos, hosted by Napster founder and former Facebook president Sean Parker, that featured giant taxidermy bears and a musical performance by John Legend. Back in the Google chalet, Brin and Page projected a youthful earnestness as they explained the company's involvement in authoritarian China and insisted they'd never be like Microsoft, which was considered the corporate bully and monopolist at the time. What about the future of news, we wanted to know? After admitting that Page read only free news online, whereas Brin often bought the Sunday New York Times in print ("it's nice," he said cheerfully), the duo affirmed exactly what we journalists wanted to hear. Google, they assured us, would never threaten our livelihoods. Yes, advertisers were indeed migrating en masse from our publications to the web, where they could target consumers with a level of precision that the print world could barely imagine, but not to worry: Google would generously retool our business models so we too could thrive in the new digital world. I was much younger then, and not yet the admittedly cynical business journalist that I have since become, and yet I listened skeptically to that happy future-of-news lecture. Whether Google actually intended to develop some brilliant new revenue model or not, what alarmed me was that none of us were asking a far more important question. Sitting towards the back of the room, somewhat conscious of my relatively junior status, I hesitated, waiting until the final moments of the meeting before raising my hand. "Excuse me," I said, "we're talking about all this like journalism is the only thing that matters, but isn't this really about democracy? If newspapers and magazines are all driven out of business by Google or companies like it," I asked, "how are people going to find out what's going on?" Larry Page looked at me with an odd expression, as if he were surprised that someone should be asking such a naive question. "Oh yes, we've got a lot of people thinking about that." Not to worry, his tone seemed to say: Google had the engineers working on that little democracy problem. Next question.

I read that because I'm kind of amazed that there is still a real lack of understanding, I think, in the Valley about some of the real negative externalities of what have been, let's face it, amazing technologies. I mean, where would we be without search, or our smartphones? We're all carrying around the power of a mainframe in our pockets. But as a journalist, I think there's really been an inability of these companies to own up to, you know, some of the bad stuff that they have wrought, and I think that still continues to be the case. One of the other points I try to make in the book is that the problems I'm talking about have actually moved outside of just the big four platform firms: we're moving into a world in which surveillance capitalism is going to be part of the healthcare system and the financial system, and really every kind of business is now using this as its model. So, for example, if you buy coffee at Starbucks, Starbucks knows a lot about you; Johnson & Johnson knows a lot about you; there are firms watching you all the time. And so we're really at a pivot point, I think, where we have to ask as a society: what are the deeper implications of this, and are we okay with them? So I would like to read another excerpt, where I look at how this model is moving into the insurance sector and what that means.

So far, data has been obtained via computers and mobile devices, but now, with the rise of personal digital assistants like Amazon's Alexa, Google's Home Mini, and Apple's Siri, now in a third of American homes, with triple-digit sales growth a year, the human voice is the new gold. While reports of Alexa and Siri listening in on conversations and phone calls are disputed, there's no question that they can hear every word you say, and from there it's a short step to them using that knowledge to direct your purchasing decisions. It isn't much of a longer step to see the political implications: already some researchers worry that digital assistants will become even more powerful tools than social media for election manipulation. Certainly none of us will be unaffected. Consider that homeowner... oops, sorry, I'm reading from the wrong part, I think; apologies, I somehow picked the wrong section here. Anyway, I'm going to talk you through this example, because it's something that is already out there. I had a conversation a couple of years ago with an executive from Zurich Financial, which is a big financial company; they do insurance in many parts of the world. They will now, if you'd like them to, put sensors in your home or in your car. And if, for example, as I do, you live in a 1901 townhouse, and let's say you're upgrading your pipes, you get a positive mark, and you may see your insurance premium go down. But let's say your kid is smoking a joint in their bedroom and the sensor picks up on that: you then get a black mark, and your premium may go up. Same again in your car: if you're speeding, your insurance company will know, and so on and so forth. Now, you can either like this or not, depending on where you sit in the socio-economic spectrum, but what's very, very interesting is that that entire business model, a pooled-risk business model, which is what insurance is, has now been completely disaggregated, so you can be targeted and split. This is no longer about society pooling risk; this is about individuals having to own the risk. If you take that to its natural conclusion, you can imagine an elite up here that has access to special pricing and all kinds of great products, but you can also imagine an uninsurable group of people at the bottom. And then who is going to pick up that risk? The public sector, maybe? Maybe there'll be a junk bond market for insurance. Either way, you have a split in society that didn't exist before. And that was always the business model here. You know, go back and read some of the early work of someone like Hal Varian, for example, who was the chief economist at Google: splitting pricing down to the individual was always the point of platform technology firms like Google or Facebook or Amazon, splitting individuals out so they could be targeted in different ways. But that not only splits pricing, it splits society, and that's really the core issue I want to get at here.
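The pooled-versus-individualized distinction is easy to put in numbers. A hypothetical sketch, with invented risk scores, of how the same total risk gets repriced once sensors let an insurer score each household individually:

```python
# Pooled vs. sensor-scored insurance premiums. All numbers are hypothetical.
# risk = expected annual claim cost per household, as a sensor might score it.
households = {"low_risk": 400.0, "average": 1000.0, "high_risk": 2600.0}

# Pooled model: everyone pays the average expected cost.
pooled_premium = sum(households.values()) / len(households)
print(f"pooled premium for everyone: ${pooled_premium:.0f}")  # $1333

# Disaggregated model: each household pays its own scored cost.
for name, risk in households.items():
    print(f"{name}: ${risk:.0f}")
# Total risk is unchanged, but the high-risk household's price nearly
# doubles: the "split" moves risk from shared pooling to the individual.
```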
I think I'll maybe read just one more excerpt, and then (do we have time? yeah) we'll open it up for questions after that. My first book, just to mention it again, was about the financial industry, and one of the things that strikes me is that big tech companies have in some ways become the new too-big-to-fail entities. Not only are they holding more wealth and power than the largest banks, but in some ways they function like banks: they have a tremendous amount of money, and they use it to buy up corporate debt. If that debt were to go bad, that could actually be the beginnings of another financial crisis. So that's a part of this story that really hasn't gotten out there. Let me read just two or three more pages for you on that topic.

The late, great management guru Peter Drucker once said that in every major economic downturn in U.S. history, the villains have been the heroes during the preceding boom. I can't help but wonder if that might be the case over the next few years, as the United States and possibly the world heads towards its next big slowdown. Downturns historically come about once every decade, and it's been more than that since the 2008 financial crisis. Back then, banks were the too-big-to-fail institutions responsible for our falling stock portfolios, home prices, and salaries. Technology companies, by contrast, have led the market upswing over the past decade, but this time around it's the big tech firms that could play the spoiler role. You wouldn't think it could be so when you look at the biggest and richest tech firms today. Take Apple, for example: Warren Buffett says he wishes he owned even more Apple stock; Goldman Sachs is launching a new credit card with the tech titan, which became the world's first trillion-dollar-market-cap company in 2018. But hidden within these bullish headlines are a number of disturbing economic trends of which Apple is already an exemplar. Study this one company and you begin to understand how big tech companies, the new too-big-to-fail institutions, could indeed sow the seeds of the next financial crisis. The first thing to consider is the financial engineering done by such firms. Like most of the largest and most profitable multinational companies, Apple has loads of cash, about $300 billion, as well as plenty of debt, close to $122 billion. That's because, like nearly every other large, rich company, it has parked most of its spare cash in offshore bond portfolios over the last ten years. At the same time, since the 2008 crisis, it has issued cheap debt at low rates in order to do record amounts of share buybacks and dividends. Apple is responsible for about a quarter of the $407 billion in buybacks announced since the Trump tax bill was passed in December of 2017. But buybacks have bolstered mainly the top 10% of the U.S. population, which owns 84% of all stock. The fact that share buybacks have become the biggest single use of corporate cash for over a decade now has buoyed markets, but it has also increased the wealth divide, which many economists believe is not only the single biggest factor in slower-than-historic trend growth but is also driving the political populism that threatens the system itself.
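To see why buybacks count as financial engineering, it helps to run the arithmetic: retiring shares raises earnings per share with no change in the underlying business. A sketch with invented round numbers (not Apple's actual figures):

```python
# How a debt-funded buyback lifts earnings per share (EPS) with no change
# in operating performance. All figures are hypothetical round numbers.
earnings = 50e9         # annual net income, dollars
shares_before = 5e9     # shares outstanding
share_price = 200.0     # dollars per share
buyback_spend = 100e9   # debt-funded repurchase

shares_retired = buyback_spend / share_price   # 0.5 billion shares
shares_after = shares_before - shares_retired  # 4.5 billion shares

print(f"EPS before: ${earnings / shares_before:.2f}")  # $10.00
print(f"EPS after:  ${earnings / shares_after:.2f}")   # $11.11
# Same company, same profits: an 11% EPS "improvement" from the capital
# structure alone, which is what flatters the stock price.
```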
That phenomenon has been put on steroids by the rise of yet another trend epitomized by Apple: intangibles, such as intellectual property and brands, now make up a much larger share of wealth in the global economy. The digital economy has a tendency to create superstars, since software and internet services are so scalable and enjoy network effects. But as software and internet services become a bigger part of the economy, they reduce investment across the economy as a whole, and that's not only because banks are reluctant to lend to businesses whose intangible assets may simply disappear if they go belly-up, but because of the winner-take-all effect that a handful of companies, including Apple, Amazon, and Google, enjoy. To sum this up in plain English: as this handful of companies has gotten bigger and more powerful, investment in the overall economy has declined, and the number of jobs they're creating relative to their market size is much lower than in the past. So you have this superstar economy that has become kind of a winner-take-all game. I think we're probably going to see some kind of market correction in the next couple of years, and it's going to be very interesting at that point to see whether tech leads the markets down, and whether you might then see a kind of Occupy Silicon Valley sentiment, as you did in 2008 with Occupy Wall Street. I think that's really quite possible. We can delve more into that if you'd like, but I think I want to stop here and be respectful of question time. If there are parts you guys want to hear more about, or particular areas I could read more from, you can let me know. Go ahead.

Sure, because we don't get to speak very often, you and I, two questions. One is: you've doubtless read about Bloomberg's decision recently to forbid its reporters from covering Michael Bloomberg, yet The Washington Post has no problem investigating Bezos. Do you see... is that a problem for you? Have you thought about that? Is that an inconsistency that should bother financial journalists? And the second question is: how important to any solution to the problems you raise would a revival of antitrust be, as we see on the Continent, where it's more aggressive, and among some of the Democratic candidates for president?

Well, so let me take the antitrust question first; that's actually an important part of the book, and there's an entire chapter on antitrust. I think we probably are going to see some shifts. As folks may know, from the 1980s onward, antitrust in America has basically been predicated on price: as long as consumer prices were falling, it was perceived that companies could be as big as they wanted, that it wasn't a problem. But one of the things I look at in the book is this shift to a world in which transactions are being done not in dollars but in data. That's really a barter transaction, and one of the things that's so interesting, and this is actually another way in which Silicon Valley is similar to Wall Street, is that the transaction is really opaque. You don't know, essentially, how much you're paying for the supposedly free service that you're receiving. That is a very difficult market to create fairness within, and it makes the Chicago School notion that falling consumer prices mean no problem, I think, probably irrelevant.
So there's two ways in which that's being dealt with. You have the rise of this new Brandeis school of thinking, in which, you know, maybe this is really about power; maybe we should think about the big tech firms the way we do the nineteenth-century railroads, right? You had, at one point, railroad titans that would come in and build tracks, and then own the cars, and then own the things that were in the cars, and eventually that became a zero-sum game. And as folks probably know, we're in a period in which there's as much concentration of wealth and power as there was in the Gilded Age. So I could imagine very easily a scenario in which you could justify Amazon, say, being the platform for e-commerce, but not being able to compete in the specific areas of fashion or, you know, whatever else it sells, against other sellers. In fact, that's already the case in the financial sector: big companies that trade, let's say, aluminum, as Goldman Sachs did (this is what it ran into a suit over a few years ago), can't both own all the aluminum and trade it; that's anti-competitive, and so it became an issue for the Fed. So I think we probably are going to see that kind of ruling. As for the Post and journalism: you know, it's funny, I have some friends who are quite influential at the Post, and they say that Bezos is pretty hands-off. I can't vouch for that. One thing I will say is that Amazon did put this book on its top-20 nonfiction list this month, so, you know, I don't know if that's a ploy to make me think they're being really fair. But that's probably not Jeff Bezos; he's probably not thinking that much about this book, or me. Anyway, next question. Go ahead.

So it seems like some of the major decisions that these big tech companies are making are in regard to fake news and how they're moderating fake news, or the lack of it. Have you seen an approach by any current social media platform, or any proposed plans, that you think would be best for moderating fake news?

That's such a good question. Just to pull back, the two points of view on that are: hey, look, the platform tech companies are essentially giant media and advertising firms, right? I mean, if you look at the business model of a Google or a Facebook, it's essentially just like the Financial Times or CNN; it's just much more effective, and it can be targeted to the individual. That means these firms have taken, you know, 85 to 90 percent of the new digital advertising pie in the last few years. Now, given that they function as media companies, should they not be liable for disinformation in the way that a media company would be? If I print something incorrect at the FT, that's the paper's hide, and also my hide, on the line. I think we should actually think about rolling back some of those loopholes that these firms have enjoyed from the mid-1990s onwards; I think they are going to have to take some responsibility. Now, the question is: do we want Mark Zuckerberg being the minister of truth? And that's a really tough question. What I would prefer is for democratically elected governments to come up with rules about what is and isn't appropriate, and not to have individual companies making those choices. I think we're in a period right now where, you know, you've got Twitter, and you've got Google to a certain extent, coming out saying: okay, we recognize we need to do things differently. That's putting pressure on Facebook. But at the end of the day, we're going to have to have, I think, an entirely new framework, not just in this area but also in taxation and in antitrust, which we've already talked about. The shift we're going through is, I think, the new Industrial Revolution; it's a 70-year transition, and it's going to require a lot of different frameworks relative to what we already have. So the answer is no: I don't see any particular company that has come up with the right framework yet. Any other questions?

Oh, yeah. I'd like to go back to antitrust for a minute. The Washington Post put up an article just this afternoon about how Apple is changing its business model. As you know, it's differentiated itself in the market by saying it cares about privacy. Well, now it is moving from a device company to a services company, according to the article, and it is using privacy as a lever to provide services that other, smaller companies, like Tile (which is the example the article uses), have used to create a market for themselves, right? And so it says in the article that the feds are considering looking at antitrust measures against Apple, but I think it raises a bigger question that you pointed to, which is that the models of antitrust don't work anymore. So, in terms of privacy: lots of people have talked about monetizing privacy, getting paid for your data. But how do you think, from an economic point of view, we as a society need to look at the role of privacy and the role of antitrust together, to somehow change the way we think about these companies? Because in addition, we've got consolidation in the marketplace, so there's no longer fair competition; you can't easily become another Amazon, because the players are so big and there are so few of them in each part of the economy.

Yeah, right. So there's a lot in what you've just said. For starters, I think you're hitting on something really important, which I get at in my solutions chapter: this is such a huge shift, and it's touching so many different areas. We've talked about privacy, we've talked about antitrust; we haven't even gotten into national security, you know, civil liberties. There are so many different areas. One of the things I noticed when I sat down to write the solutions section (you know, when you do a think book you always have to have the solutions section, and the publisher wants that silver-bullet thing) is that when you pull a lever here, it affects something in these other areas. So I think that's one reason why we should have a national committee to actually look at what all the questions are. When I speak to folks, particularly policymakers in D.C., there's the antitrust camp here, the privacy camp here, the security folks there. That conversation needs to be happening in a 360 way, and it is happening much more so that way in Europe, I will say. I just came off two weeks of book touring in Europe, and the conversation there, I think, is much more developed. And, to go to your point about the ecosystem and how to share it: one of the things folks seem to be headed towards is a public digital commons, a kind of a database. Let's say, all right, the cat seems to be out of the bag that we're going to allow surveillance capitalism (there are certain folks, like Shoshana, who would love to see the dial turned back; I'm not sure that's possible), so let's have a public database in which not just one corporation or a handful of corporations, but multiple-sized players, as well as the public sector, as well as individual citizens (whose data, after all, is being harvested), everybody, gets access, and then you can figure out how you want to share the pie. One interesting example recently is the Google Sidewalk project in Toronto. It sounds like you're up on these issues, so you're probably aware, but Google had taken over some twelve acres on the Toronto waterfront and put sensors everywhere, and the idea was to create a smart city in which you'd be able to manage traffic patterns and energy usage and things like that. But until recently, Google was going to own all that data and have access to it. Finally the Toronto government got a clue and said: well, actually, you know what, let's put this in a public database, so other smaller or midsize local firms can come in and be part of that economic ecosystem, but also, as a public sector, we can decide that maybe we want to share data for energy issues or for health issues, but maybe we don't want to share it for certain other kinds of things. And perhaps there would be some way in which individuals could take back some of that value. California is thinking about a digital dividend payment from the big tech companies, and there's also been talk of a digital sovereign wealth fund: if you think about data as the new oil, whatever the value is judged to be, it could be put in a sovereign wealth fund, in the same way that Alaska or Wyoming give back payments or use that money for the public sector. That could be done with data, too. So I think something like that is probably going to be the best solution.
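The Alaska analogy reduces to one line of arithmetic: a fund's distributed returns divided across residents. A sketch for a hypothetical data fund; every figure is invented, purely to show the mechanics:

```python
# An Alaska-style dividend from a hypothetical data sovereign wealth fund.
# All figures here are invented, purely to show the mechanics.
fund_value = 200e9     # dollars assessed as the public's share of data value
annual_return = 0.05   # fund investment return
payout_share = 0.5     # fraction of returns distributed (rest reinvested)
residents = 40e6       # eligible population (roughly California-sized)

dividend = fund_value * annual_return * payout_share / residents
print(f"annual dividend per resident: ${dividend:.2f}")  # $125.00
```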
I'll tell you, I have many examples in the book of ways in which the bigger players have been able to squash small and mid-sized firms, and that's a major issue. A lot of venture capitalists I speak to are actually becoming concerned about that, because they say there are sort of kill zones of innovation: where Amazon is, or Google is, you really can't start a business; too much has just been ring-fenced. Question over here.

Yes. While your book may be the best one on the subject, there have certainly been other books before it talking about individuals' privacy and their data and everything about them. Why is it, do you think, that people are so unconcerned about handing over all of their data to these companies when they are perhaps very concerned about handing it over to the government? Why do they feel these guys are the good guys and the government is necessarily the bad guy?

Yeah, it's such an interesting question, and it really varies from country to country; that's sort of an interesting cultural dynamic that can shift depending on what market you're in. I have really been puzzled as to why people are so... first of all, why everybody just clicks the box and says no problem. I think part of that is the opacity of the market. I mean, if you go back to Adam Smith, basic economics, you need three things to make a market function properly: equal access to data, transparency in the transaction, and a shared moral framework. And you could argue that none of those things are in place when we're making these transactions. I think as that very fact becomes better explained, and people begin to understand that narrative (like the insurance example I just gave: all right, you're getting something, but you're giving up a lot), I'm beginning to see pushback already. And I suspect, in recent weeks, as some of the big players have moved into healthcare and into the commercial banking business, that we are going to begin to see more people being reluctant to give up that much value for what they're getting. You're also, interestingly, seeing that when there are other options, people will go elsewhere. So Jimmy Wales, who started Wikipedia, came up a few weeks ago, I think, with a new social networking site; he's already got 300,000 users there, and it's ad-free: they don't do targeted advertising; it's run on the wiki model, where you can donate if you want. I think once the antitrust piece is in place, and you actually have space for new competitors to come in and offer up different kinds of services that perhaps are more respectful of privacy, you could see a shift there. But I'm curious, actually: can I poll the audience for a minute? I want to ask: how many people think that in the next five years individuals are going to become more worried about giving up information, in a way that's going to change their behavior online? So, like two-thirds, but not... yeah, that's interesting. Okay. Oh, go ahead, sorry.

We're sheep. We're cheap.

Oh my god, that was a different book.

I'm curious if you see the administration's suggestion that California can't set its own rules for gas mileage and emissions and so on as having a parallel in this area.

You know, I hadn't thought about that question before. I always think about California as really leading on what is eventually going to become the national standard, and in data I feel like that is going to happen. You know, even the Europeans, in fact, are saying that the California model is probably the better model for data protection and privacy and sharing of value. The Europeans have GDPR, which was kind of the first step in the privacy direction, but it doesn't take into account that economic ecosystem; so, perversely, you have the big companies maybe being able to do better with the GDPR model, and smaller ones getting cut out of the loop because they don't have the legal muscle to deal with all the rules. So I do think the California model is going to become a de facto standard. We also haven't talked about China, which is of course going its own way; I have a chapter in the book where I look at that, where I look at the current trade war, tech war, through the lens of surveillance capitalism, and that's going to be very interesting. Probably the biggest mid- to long-term economic question, for me, is: are we going to see a transatlantic alliance around digital trade and coming up with some standards? Because China is going its own direction; it's going to develop its own ecosystem; it has its own big players. The U.S. is in another category. But where is Europe going to be? Is it going to be a tripolar world, is it going to be a bipolar world, in terms of how all this works? That's a major economic, and actually foreign policy, question, I think.

Hey, thanks for coming and speaking. I'm wondering: we have a Department of Agriculture, we have a Department of Energy; will there ever be a Department of Technology in the U.S.? And which other countries already have that kind of thing going?

Yeah, England is talking about that, actually. I think kind of an FDA of technology is probably a very good idea. You know, going back to the example about my son: the research is nascent, and causality is difficult to prove, but there's a new body of research, since 2011 or 2012, when smartphones really became ubiquitous, showing that levels of anxiety and depression in younger people are rising; there are issues of self-harm, sometimes, when people use these technologies addictively. So I think that's a big issue. To me it's very similar to cigarettes: those were regulated, there was a different narrative, and then behaviors changed. I think that's one area to consider, policy-wise. There may be time for one or two more questions. Okay, sorry: over here, and then over here.

Hi. I'm kind of curious what you think about the fact that most of these conversations around technology, or even democracy, tend to focus on institutions and systems and structures, which is great, because they are so powerful and ubiquitous. My background is in teaching critical thinking and in conflict management, and what I worry about is that so little attention is being paid to the intelligence and maturity of the citizenry. I'm from India; after 70 years of democracy, we've lost it. I think it's simplistic to blame the right-wing leaders and the government. I believe we as a people have not developed the maturity to be effective, intelligent citizens. We don't have the values; we are still feudal, we are still extremely hierarchical. We don't have the democratic values in India, and we didn't cultivate them over 70 years. I see a parallel in being susceptible to the seductions of technology, whether it be free news or the clickbaiting or anything that the big companies seduce us with: even as we need, as you said, an FDA kind of thing for technology, we seem to be absolving ourselves of the responsibility of, you know, waking up as Homo sapiens.

I hear what you're saying, and it's interesting; two things come to mind. First of all, as I say, I just got back from Europe, and the debate is much more nuanced there, and further along, and I think that's in part because there was not quite as much pendulum shift in the last 40 or 50 years from the public sector to the private sector as there was here. I'm not quite sure I agree entirely with your point about institutions. I think in some ways part of the problem, one of the reasons why we have concentration levels that are the same as they were in the 19th century, is that we have a generation of business leaders who grew up in the '80s thinking that the government was only good for cutting taxes, and there's a hyper-individualism that's the entire economic model. In some ways I think that Facebook is maybe the apex of the neoliberal economic model. If you think about the problems of globalization... it was supposed to be... globalization was supposed to be about
globalization was supposed to be about
44:28
capital goods and people crossing
44:30
borders well it turned out the capital
44:31
could cross a lot faster than either
44:33
goods or people if you take that into
44:36
the world of data that’s even more true
44:38
and so I think that you have a group of
44:41
companies now that have really
44:44
turbocharged a lot of the problems that
44:46
have given us the politics that we have
44:49
now and and a company like Facebook I
44:51
mean I think it every time Zuckerberg is
44:52
on the hill it’s like there’s this
44:54
attitude that they are supranational you
44:56
know and kind of flying 35,000 feet
44:59
above national concerns and I think that
45:02
that’s part of a larger shift and
45:04
probably going to be a big part of the
45:05
2020 debate right are we gonna now have
45:08
a pendulum shift back away from private
45:12
power to some public power some
45:14
different sharing of that which is a
45:16
values question which I think gets at
45:18
some of what you’re talking about
45:20
long-winded answer anyway I think we
45:22
have time for maybe one more question
45:23
yeah – quick question okay one is some
45:27
of the tech companies especially the
45:29
platform companies have you know why
45:32
should we not consider looking at them
45:35
as utility companies yeah I mean we’ve
45:39
had phone companies and as far as I know
45:41
they don’t data mine our conversations
45:43
and maybe mistaken a bit right I mean
45:46
right that’s they could easily right
45:49
right yes it’s different different
45:50
business model yeah yeah so so that was
45:52
one the other thing is you mentioned
45:54
that eventually we need tech policy
45:56
around this and the issue at least my
45:59
issue is that the people who make these
46:01
decisions the the policy makers they
46:05
just most to them don’t have the
46:07
technical background right to properly
46:10
assess the different choices and make
46:12
those decisions I mean I think one of
46:15
them Zuckerberg or someone testified the
46:17
questioning was just awful I mean they
46:20
just ignore our tech support was
46:22
terrible
46:23
yeah exactly so I know anyway whatever
46:27
thoughts you have no that’s a great and
46:29
that’s like maybe a great place to sort
46:31
of wrap up I think the utility model is
46:34
completely viable and it’s interesting
46:36
one of the bits of pushback that you’ll
46:38
sometimes get from folks in the valley
46:40
about that is well if we’re if we’re
46:42
split in this way or if the the capacity
46:46
to innovate is sort of you know
46:47
compressed as the profit margins would
46:49
be compressed in a utility model that’ll
46:52
be bad for innovation not really I mean
46:54
there’s the statistics show for starters
46:56
that companies innovate more when
46:58
they’re smaller they tend to innovate
47:00
more when they’re private and breakups
47:03
in the past you can argue have actually
47:05
created more innovation so a lot of
47:07
academics would say that even the the
47:10
the the antitrust just the threat of
47:13
antitrust action against Microsoft was
47:15
one of the reasons that Google was
47:16
allowed to to blossom as it did you can
47:19
go back to the breakup of the bells and
47:22
say maybe that created space for the
47:25
cellphone industry to to move ahead so I
47:28
think there’s a lot of examples that a
47:31
more decentralized model is actually a
47:34
good thing and I think that that is
47:36
actually going to be a really important
47:37
thing because right now there’s this I
47:40
think very perverse debate in the u.s.
47:42
that is bringing together parts of the
47:45
far right and parts the far left that
47:47
all right we need these companies to
47:48
stay big because they’re the national
47:50
champions and the the becoming war with
47:52
China that is a complete bunk that is
47:56
not shown out first of all I mean these
47:58
companies would love to be in China if
47:59
they could get into China you know I
48:02
think decentralized is the advantage in
48:06
all respects in the US economically so
48:09
yeah I’m have no problems with a utility
48:12
model anyway I think my time is up but
48:15
I’d be happy to sign books and answer
48:17
any other questions here at the table
48:18
and thanks so much for your attention
48:19
[Applause]
48:34
you

How Google Edged Out Rivals and Built the World’s Dominant Ad Machine: A Visual Guide

The U.S. is investigating whether the tech giant has abused its power, including as the biggest broker of digital ad sales across the web

Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites.

Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google.

Alphabet Inc.’s Google is under fire for its dominance in digital advertising, in part because of issues like this. The U.S. Justice Department and state attorneys general are investigating whether Google is abusing its power, including as the dominant broker of digital ad sales across the web. Most of the nearly 130 questions the states asked in a September subpoena were about the inner workings of Google’s ad products and how they interact.

We dug into Google’s vast, opaque ad machine and, in a series of graphics below, show you how it all works, and why publishers and rivals have had so many complaints about it.

Much of Google’s power as an ad broker stems from acquisitions of ad-technology companies, especially its 2008 purchase of DoubleClick. Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways.

In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products. Google operates the leading selling and buying tools, and the biggest marketplace where online ad deals happen.

When Nexstar didn’t use Google’s selling tool, it missed out on a huge amount of demand that comes through its buying tools, Mr. Katsur said: “They want you locked in.”
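To make the lock-in dynamic Mr. Katsur describes concrete, here is a minimal toy model in Python. It is an illustration only, not a description of Google’s actual systems: the tool names, fee rates and bid amounts are all invented for the sketch. The premise is the one in the article, that a dominant exchange’s buy-side demand bids only through its own selling tool.

```python
# Toy model of ad-tech lock-in (illustrative only; all names and numbers
# are hypothetical, not drawn from the article). If the dominant
# exchange's buyers bid only through its own selling tool, a publisher
# that switches to a cheaper rival tool loses access to those bids and
# can earn less per impression despite paying a lower fee.

from dataclasses import dataclass


@dataclass
class SellingTool:
    name: str
    fee: float           # share of the winning bid the tool keeps
    demand_sources: set  # buy-side sources that bid through this tool


def publisher_revenue(tool: SellingTool, bids: dict) -> float:
    """Highest bid reachable through this tool, minus the tool's fee."""
    reachable = [cpm for source, cpm in bids.items()
                 if source in tool.demand_sources]
    return max(reachable, default=0.0) * (1 - tool.fee)


# Hypothetical bids (CPMs) for a single impression.
bids = {"dominant_exchange_buyers": 4.00, "independent_dsp": 3.10}

dominant = SellingTool("dominant", fee=0.20,
                       demand_sources={"dominant_exchange_buyers",
                                       "independent_dsp"})
# The rival tool cannot reach the big exchange's buyers.
rival = SellingTool("rival", fee=0.10,
                    demand_sources={"independent_dsp"})

for tool in (dominant, rival):
    print(f"{tool.name}: ${publisher_revenue(tool, bids):.2f} per impression")
# dominant: $3.20 per impression
# rival: $2.79 per impression
```

In this toy setup the rival tool charges half the fee yet still pays the publisher less, because it cannot reach the highest bidder. That arithmetic, scaled up, is the kind of revenue hit Nexstar saw in its brief test.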