Jaron Lanier: How the Internet Failed and How to Recreate It

Transcript

00:03
[Music]
00:07
welcome everybody I’m Nathaniel Deutsch
00:09
I’m the director of the Humanities
00:11
Institute here at UC Santa Cruz and I
00:14
want to welcome everyone here to see
00:17
Jaron Lanier
00:19
he’ll be talking tonight I also want to
00:21
welcome everybody who is watching this
00:22
on a live stream that we have running at
00:26
the same time we’re very thankful for
00:28
the support from the Peggy Downes Baskin
00:31
Humanities Endowment for
00:32
Interdisciplinary Ethics for supporting
00:35
this lecture we’re also very thankful
00:37
to the Andrew W. Mellon Foundation for
00:39
supporting a year-long series of events
00:42
that we will be hosting at the Humanities
00:43
Institute on data and democracy and
00:46
Jaron Lanier’s talk is going to be
00:48
launching that series in addition to the
00:52
event tonight we will be hosting in the
00:55
coming year a series of other events
00:57
including Questions That Matter at
01:00
Kuumbwa Jazz Center on January 29th which
01:04
we invite all of you to I know
01:06
some of you have been to some of our
01:07
past Questions That Matter events and
01:09
also an event that we have been planning
01:13
actually for a while and has become even
01:16
more necessary because of the events of
01:18
recent days and that is a conversation
01:21
on anti-Semitism on the internet which
01:25
we will be hosting on a date to be
01:27
announced
01:28
I want to thank there’s many people I
01:30
could thank but I’ll leave out the names
01:33
I’ve already cleared it with them I’m
01:34
just gonna thank their units the
01:38
Humanities Institute staff which is as
01:39
always amazing and then the staff of the
01:43
humanities division’s development office
01:45
which is also always amazing so thank
01:47
you everyone for all the work that went
01:49
into this event tonight’s program will
01:52
include a lecture followed by again I
01:54
think some music followed by a
01:58
question-and-answer session and book
01:59
signing and I’ll be talking a little bit
02:01
more about the book signing later
02:03
questions and answers will be
02:04
facilitated by note cards and we have
02:07
some ushers that are moving around
02:10
the room and if you would like to
02:13
ask a question please raise your hand now
02:14
and they will hand you cards and you can
02:18
write out the question they’ll pick
02:20
it up and then they’ll give it to me and
02:21
I will be facilitating the question and
02:23
answer that way so I’ve had the pleasure
02:29
of spending the day with Jaron and I can
02:32
tell you that he is a fascinating person
02:35
a very generous person as well with his
02:39
time he met with some students earlier
02:41
today and had a conversation with them
02:44
which was wonderful for him to do and
02:47
tonight he will be giving a lecture
02:50
we’re lucky to have him here he’s a
02:53
pathbreaking computer scientist a virtual
02:55
reality pioneer if I’m not mistaken you
02:58
coined the phrase virtual reality
03:02
he’s a composer and artist and author
03:04
who writes on numerous topics including
03:06
technology the social impact of
03:08
technology the philosophy of
03:09
consciousness and information internet
03:11
politics and the future of humanism and
03:14
one of the things that we believe in so
03:16
strongly at the Humanities Institute is
03:18
that conversations about technology
03:19
cannot simply be left to computer
03:22
scientists no offense to any computer
03:24
scientists in the room we love you too
03:25
but we also think that it is critical to
03:28
have people who work in the humanities
03:30
involved in those conversations and this
03:32
is part of why we are doing this tonight
03:35
he is the author of best-selling and
03:37
award-winning books including You Are
03:39
Not a Gadget A Manifesto and Who Owns
03:41
the Future most recently he’s the author
03:43
of Ten Arguments for Deleting Your Social
03:45
Media Accounts Right Now his lecture
03:49
tonight is entitled How the Internet
03:51
Failed and How to Recreate It please
03:54
join me in welcoming Jaron Lanier
03:56
[Applause]
04:04
hey how are you all any students here is
04:12
this all this is the adults okay good
04:15
good ah good excellent
04:17
here I’m going to start with some music
04:21
because some of what I have to talk
04:24
about is not the most cheerful stuff
04:26
because our times aren’t universally
04:28
cheerful lately and music is how I
04:32
survive anyway have any of you heard me play
04:36
this thing okay
04:45
[Music]
05:06
[Applause]
05:44
[Music]
05:57
[Music]
06:06
[Music]
06:16
you all know what this is you all know
06:19
what that is right yeah it’s called a
06:24
khaen
06:25
it’s from Laos it’s arguably the origin
06:33
of digital information if you look at it
06:38
it’s got a parallel set of objects that
06:42
are either off or on there’s 16 of them
06:45
in this one a 16-bit number they go back
06:49
many thousands of years they appear to
06:52
be older than the abacus in ancient
06:55
times they were traded across the Silk
06:58
Route from Asia and were known to the
07:01
ancient Greeks and Romans the Romans
07:04
made their own copy which was called a
07:06
hydraulis and it was a giant egotistical
07:10
Roman version that was so big it had to
07:14
be run on steam it was operated by teams
07:19
of slave boys because despite
07:22
Hephaestus’s best efforts they didn’t have
07:24
computer AI yet and the slave boys
07:29
couldn’t quite operate all the planks
07:31
that opened and closed the holes in sync
07:33
and so they developed this crossbar
07:35
system and we know about it because
07:37
there’s a surviving hydraulis believe it
07:39
or not and that automation evolved along
07:45
with the hydraulis in two directions
07:47
it turned into the mediaeval pipe organ
07:50
and there were player mechanisms on the
07:52
earliest pipe organs experimentally and
07:55
it also turned into a family of string
07:58
instruments that had various assists
08:02
like the early pre clavichord
08:04
instruments that eventually evolved into
08:07
the piano the notion of automating these
08:10
things was always present so there were
08:12
always attempts to make player pianos
08:14
around Mozart’s time somebody made a
08:18
non-deterministic player piano which
08:20
meant it didn’t play exactly the same
08:22
thing twice Mozart was inspired by that
08:26
made some music that included dice rolls
08:29
but another person who was inspired was
08:32
a guy named Jacquard who used a
08:34
similar mechanism to make a programmable
08:36
loom that in turn inspired somebody
08:40
named Charles Babbage to make a
08:42
programmable calculator and Lord Byron’s
08:46
daughter Ada Lovelace
08:46
to articulate a lot of ideas about
08:48
software for the first time and what it
08:49
meant to be a programmer and then in
08:52
turn that all inspired a doomed fellow
08:55
named Alan Turing to formalize the whole
08:58
thing and invent the modern computer so
09:01
there’s a direct line this is it this is
09:03
the origin of digital information now of
09:07
course it’s not the only line and if I
09:10
was paid to be a historian I
09:12
wouldn’t have told you that story with
09:14
such authority and yet I’m not so this
09:23
is a charming tale it’s a happy place to
09:26
begin it’s a reminder that
09:30
inventions can bring delight and joy and
09:33
it’s part of why I’m a technologist but
09:38
unfortunately we have some matters to
09:40
discuss here that are not quite so happy
09:45
we live in a world that has been
09:50
darkening lately it’s not just a
09:57
historical lensing effect where it feels
10:00
worse than ever it’s bad in a new way
10:03
there’s something weird going on and I
10:05
want to begin by trying to distinguish
10:09
what’s going on with our present moment
10:11
of darkness as compared to earlier times
10:14
because this is tricky
10:16
it’s almost impossible I think to not be
10:20
embedded in one’s moment in time it’s
10:22
almost impossible not to have illusions
10:27
due to where you’re situated right and
10:29
so I don’t claim to have perfected the
10:32
art of absolute objectivity at all I’m
10:35
struggling and I’m sure that I don’t
10:38
have it quite right but I want to share
10:39
with you my attempts up to this point
10:41
now the first thing to say is that by
10:46
many extremely crucial measures we’re
10:49
living in spectacularly good times we’re
10:53
the beneficiaries of a steady
10:55
improvement in the average standard of
10:58
living in the world we’ve seen a
11:01
lowering of most kinds of violence we’ve
11:04
seen an improvement in health in most
11:07
ways and for most people it’s actually
11:11
kind of remarkable in many ways these
11:14
are really good times and those trend
11:17
lines go way back over centuries
11:20
we’ve seen steady improvement
11:22
society’s kind of gotten its act
11:23
together and we’ve been able to hold on
11:26
to a few memories about things that
11:28
didn’t work so we’ve tried new things
11:30
we’ve developed relatively more
11:33
humane societies and relatively better
11:36
science and better Public Health and
11:38
it’s amazing it’s wonderful it’s
11:42
something that’s a precious gift to us
11:47
from earlier generations that we should
11:49
be unendingly grateful for and I always
11:54
keep that in mind I always keep in mind
11:56
that just in our modern human-made world
11:59
just the fact you can walk into a
12:01
building and it doesn’t collapse on you
12:02
is a tribute to the people who made it
12:04
and the people who funded them and
12:07
regulated them and the people that
12:09
taught them there’s like this whole
12:11
edifice of love that’s apparent all the
12:14
time that we can forget about and during
12:16
times that feel dark one of the
12:18
antidotes is gratitude and just in these
12:20
simple things
12:21
I feel extraordinary gratitude and it
12:25
reminds me of how overall there’s been
12:27
so much success in the project of
12:29
Science and Technology it’s so easy to
12:33
lose sight of that and yet there is
12:35
something really screwy going on that
12:38
seems to me to be fairly distinct from
12:42
previous problems it’s a new sneaky
12:45
problem we’ve brought upon ourselves and
12:48
we have yet to fully invent our way out
12:50
of it
12:51
so what exactly is going on
12:54
I think at a most fundamental level
12:59
we’ve created a way of managing
13:03
information among ourselves that
13:05
detaches us from reality I think that is
13:10
the most serious problem if the only
13:14
problem was that our technology makes us
13:18
at times more batty
13:21
more irritable more paranoid more
13:26
mean-spirited more separated more lonely
13:30
if that kind of problem was what we were
13:33
talking about that would be important it
13:36
would be serious it would be important
13:38
to address it but what really scares me
13:42
about the present moment is that I fear
13:44
we’ve lost the ability to have a
13:47
societal conversation about actual
13:49
reality about things like climate change
13:53
the need to have adequate food and water
13:56
for peak population which is coming the
13:58
need for dealing with changes in the
14:01
profile of diseases that are coming
14:04
there are so many issues that are real
14:08
they’re not just fantasy issues they’re
14:10
existential real issues climate above
14:14
all and the question is are we still
14:18
able to have a conversation about
14:20
reality or not
14:21
that becomes the existential question of
14:23
the moment and so far the way we’ve been
14:26
running things has been pulling us away
14:28
from reality that scares me and I think
14:32
that’s the core darkness that we have to
14:34
address we can survive everything else
14:36
but we cannot survive if we fail to
14:38
address that now in the title of this
14:42
lecture I promised a little bit of
14:44
history how the internet got screwed up
14:46
or something like that so I’ll tell you
14:48
a bit about that but I want to focus
14:51
more on trying to characterize this
14:54
issue a little more tightly and trying
14:58
to explain at least my thoughts on how
15:00
to remedy it and maybe some other
15:02
people’s thoughts to try to give you a
15:04
bit of a sense of it
15:06
now to begin with one of the infuriating
15:11
aspects of our current problem is that
15:13
it was well foreseen in advance that’s
15:16
the thing about it nobody can claim that
15:18
they were surprised and I can point to
15:22
many folks who were talking about this
15:24
in advance as good a starting
15:26
place as any is to talk about E. M.
15:28
Forster’s story The Machine Stops who
15:31
here has read it ok well a few people
15:36
terrifying right all right
15:38
The Machine Stops was written I believe
15:41
in 1907 is that right it might have been
15:43
in ’09 but you know a century and a
15:47
decade ago or so and it foresees a world
15:50
remarkably like ours it’s a world and
15:53
this was written well before Turing
15:54
well before any of this stuff
15:57
I mean before there was computation and
15:59
it describes a world of people in front
16:02
of their screens interacting social
16:04
networking doing search and getting lost
16:07
in a bunch of stupid stuff and
16:10
finally when the machine experiences a
16:14
crash there’s this calamity on earth and
16:17
people become so dependent on it that
16:19
the loss of this machine becomes a
16:21
calamity in itself and at the very end
16:23
of the book people are crawling out from
16:24
their screens and looking at the real
16:26
world and saying oh my god the Sun and
16:29
it’s like this it’s a really amazing
16:32
piece because it’s possibly the most
16:34
prescient thing that’s
16:37
ever been written at all it was written
16:40
in part as a response to the techie
16:44
utopianism of the day it was a response
16:47
to writers like HG Wells saying wait a
16:50
second these are still going to be
16:52
people we have to think about what this
16:53
will mean to people it’s often the case
16:56
that the first arriver on a scene has
16:58
a clearer view and can have this kind of
17:01
lucidity that later people find it very
17:03
difficult to achieve and I think
17:04
something like that happened very long
17:06
ago but then honestly we could talk
17:11
about Turing’s last writings just before
17:13
his suicide where he was realizing that
17:16
even though he played as great a role as
17:19
anyone in defeating fascism he hadn’t
17:21
defeated fascism at all because here he
17:24
was being destroyed for his identity you
17:28
all know the story of Turing by now it’s
17:30
not obscure anymore there was a movie
17:31
and everything for a long time I would
17:33
speak to computer science classes nobody
17:35
knew about Turing’s death at all which
17:37
is a scandal but at this point I think
17:39
everyone knows and if you read his final
17:42
writings you read this kind of in a way
17:45
an inner glow of somebody who does have
17:48
some kind of a faith and some kind of a
17:50
stronger Center but also this kind of
17:52
sense of defeat and by the way it’s
17:55
within the context of that that he
17:57
invented artificial intelligence that he
17:58
invented the Turing test and this notion
18:00
of this person this
18:03
non-person who could transcend sexuality
18:06
and be just this pristine abstract
18:08
platonic being and escape oppression
18:11
perhaps but anyway so we have that in
18:15
the immediate early generation of
18:17
computer scientists we had Norbert
18:19
Wiener who here has read Norbert Wiener
18:22
I don’t see a single young person’s hand
18:26
up and so if you’re young if you’re
18:29
a student and you haven’t read any of
18:30
these people would you please correct
18:31
that and read them seriously
18:33
you’ll be so happy if you take this
18:34
advice and actually read these people so
18:37
Norbert Wiener is one of the very first
18:39
computer scientists first generation and
18:43
he wrote books that were incredibly
18:45
prescient about this he wrote a book
18:46
called the human use of human beings and
18:49
he pointed out if you could attach a
18:51
computer to input and output devices
18:53
interacting with a person you could get
18:55
algorithms that would enact adaptive
18:58
behavioral technologies to take control
19:01
of the person and he viewed this as this
19:05
extraordinary moral failure that had to be
19:08
avoided and he sets out a thought experiment at the
19:10
end of the book he says well you could
19:12
imagine some kind of global system where
19:13
everybody would have devices on them
19:15
attached to such algorithms that would
19:17
be manipulating them in ways they
19:19
couldn’t quite follow and this would
19:21
bring humanity to a disastrous end but
19:23
of course this is only a thought
19:24
experiment no such thing is feasible
19:25
because there wouldn’t be enough
19:27
bandwidth on the radio waves and all
19:28
this
19:28
you know he then explained why it
19:30
couldn’t be done and of course we built
19:32
exactly the thing he warned about I
19:35
could give many other examples I worked
19:40
on it myself in 92 I wrote an essay
19:41
describing how little AI bots could
19:45
create fake social perception in order
19:47
to confuse people and throw elections
19:49
big deal
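[Editor’s illustration: the adaptive feedback loop Wiener warned about can be sketched as a toy bandit algorithm that measures a person’s responses and shifts toward whatever stimulus gets the most engagement, with no model of whether that is good for them. Everything below — the stimuli, the simulated user, the numbers — is hypothetical, not any real platform’s algorithm.]

```python
import random

STIMULI = ["outrage", "cute_animals", "celebrity_gossip", "conspiracy"]

def simulated_user(stimulus):
    """Pretend user who is secretly most responsive to outrage."""
    base = {"outrage": 0.8, "cute_animals": 0.5,
            "celebrity_gossip": 0.4, "conspiracy": 0.6}
    return random.random() < base[stimulus]  # True = user engaged

def adaptive_feed(rounds=2000, epsilon=0.1):
    """Epsilon-greedy loop: mostly show whatever has worked best so far."""
    shows = {s: 0 for s in STIMULI}
    clicks = {s: 0 for s in STIMULI}
    for _ in range(rounds):
        if random.random() < epsilon:        # occasionally explore
            s = random.choice(STIMULI)
        else:                                # otherwise exploit best-so-far
            s = max(STIMULI,
                    key=lambda s: clicks[s] / shows[s] if shows[s] else 1.0)
        shows[s] += 1
        clicks[s] += simulated_user(s)
    return shows

# The loop converges on whatever the person responds to most strongly,
# purely by measuring input/output -- exactly the closed loop Wiener
# described, with no judgment about whether the result is good for them.
```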
19:51
lots of people were prescient about this
19:53
this wasn’t a surprise we knew and
19:57
that’s the thing that’s so depressing
20:00
there was a lot of good cautionary
20:03
science fiction there were a lot of good
20:05
cautionary essays there were good
20:07
cautionary technical writings and we
20:10
ignored all of it we ignored it all how
20:15
could that have happened so I would
20:21
rather tell the story about how
20:23
everybody was surprised and a lot of
20:25
people who are entrepreneurs in Silicon
20:27
Valley were surprised but only because
20:29
they don’t like reading don’t be like
20:33
them so the social history of how
20:39
everything got screwed up is a reasonable
20:41
way to talk about the particular way in
20:43
which it got screwed up so I’m gonna give it
20:46
a try the first thing to say is that in
20:51
the generation of media technologists
20:57
and artists and viewers from immediately
21:00
before computation went pop in like the
21:03
60s into the 70s into the 80s some of
21:07
the personality dysfunctions and some of
21:10
the craziness was already apparent we
21:11
started to see this notion that anybody
21:14
could be a celebrity and people became
21:16
obsessed with this idea that maybe I
21:17
could be one and maybe there’s something
21:19
wrong with me if I’m not and this kind
21:21
of mass media insecurity obsession thing
21:28
it’s hard to trace the moment when
21:33
this personality dysfunction really hit
21:35
the mainstream and really started to
21:37
darken the world
21:38
we were talking earlier actually about
21:41
what moment to choose I was thinking
21:42
actually the assassination of John
21:44
Lennon because here you had somebody who
21:46
basically just wanted to be famous for
21:47
being a random killer
21:49
and that was a little new if you look at
21:53
crappy evil people earlier sure there
21:57
were some who wanted to be famous I don’t know
21:58
Bonnie and Clyde or something like that
22:00
but there are a few different things
22:02
about them one thing is that they were
22:06
also stealing money there was a kind of
22:07
a way in which they were I don’t know
22:10
there’s some kind of a part of a system
22:11
they had peers they weren’t
22:13
typically total loners the most typical
22:18
profile of really evil person before was
22:21
actually a hyper conformist the typical
22:23
Nazi was actually somebody who didn’t
22:25
want to stand out who just was going
22:27
with the flow and fully internalized
22:30
the social milieu around them and
22:32
because it felt normal and that’s
22:34
been a much more typical way that people
22:37
behaved appallingly in history this
22:40
sort of weird loner celebrity seeker
22:43
thing I’m sure it existed before but it
22:45
started to become prominent I want to
22:48
say something I’ve never said publicly
22:49
before but it’s just been gnawing at me
22:50
for many years I’m old enough to have
22:53
had some contact back in the day with
22:56
both Marshall McLuhan and Andy Warhol
22:58
who were two figures who had a kind of a
23:01
loose way of talking about this early
23:04
but they didn’t condemn it they just
23:06
stood aloof and said oh we’re super smart
23:09
and wise for being able to see this
23:10
happening and what they should have done
23:11
is they should have said this is
23:13
and it’s actually really been
23:15
bothering me I’ve never said that before
23:16
I feel it should be said because once
23:19
again the first people on the scene
23:20
sometimes have a kind of a vision and
23:23
they should be judgmental about it the
23:24
way E. M. Forster was and I feel like they
23:27
maybe failed us morally at that point
23:29
because they saw it better than a lot of
23:30
other people maybe than anybody at that
23:32
time anyway that’s maybe not useful to
23:35
say now but at some point it has to be
23:37
said let’s fast forward a little bit
23:41
computation starts to get cheap enough
23:43
that it’s starting to creep out of the
23:44
lab this is the early 1980s
23:48
and here we hit another juncture there
23:54
was this thing that happened oh man I
23:57
was right there for it it was the birth
23:59
of the open free software idea there was
24:01
a friend of mine named Richard Stallman
24:04
any chance Richard’s here no I guess not
24:08
anyway you never know
24:10
anyway Richard had this
24:13
horrible thing one day he just started
24:16
saying oh my god my girlfriend’s
24:18
been killed like my lover’s been killed I
24:20
said oh my god that’s horrible but what
24:22
it really was was the software system
24:24
he’d been working on for this kind of
24:26
computer and what had happened is it had
24:28
gone into a commercial mode where the
24:30
companies and it was I think on the
24:31
Lisp machine which probably nobody
24:33
remembers anymore a sort of early
24:35
attempt to make an AI specialized
24:37
computer and he was upset and he
24:43
sort of melded his anger about this with
24:46
a kind of an anti-capitalist feeling
24:47
and said no software must be free it must be
24:50
just the thing that’s distributed it
24:51
can’t be property property is theft and
24:54
it really spoke to a lot of people it
24:57
melded with with these other ideas that
24:59
were going on at the time and so it
25:02
became this kind of feeling I would say
25:05
sort of a leftist feeling that was
25:07
profound and remains to this day a lot
25:10
of times if somebody wants to do
25:11
something useful with tech they’ll have
25:13
to put in the word open-source lately
25:15
they also have to put in blockchain and
25:17
so very typically it’s open source it’s
25:19
got blockchain and then you know
25:21
it’s good so there was this other thing
25:26
going on which is this feeling that the
25:27
purpose of computers was to hide and
25:31
that’s that deserves a little bit of
25:33
explanation
25:34
America has always had this
25:37
divide this red-blue divide or whatever
25:40
remember it used to be a north-south
25:41
divide but we fought one of history’s
25:43
horrible wars over it once the Civil War and so
25:48
people on what we’d call now the red
25:49
side of the divide were very upset
25:54
there was a Democratic president named
25:56
Jimmy Carter that a few people other
25:58
than me in the room might be old enough
25:59
to remember and there was a period when
26:02
there was an Arab oil embargo and
26:05
we had long lines at gas stations and he
26:07
imposed a 55 mile an hour speed limit on
26:09
the freeways which a lot of people
26:12
really hated because they wanted to drive
26:13
fast and so this thing sprang up called
26:16
CB radio and CB radios were these little
26:19
analog radios you’d install on your car
26:21
and you’d create a false persona a
26:24
handle and then you’d warn other people
26:27
about where the police were hiding so
26:29
that you could all drive fast
26:30
collectively by sharing information and
26:32
it was all anonymous he could never
26:34
trace it and this thing was huge this
26:36
had as high a profile at the time as
26:38
Twitter does today probably there were
26:40
songs celebrating it it was a really big
26:42
deal but then on the left side of
26:45
America on the blue side people also
26:48
wanted to hide and in that case there
26:52
were two things going on one is the
26:53
draft hadn’t quite died down and it was
26:55
still the Vietnam era and that was just
26:57
terrifying because people didn’t really
27:01
believe in that war and the idea of
27:03
being drafted into this horribly violent
27:05
war that appeared to have no good
27:07
purpose just absolutely broke people’s
27:09
hearts and terrified people so they
27:11
wanted to hide and a lot of people did
27:13
and then there was marijuana and the
27:16
drug laws and a lot of people really
27:19
were hiding from those as well so you
27:21
basically had both red and blue America
27:23
feeling like the number one priority for
27:27
freedom for goodness is to be able to
27:29
hide from the government so encryption
27:32
and hiding and fake personas became this
27:36
celebrated thing so in this milieu
27:41
there was this idea that online
27:44
networking which didn’t really exist yet
27:45
I mean we had networks but they were all
27:46
very specialized and isolated there
27:48
wasn’t a broad internet yet there would
27:50
be this idea that everything would be
27:52
free and open everything would be
27:54
anonymous and it’d just be like this
27:56
giant black weird place where
27:59
you never knew anything but
28:01
you were also free and nobody could find
28:02
you
28:03
hmm okay so that was this
28:06
starting idea so there were a few other
28:09
things that fed into it another thing
28:11
was that there was a famous rock band
28:12
called the Grateful Dead that encouraged
28:14
people to tape their songs and didn’t
28:16
care about piracy and all this there
28:17
are all these different factors
28:19
now all this was going on and then
28:21
simultaneously this other thing happened
28:23
which is we started to have the figure
28:26
of the glorified practically superhuman
28:31
tech entrepreneur and these were in the
28:34
80s these were figures like
28:36
Steve Jobs Bill Gates people we still
28:39
remember of course Bill’s still with us
28:41
and they were just worshipped they were
28:45
the coolest people ever well
28:47
around here in California people hated
28:49
Bill but they loved Steve and there was
28:55
this kind of interesting problem which
28:58
is we didn’t just like our tech
29:02
entrepreneurs we made them into sort of
29:04
superhuman figures the phrase dent
29:08
the universe is associated with Jobs
29:09
it’s this notion that there’s this this
29:12
kind of messianic superpower to create
29:16
the flow of reality to direct the future
29:18
because you are the tech entrepreneur
29:19
and computation is reality and the way
29:22
we set these architectures will create
29:24
future societies and that’ll ultimately
29:25
change the shape of the universe once we
29:27
get even greater powers over physics and
29:30
there was just like this no end to the
29:32
fantastical thinking we were at the
29:34
birth point for every form of absolute
29:36
God-like power you know immortality and
29:39
shape-shifting and every crazy thing I
29:40
was a little bit of that I’m sorry to
29:43
say I kind of got a little off
29:44
I was pretty intense in the 80s myself
29:47
but anyway there was this feeling that
29:50
the entrepreneur just
29:55
had more cosmic power than the
29:58
average person okay so now here you have
30:00
a dilemma that had been kind of sneaking
30:03
up and nobody had really faced it on the
30:05
one hand everything’s supposed to be
30:07
free everything’s supposed to be
30:09
anonymous everything is supposed to be
30:11
like this completely open thing but on
30:14
the other hand
30:15
we love our entrepreneurs we worship our
30:17
entrepreneurs the entrepreneurs are
30:18
inventing reality so it should be clear
30:21
that there’s a bit of a potential
30:23
conflict here everything must be free
30:24
but we worship entrepreneurs how do we
30:27
do it how do we do it how do we do it
30:28
and so a set of compromises were created
30:32
over the years that ended up giving us
30:35
the worst of both sides of that I would
30:37
say so the story is
30:40
long and interesting but I’ll give you
30:42
just a few highlights one thing that
30:45
happened is when we finally got around
30:47
to actually creating the Internet
30:49
we decided it had to be super bare-bones
30:52
it would represent machines because
30:54
without having a number representing a
30:56
machine you can’t have an Internet but
30:58
it wouldn’t represent people it didn’t
30:59
intrinsically have accounts built-in for
31:01
humans it had no storage for humans
31:05
built-in it had no transactions it had
31:07
no authentication it had no persistence
31:10
of information guaranteed it had no
31:12
historical function it was like
31:14
super bare-bones just this thing
31:16
connects with that thing that’s all it
31:17
did and the reason why was that we were
31:21
supposed to leave room for future
31:23
entrepreneurs those who we worshipped
31:26
you know the Internet so if I were
31:30
to say the Internet as you know was
31:31
invented by Al Gore some of you would
31:33
laugh and that’s because it was a laugh
31:36
line for a while because he was a
31:37
Democrat he was a vice president and
31:40
before that a senator from Tennessee and
31:42
he was accused of overclaiming that
31:44
he’d invented the internet on a TV show
31:45
which didn’t happen however I think he
31:48
should claim it I think he did invent it
31:50
he didn’t invent it technologically not
31:52
at all all of the underlying stuff which
31:55
is called a packet-switched network and a
31:56
few other elements that existed in lots
31:59
of instances from before he had this
32:01
idea of throwing some government money
32:03
into it to bribe everybody to become
32:04
interoperable so there’d just be one damn
32:06
network and people could actually
32:07
connect that really was him and he
32:10
deserves credit for having done that
32:12
unless you think it was a terrible idea
32:14
but when that was happening
32:16
I remember having conversations about
32:18
this like by creating this thing in such
32:21
an incredibly bare-bones way we are
32:24
creating gifts of hundreds of billions
32:26
of dollars
32:26
for persons unknown who will be required
32:28
to fill in these missing things that
32:30
everybody knows have to be filled in and
32:34
then a little while later this other
32:36
thing happened which is Tim Berners-Lee
32:37
who’s great came up with the World Wide
32:39
Web protocol and here he did this thing
32:44
up to that point all of the ideas for
32:48
how to create shared you know shareable
32:51
media experiences online which are
32:53
called hypertext after Ted Nelson had
32:55
come up with the first Network design

32:57
back in 1960 the HT in HTTP is his for
33:00
hypertext a core tenet of these is
33:04
that anytime one thing on the internet
33:06
pointed at something else that other
33:07
thing had to know it was being pointed
33:08
at so that there were two-way links you
33:11
always knew who was pointing at you and
the reason for that is that way you
could preserve context provenance
history you could create chains of
payment where if people mashed up stuff
from somebody else and that person mashed
up from somebody else you could make
payments that would propagate back to pay
for everybody who contributed so if you
wanted to have an economy of information
you could and
the information wouldn’t be
dropped but Tim just had one-way links
you could point at somebody and they’d have no
idea that they’re being pointed at and
the reason for that is that to
actually do the two-way links is
genuinely a pain in the butt it’s just
more work if you do one way links the
whole thing could spread a lot faster

anybody can do it it’s just a much
easier system and that embedded in it
this idea of virality or memeness
where whatever can spread the
fastest is what wins
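the two-way links and back-propagating payments described above can be sketched in code — a minimal toy model, where the `Node`, `link`, and `pay` names are invented for illustration and are not Nelson’s actual design:

```python
# Toy sketch of two-way links: every link is recorded in both
# directions, so provenance survives and payment can propagate back
# through a chain of mash-ups to everybody who contributed.

class Node:
    def __init__(self, author):
        self.author = author
        self.sources = []    # works this one mashes up (outgoing links)
        self.backlinks = []  # works that point here -- the two-way part

def link(work, source):
    """Create a two-way link: the source always knows who points at it."""
    work.sources.append(source)
    source.backlinks.append(work)

def pay(work, amount, share=0.5, ledger=None):
    """Pay a work's author, propagating a share back to every source."""
    if ledger is None:
        ledger = {}
    kept = amount if not work.sources else amount * (1 - share)
    ledger[work.author] = ledger.get(work.author, 0.0) + kept
    if work.sources:
        per_source = amount * share / len(work.sources)
        for s in work.sources:
            pay(s, per_source, share, ledger)
    return ledger

original = Node("alice")
remix = Node("bob")
link(remix, original)
ledger = pay(remix, 1.00)   # bob keeps 0.50, alice receives 0.50
```

because every link is recorded in both directions, the original author is always discoverable from the remix and revenue can flow back along the chain — exactly the property one-way web links gave up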

and so it was a quantity over quality
thing in my view that was one thing
that happened another thing that
happened didn’t come from Silicon Valley
in the late 80s people on Wall Street
started to use automated trading and the
first flash crash from out-of-control
trading algorithms was in ’89 and they
figured out something very basic
although Forrester had described
exactly this problem much earlier
which is that if you had a bigger
computer than everybody else and it was
more central getting more information
you could calculate ahead of everybody
and gain an information advantage and
in economics an information advantage is
everything so if you had just a
little bit more information than everybody
else you could just turn that into money
and it wasn’t really a new insight but it
hadn’t actually been implemented before
then shortly after that a company called
Walmart realized they could apply that
not just to financial instruments and
investments but to the real world and
they created a software model of their
supply chain and dominated it
they could
35:10
go to anybody who was involved somewhere
35:12
in giving them products and figure out
35:14
what their bottom line was so they could
35:15
negotiate everybody down they knew who
35:17
everybody’s competitor was they went
35:19
into every negotiation with superior
35:21
information and they built this giant
35:23
retail empire on information superiority
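the negotiation-with-superior-information idea above can be made concrete with a toy sketch — the supplier names, prices, and `informed_offer` function are all invented for illustration, not any real supply-chain system:

```python
# Toy illustration of negotiating with superior information: if you
# know each supplier's true cost floor, you can offer just above it
# instead of paying the list price.

suppliers = {
    "widgets": {"list_price": 10.00, "cost_floor": 6.00},
    "gadgets": {"list_price": 8.00,  "cost_floor": 7.50},
}

def informed_offer(supplier, margin=0.05):
    """Offer the smallest price the supplier can still accept."""
    return round(supplier["cost_floor"] * (1 + margin), 2)

# Total saved versus list prices, purely from knowing more.
savings = sum(s["list_price"] - informed_offer(s) for s in suppliers.values())
```

the point of the sketch is that no product or logistics improvement is involved at all — the entire margin comes from the information asymmetry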
35:28
that all happened before anybody in
35:30
Silicon Valley started doing it okay now
fast-forward to the birth of Google so
you have these super bright kids Sergey
and Larry some of the students I talked
to today on campus here remind me of
35:43
what they were like at the same age
35:44
super bright super optimistic idealistic
35:47
actually focused and they were backed
35:54
into a corner in my view on the one hand
35:57
the whole hacker community the whole
tech community would have just slammed
them if they did anything other than
everything being free but on the other
hand everybody wanted them to be the
next Steve Jobs the next Bill Gates
that
was like practically a hunger like we
want we want our next star and the only
way to combine the two things was the
advertising model
the advertising model
would say you’ll get everything for free
you can be you know as far as you’re
concerned your experience is you just
36:26
ask for what you want and we give it to
36:28
you now the problem with that is that
36:31
because it’s an advertising thing you’re
36:35
actually being observed your information
36:37
is being taken you’re being watched and
36:39
there’s a true customer this other
36:41
person off to the side who at first you
36:45
were always aware of because you could
36:46
see their little ads you know they’re
36:48
like your local dentist or whatever
36:50
it was cute at first it was harmless at
36:51
first
36:54
and unfortunately if they’d come up with
37:00
this thing
37:01
after I don’t know Moore’s law had ended
37:04
and computers were as fast as they were
37:05
ever gonna get and we’d established a
37:08
whole regulatory and ethical substrate
37:10
for computation and everything maybe it
37:12
could have worked but instead they did
37:14
it in a period where there was still a
37:16
whole lot of Moore’s law to happen so
37:18
all the computers got faster and faster
37:19
cheaper and cheaper more and more plentiful
37:21
more and more storage more and more connection
37:22
the algorithms got better and better
37:25
machine learning kind of started to work
37:27
a little better a lot of these
37:29
algorithms kind of figured it
37:30
out we had enough computation to do
37:32
experiments and get all kinds of things
37:34
working that hadn’t worked before all
37:36
kinds of little machine vision things I
37:38
sold them a machine vision company
actually and the whole thing kind of
accelerated and what started out as an
advertising model turned into something
very different and so here we get into
our description of at least my
perception of the state that we’re in
37:54
right now so I mentioned earlier that
37:58
Norbert Wiener had described what he
38:04
viewed as a potentially horrible outcome
38:06
for the future of computation where
you’d have a computer in real time
observing a person with sensors and
providing stimulus to that person in
38:14
some form with displays or other
38:16
effectors and implementing behavior
38:19
modification feedback loops in order to
38:23
influence the person and if that was
38:24
done globally it would detach humanity
38:27
from reality and bring our species to an
38:29
end that was the fear back in the 50s
38:31
now unfortunately this innocent little
advertising model which was supposed to
address both the desire to have
everything be this Wild West open thing
and the desire to have entrepreneurs
despite everything being free landed us
right in that pocket that’s exactly
where we went
38:53
now I should say a bit about behaviorism
38:56
because that’s another historical thread
38:58
that led to where we are behaviorism is
39:02
a discipline of reducing the number of
39:07
variables in the training of an organism
39:10
so that you can control them
39:12
rigorously and reproduce effects so
39:15
let’s say if you’re whispering into your
39:18
horse’s ear while you’re training your
39:19
horse
39:20
that’s not behaviorism if you’re
39:22
whispering into your kid’s ear even if
39:24
you do offer some treats once in a while
39:26
to encourage behavior that’s not
39:27
behaviorism it has elements of it but
39:30
whereas behaviorism reduces the
39:32
variables and says look what we want to
39:34
do is we want to isolate we want to say
39:36
here’s this organism it’s in a box
39:38
sometimes they’re called Skinner boxes
39:41
remembering BF Skinner one of the famous
39:43
behaviorists and we want to say if the
39:46
creature person human whatever does a
39:48
certain thing you want you give the
person the treat if it does something you
don’t want you give them a punishment
typically maybe candy and an electric shock
39:58
the timing and the occurrence of these
40:02
things is guided by an algorithm you
40:04
refine the algorithm you need to
40:06
discover how to change behavior patterns
40:08
this science of studying behavior
40:11
behaviorism yielded surprises really
40:16
interesting surprises very early on the
40:19
first celebrity behaviorist was probably
40:22
Pavlov you’ve all heard of Pavlov I’m
40:24
sure and he demonstrated famously that
40:27
he could get a dog to salivate upon
40:30
hearing a bell whereas previously the
40:32
dogs salivated
40:33
upon being given food and hearing the
40:36
Bell so he was able to create a purely
40:39
symbolic seeming stimulus to replace the
40:43
original concrete one that’s quite
40:45
important because in many areas today
40:48
where behavior is modified and addictions are
40:50
created there are only abstract stimuli this
40:53
is true for instance for gambling
modern gambling is based on this and so are
little games like Candy Crush where
there are pictures of candy instead of real
41:01
candy now I have no doubt someday
41:04
there’ll be some Facebook or Google
41:07
hovercraft you know drone over your head
41:10
that drops real candy and electric
41:11
shocks on your head but for the moment
41:13
we’re in this symbolic realm that
41:16
Pavlov uncovered another amazing result
41:21
is that you might think naively that
41:23
simply providing punishment and reward
41:26
as reliably and as immediately as
41:29
possible would be the most effective way
41:31
to change behavior patterns but actually
41:33
that’s not true it turns out that adding
41:36
an element of randomness makes the
41:39
algorithms more effective so we don’t
41:44
fully just to state the obvious nobody
41:46
really understands the brain as yet but
41:49
it appears that the brain is
41:52
constantly in a natural state of seeking
41:55
patterns of trying to understand the
41:57
world so if you provide a slightly
41:59
randomized feedback pattern it doesn’t
42:02
confuse or repel the brain instead it
42:05
draws the brain in the brain goes ah
42:06
there must be something more to
42:07
understand there must be something more
42:09
and gradually you’re drawn in more and
42:11
more and more and so this is why the
42:15
randomness of when you win at gambling
42:17
is actually part of the addiction
42:19
algorithm that’s part of what makes it
42:21
happen
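the role of randomness in the addiction algorithm can be sketched with a toy comparison of reward schedules — a fixed schedule versus the variable-ratio schedule just described; the function names and parameters here are invented for illustration:

```python
import random

# Toy sketch of two reward schedules: a fixed schedule pays on every
# Nth action, a variable-ratio schedule pays with probability 1/N.
# The expected payout is the same, but only the variable schedule is
# unpredictable -- the property behaviorists found more habit-forming.

def fixed_schedule(n_actions, every=5):
    """Reward exactly every `every`-th action."""
    return [1 if (i + 1) % every == 0 else 0 for i in range(n_actions)]

def variable_schedule(n_actions, p=0.2, seed=42):
    """Reward each action independently with probability p."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n_actions)]

fixed = fixed_schedule(1000)      # exactly 200 rewards, fully predictable
variable = variable_schedule(1000)  # ~200 rewards, timing unpredictable
```

the two lists pay out at the same average rate, so any difference in how compelling they feel comes entirely from the unpredictability of the timing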
42:21
now in the case of social media what
42:24
happens is the reward is when you get
42:27
retweeted or you go viral something like
42:30
that the term of art in Silicon Valley
42:33
companies is usually a dopamine hit
42:35
which is not an entirely accurate
42:37
description but it’s the one that’s
42:40
most commonly used for when you have a
42:41
quick rise of a positive reward but just
42:45
as the gambler becomes addicted to the
42:48
whole cycle where they’re losing more
42:50
often than they win a Twitter addict
42:53
gets addicted to the whole cycle where
42:56
they’re most often being punished
42:59
by other people who are tweeting and
43:00
they only get a win once in a while
43:02
right it’s the same it’s the same
43:05
algorithm and indeed
43:09
one of the side effects so in the trade
43:14
the terminology we use is engagement we
43:17
have algorithms that drive engagement
43:19
and we hire zillions of people with
43:22
recent PhDs from psych departments this
43:24
whole program there’s a program called
43:26
persuasive technology at Stanford where
43:28
you can go get a PhD in this and then
43:30
you get hired by some tech company to
43:32
drive engagement but it’s really
43:34
just a sanitized word for addiction so
43:40
we drive addiction using a variety of
43:42
these algorithms and we can study them
43:45
more than the classical behaviorists ever
43:46
did because we can study a hundred
43:48
million instances at once and we
43:52
can put out a hundred million variations
43:53
on all kinds of people and correlate it
43:55
with data for all those people and then
43:58
cycle and cycle and cycle the
44:00
algorithms can find new pockets of
44:03
efficacy they can tweak themselves until
44:06
they work better and we don’t even know
44:07
why they’re far ahead of any ability we
44:10
have to really keep up with them and try
44:12
to interpret exactly why some things
44:13
work better than other things
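the experiment-and-correlate loop described above is essentially a multi-armed bandit — here is a toy epsilon-greedy sketch with invented engagement rates, not any company’s actual algorithm:

```python
import random

# Toy epsilon-greedy sketch of the loop described above: put out many
# variations, measure which ones get a response, and drift toward
# whatever works -- with no model of *why* it works.

def run_experiment(true_rates, rounds=10000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    shows = [0] * len(true_rates)  # how often each variation was shown
    hits = [0] * len(true_rates)   # how often it got an engagement
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(len(true_rates))   # explore a variation
        else:                                      # exploit the best so far
            arm = max(range(len(true_rates)),
                      key=lambda a: hits[a] / shows[a] if shows[a] else 0.0)
        shows[arm] += 1
        hits[arm] += 1 if rng.random() < true_rates[arm] else 0
    return shows

# Three content variations with hidden engagement rates.
shows = run_experiment([0.01, 0.05, 0.20])
best = shows.index(max(shows))
```

the loop converges on the highest-engagement variation without ever representing what makes it engaging — which is the sense in which the algorithms stay ahead of our ability to interpret them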
44:14
now even so it’s important to get this
44:18
right the effect is in a way not that
44:22
dramatic so Facebook for instance has
44:26
published research bragging that it can
44:28
make people sad and the people don’t realize
44:29
that they were made sad by Facebook now
44:31
by the way you might wonder why would
44:34
Facebook publish that wouldn’t they want
44:37
to hide that fact it sounds pretty bad
44:39
but you have to remember that you’re not
44:42
the customer of Facebook the customer is
44:44
the person off to the side we’ve created
44:46
a world in which any time two people
44:48
connect online it’s financed by a third
44:51
person who believes they can manipulate
44:52
the first two so to the degree Facebook
44:55
can convince that third party
44:58
that mysterious other who’s hoping to
45:00
have influence that they can have some
45:03
mystical magical unbounded sneaky form
45:06
of influence then Facebook makes more
45:08
money that’s why they published it and
45:11
I’ve been at events where this stuff is
45:14
sold by the various tech companies and
45:15
there’s no end to the brags and the
45:18
exaggerations when it comes to telling
45:20
the true customers what their powers are
45:22
very different from their public stance
45:23
but at any rate the darkness of
45:33
this all is that when you use this
45:37
technique to addict people and we
45:40
haven’t even gotten to the final stage
45:41
of influencing their behavior patterns
45:42
we’re still just at the first stage of
45:44
getting them addicted you create
45:46
personality dysfunctions associated with
45:49
addiction because it is a form of
45:50
behavioral addiction so if any of you
45:53
have ever dealt with somebody who’s
45:55
a gambling addict the technical
45:58
qualities of gambling addiction are
45:59
similar to the technical qualities of
46:02
social media addiction now I was just
46:06
saying before that we have to get this
46:07
right and understand the degree of
46:10
awfulness here because it’s actually
46:13
kind of slight but just very consistent
46:15
and distributed a gambling addiction can
46:18
be really ruinous somebody can destroy
46:20
their lives and their family a social
46:22
media addiction can be ruinous as we’ve
46:24
seen by unfortunate events in just the
46:27
last few days but more often there’s a
46:30
statistical distribution where a
46:32
percentage of people are kind of
46:35
slightly affected and have their
46:37
personality slightly changed so what
46:40
will happen is some percentage and in
46:42
some of the studies I’ve seen published
46:44
maybe it’s like 5%
46:46
show like a three percent change in
46:48
personality or something like that so
46:49
and this is over hundreds of millions of
46:51
people or even over billions so it’s a
46:53
very slight very distributed statistical
46:56
effect on people with just a few who are
46:59
really dramatically affected but the
47:02
problem with that is that it compounds
47:07
like compound interest a slight effect
47:10
that’s persistent consistent repeated
47:14
starts to darken the whole society
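the compounding can be made concrete with the talk’s own rough figures (5% of people showing roughly a 3% change) — the compound-interest model itself is my assumption for illustration, not something stated in the talk:

```python
# Toy compounding model of a slight, persistent, repeated effect,
# using the rough figures from the talk (5% affected, ~3% shift).
# Treating each period like compound interest is an assumption.

population = 2_000_000_000          # order of billions, per the talk
affected_share = 0.05               # ~5% show a measurable change
shift_per_period = 0.03             # ~3% shift per period

affected = population * affected_share   # about 100 million people

def cumulative_shift(periods, rate=shift_per_period):
    """Total shift after repeated small shifts, compounded."""
    return 1 - (1 - rate) ** periods

# Under this model, after 23 periods the same "slight" 3% effect
# has compounded past a 50% cumulative shift.
```

the exact numbers matter less than the shape: a small effect that is consistent and repeated does not stay small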
47:17
let’s talk a little bit about the
47:18
addictive personality that’s brought out
47:20
by these things the way I characterize
47:23
it is it becomes paranoid
47:28
insecure a little sadistic it becomes
47:36
cranky now why those qualities so I
47:44
have a hypothesis about this and here
47:46
I’m hypothesizing a little ahead of
47:50
experimental results in science so I
47:53
want to make that clear this is a
47:54
conjecture not something that I can
47:57
cite direct evidence for but
48:01
all the components of it are
48:03
well studied so it’s just putting
48:05
together things that are known and I
48:07
think this should therefore be
48:09
worthy of public discussion you can very
48:13
roughly bundle emotional responses from
48:17
people into two kinds of bins one we’ll
48:22
call positive and the other we’ll call
48:23
negative the positive ones are things
48:25
like affection trust optimism in a
48:32
person belief in a person faith in a
48:34
person comfort with a person relaxing
48:37
around a person all that kind of stuff
48:39
the qualities you want to feel in
48:40
yourself when you’re dating somebody
48:42
let’s say the negative ones are things
48:46
like fear anger jealousy rage feeling
48:53
aggrieved feeling a need for revenge
48:55
just all this stuff now in the negative
48:58
bin a lot of these emotions are similar
49:01
to another bin that’s been described
49:03
over many years which is the startle
49:05
responses or the fight-or-flight
49:06
responses and the thing about these
49:09
negative ones is that they rise quickly
49:11
and they take a while to fall so you can
49:15
become scared really fast you can become
49:17
angry really fast and the related
49:21
positive emotions tend to rise more
49:23
slowly but can drop quickly they
49:25
have the reverse time profile so it
49:29
takes a long time to build trust but you
49:31
can lose trust very quickly it takes a
49:33
long time to become relaxed compared to
49:37
how quickly you can become
49:38
startled scared and nervous and on edge
49:41
now this isn’t universally true there are
49:44
some fast rising positive emotions I
49:46
just talked about the dopamine hits
49:48
earlier so that’s an exception but
49:50
overall there are more fast-rising
49:52
negative ones
49:53
now these algorithms that are measuring
49:57
you all the time in order to adapt the
50:00
customized feeds that you see and the
50:02
designs of the ads that you see and just
50:04
everything about your experience they’re
50:06
watching you watching you watching you
50:07
in a zillion ways expanding all the time
50:10
now they’re following your voice tone
50:13
and trying to discern things about your
50:14
emotions based on pure correlation
50:17
without necessarily much theory behind
50:18
it they’re watching your emotions as you
50:21
move they’re watching your eyes your
50:23
smile and of course they’re watching
50:25
what you click on what you type all that
50:28
and the thing is if you have an
50:32
emotional response that’s faster the
50:35
algorithms are going to pick up on it
50:36
faster because they’re trying to get as
50:39
much speed as possible they’re rather
50:42
like high-frequency trading algorithms
50:44
in that sense we intrinsically in
50:47
Silicon Valley try to make things that
50:49
respond quickly and act quickly and so
50:51
if you have a system that’s responding
50:54
to the fast rising emotions you’ll tend
50:56
to catch more of the negative ones
50:57
you’ll tend to catch more of the
50:58
startled emotions now here’s the thing
51:01
if you look at the literature and ask
51:04
the broad question if we accept this
51:08
idea of binning emotions into positive
51:10
and negative feedback emotions as far as
51:14
behavior change goes is positive or
51:16
negative more influential on human
51:19
behavior and the answer you’ll get is a
51:21
really complex patchwork
51:24
behaviorism has been around for a long
51:26
time so there’s a lot of studies you can
51:28
read hard to know exactly how high
51:30
quality all the research is especially
51:32
the older stuff but in general you can
51:34
find lots of examples of positive
51:37
feedback working better than negative or
51:39
vice versa and it’s all very situational
51:42
a lot of it’s very subtle on how things
51:44
are framed for people all kinds of stuff
51:45
but overall what I perceive from the
51:49
literature is
51:49
approximate parity between positive and
51:52
negative but if you ask which emotions
51:56
will the algorithms pick up on when
51:58
they’re trying to get the fastest
51:59
possible feedback it’s unquestionably
52:01
true that the negative ones are faster
52:03
all right
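the speed asymmetry just described can be sketched as a toy model — the emotions, rise times, and measurement window below are invented numbers, not measurements:

```python
# Toy sketch of the asymmetry described above: negative emotions rise
# fast, positive ones rise slowly, so a feedback loop that only waits
# a short window for a reaction mostly registers the negative ones.

# Invented rise times in seconds until the emotion shows a signal.
EMOTIONS = {
    "anger": 2, "fear": 1, "jealousy": 3,         # fast-rising, negative
    "trust": 600, "affection": 300, "calm": 900,  # slow-rising, positive
}
NEGATIVE = {"anger", "fear", "jealousy"}

def signals_caught(window_seconds):
    """Emotions whose signal appears within the measurement window."""
    return {e for rise_e in [EMOTIONS] for e, rise in rise_e.items()
            if rise <= window_seconds}

# A 10-second engagement window catches only the fast-rising
# (negative) emotions; a long window would catch everything.
caught = signals_caught(window_seconds=10)
```

nothing in the loop prefers negativity as such — it only prefers speed, and under these assumptions speed and negativity coincide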
52:05
so what you see is the algorithm
52:06
suddenly flagging oh my god I got a rise
52:09
out of that person let’s do some more of
52:10
that because we’re engaging that person
52:12
and that stuff tends to be the stuff
52:15
that makes them angry paranoid
52:17
revengeful insecure nervous jealous all
52:21
these things and so what you see is this
52:24
feedback cycle where a certain kind of
52:28
dysfunctional personality trait is
52:30
brought out more and more and people
52:33
with similar dysfunctional personalities
52:36
are introduced to each other by the
52:38
systems
52:38
so what does this personality look like
52:41
well the addiction personality
52:43
online I’ll name three people who have
52:47
recently displayed it rather blatantly
52:49
one is the president who I’m just not
52:52
going to bother to name because I’m sick
52:53
of it the second is Kanye the third
52:58
is Elon Musk three people all displaying
53:02
somewhat overlapping in my view
53:05
personality distortions now I’ve
53:09
had slight contact with two of the above
53:12
three I’ll let you guess which two they
53:14
are well no I’ll say one of them’s
53:17
Trump I’ve met Trump a few times over
53:18
a very long period of time I’ve never
53:21
known him well I’ve never had a real
53:23
conversation with him but I will say
53:24
that in the 80s and 90s he didn’t seem
53:29
like somebody who was desperate for you
53:30
to like him he didn’t seem like somebody
53:33
who was nervous about what you thought
53:34
about him he didn’t seem like somebody
53:36
who was itching for a fight he didn’t
53:38
seem like somebody who was looking for
53:40
trouble and thought it would help him he
53:43
really just didn’t seem like that at all
53:44
he seemed I think he was still a con man
53:46
I think he was but he was kind of like a
53:49
happy con man you know it was
53:51
like a different persona
53:53
and I think you know remember
53:58
how I said before that the gambling
54:00
addict is addicted to the whole cycle
54:02
where they lose a lot before they win
54:04
and I think in the same way the Twitter
54:06
addict is addicted to a cycle where they
54:08
bring a lot of wrath upon themselves and
54:10
have to deal with a lot of negative
54:12
feedback before they get positive
54:14
feedback or that you know there’s a mix
54:16
it’s very much like the losing and
54:18
winning and gambling and so I think
54:21
what’s happened is he’s gotten himself
54:22
into this state where he’s like
54:24
this really nervous narcissist and this
54:27
is kind of weird like this personality
54:30
of the person who’s like does this person
54:31
really like me I think he likes me
54:33
this kind of weird nervous narcissistic
54:36
insecure person has not been a typical
54:39
authoritarian personality in the past
54:41
and yet it’s working now and I suspect
54:45
the reason why is a lot of the followers
54:47
who respond to it see themselves in that
54:49
insecurity which is really strange I
54:52
mean if you think about this in the past
54:55
the celebrity figure or the leader
54:57
typically wanted to display a
54:59
personality that was kind of
55:02
invulnerable and aloof and unneeding
55:06
self-sufficient uncaring about
55:09
whether they’re liked or not and yet
55:12
that’s not what’s going on here it’s
55:13
really strange and then there’s this
55:16
issue of lashing out
55:19
it’s as if because you know that
55:22
there’s a
55:23
certain amount of punishment that goes
55:25
with that reward you actually seek out
55:28
some of the punishment because
55:29
that’s actually a part of your addiction
55:31
so if you’re a gambling addict you
55:33
actually make some stupid bets
55:35
it’s true it’s just what happens so you
55:38
have Elon Musk
55:39
calling this guy who tried to rescue
55:41
kids in a cave in Thailand a pedophile
55:43
out of nowhere all right same thing
55:46
twitter twitter addiction dysfunctional
55:49
personality Kanye I’m not even what you
55:52
know but basically you have people
55:54
who are kind of degrading themselves and
55:57
making themselves into fools but in a
56:01
funny way in the current environment
56:03
and there’s a whole world of addicted
56:06
fans who actually relate to it see
56:08
themselves in it and it works it works
56:10
for the first time in history and it’s
56:12
really strange it’s really it’s a really
56:15
weird moment okay so I started by
56:20
talking about the problem of losing
56:22
touch with reality
56:23
now as you heard I have a book called
56:28
ten arguments for deleting your social
56:29
media accounts right now and it goes
56:31
through a lot of reasons to delete your
56:34
social media of which the closest to my
56:37
heart is actually the final one which is
56:39
a spiritual one it’s about how I think
56:41
that Silicon Valley is kind of creating
56:44
a new religion to replace old religions
56:47
and even atheism with this new faith
56:49
about AI and the superiority of tech and
56:54
how we’re creating the future and all
56:55
this and I feel that that religion
56:57
is an inferior one and people are being
56:59
drawn into it through practice so that
57:00
that tenth argument is the one I care
57:02
most about but what I want to focus on
57:04
here is the existential argument which
57:06
is the loss of reality so the problem we
57:11
have here is that we’ve created so many
57:15
addicts so many people who are on edge
57:17
that they perceive essentially politics
57:24
before they perceive nature they
57:26
perceive the world of human
57:31
recriminations before they perceive
57:33
actual physical reality now I presented a
57:36
theory it’s in various of my books
57:39
called the pack switch which I will
57:42
recount to you now that’s a way of
57:44
thinking about this it goes like this
57:48
there’s some species that are
57:51
intrinsically social like a lot of ants
57:54
there’s some species that tend to be
57:57
solitary like a lot of octopuses some of
58:01
my favorite animals there are some
58:04
species that can switch that can be
58:07
either solitary or social depending on
58:11
circumstances
58:13
and a famous one that we refer to in
58:16
mythology and in our storytelling is the
58:18
wolf you could have a wolf pack or you
58:21
can have a lone wolf same wolves
58:24
different social structures
58:26
different epistemologies I would say when
58:30
you’re a lone wolf you’re responsible
58:33
for your own survival you have to pay
58:36
attention to your environment where will
58:38
you find water where will you find prey
58:40
how do you avoid being attacked where do
58:43
you find shelter how do you survive bad
58:44
weather you are attached to reality like
58:47
a scientist or like an artist you are
58:50
a naturalist when you’re in a wolf pack
58:54
different story now you have to worry
58:57
about your peers they’re competing with
58:59
you you have to worry about those above
59:01
you in the pack will they trash you or
59:04
will you get their station you have to
59:06
piss on those below you because you have
59:08
to maintain your status but you have to
59:11
unify with all your fellow pack members
59:13
to oppose those other packs over there
59:15
the other so all of a sudden social
59:19
perception and politics has replaced
59:22
naturalism politics versus naturalism
59:25
those are the epistemologies of the lone
59:28
wolf and the wolf pack people are also
59:33
variable in exactly this way we can
59:36
function as individuals or we can
59:38
function as members of a pack now what
59:43
happens is exactly what I am at least
59:45
hypothesizing happens with wolves it’s
59:47
kind of interesting
59:48
interacting with scientists who actually
59:50
study wolves because I haven’t actually
59:52
spent that much time with wolves just a
59:53
little bit so there are people who know a
59:54
lot more about wolves and let’s just say
59:57
my little portrayal is overly simplified
59:59
but I mean it’s like a little
60:03
cartoon but I hope it functions to
60:04
communicate so when we are thinking as
60:11
individuals we have a chance to be
60:13
naturalists so we have a chance to be
60:14
scientists and artists we have a chance
60:16
to perceive reality uniquely from our
60:20
own unique perspective a diverse
60:22
perspective as compared to everyone
60:23
else’s that
60:24
we can then share when we join into a
60:28
pack mentality we perceive politics so
60:32
what happens on social media is because
60:34
the algorithms are trying to get a rise
60:37
out of you to up your engagement and
60:39
make you ripe for receiving behavior
60:42
modification you’re constantly being
60:44
pricked with little social anxiety rage
60:50
irritations all these little things all
60:53
these little status worries is my life
60:55
as good as that person’s life am i
60:57
lonely relative to all these people what
60:59
do they think of me am i smart enough am
61:02
i getting enough attention for this why
61:03
didn’t people care about the last thing
61:05
I did online blah blah blah blah blah
61:06
and there’s just like it’s not that any
61:08
of these things by themselves are
61:10
necessarily that serious but
61:11
cumulatively what they’re doing is
61:13
they’re shifting your mindset and
61:16
suddenly you’re thinking like a pack
61:18
creature so the pack switch is set
61:20
and you’re thinking politically and when
61:23
you think politically you lose
61:25
naturalism you know I think both modes
61:29
of being have a place I
61:32
think if people exclusively all the time
61:34
stayed in the lone setting that would be
61:37
bad for society that would be bad for
61:39
relationships would be bad for families
61:42
and so on however there needs to be a
61:45
balance there needs to be a healthy way
61:47
of going back and forth between them and
61:49
not getting lost in one or the other and
61:52
so the hypothesis I’d put forward is
61:54
that we’re giving people so many little
61:57
anxiety-producing bits of feedback that
61:59
we’re getting them into this pack
62:00
mentality where they’ve become hyper
62:04
political without maybe even quite
62:06
realizing it and losing touch with
62:08
reality now when I say losing touch with
62:10
reality that demands some evidence
62:14
because you might say well are we less
62:16
in touch than in the past
62:18
so remember at the start after the music
62:23
I gave you what I consider to be sort of
62:27
a positive framing and a lot of good
62:29
news absolute poverty has been reduced
62:32
absolute levels of violence have been
62:33
reduced absolute levels of disease have
62:35
been reduced and so
62:36
there are many ways in which we’re
62:38
bettering ourselves but there’s this
62:40
other thing going on which is bad enough
62:44
that it might be the undoing of all of
62:47
that and that is this loss of reality
62:50
now here’s what I want to point out I
62:53
travel around a fair amount and I
62:55
visited places that would appear on the
62:58
surface to have very little in common
63:00
I’ll mention some of them Brazil Sweden
63:03
Turkey Hungary the United States what do
63:08
they all have in common what they have
63:10
in common is the rise of what we
63:14
sometimes characterize as right-wing
63:16
populist politics I don’t think that’s
63:20
quite right I think what we actually are
63:23
seeing is the rise of cranky paranoid
63:30
unreal politics I think that’s a better
63:34
characterization and it’s really
63:36
remarkable how it’s all happened at
63:38
about the same time and it’s happened in
63:40
some poor parts of the world too it’s
63:41
not even you could
63:44
say well it’s something about aging
63:45
populations all the cranky old people
63:47
you know freshmen will tell me that
63:50
but you know there are
63:52
countries that are very young that have
63:54
that problem Turkey Brazil it’s like oh
63:56
it’s diverse countries we
63:59
can’t have democracies unless they’re
64:01
ethnically monolithic or
64:03
something Brazil’s diverse oh it’s
64:08
inequality the problem is
64:11
that societies are just losing their
64:14
social safety net well you know Sweden
64:17
Germany not really they might have
64:19
anxiety about it actually you know
64:22
it’s not so all these places are really
64:25
different they have different histories
64:26
and yet they’ve all had similar
64:29
dysfunctions and so you have to say well
64:31
what’s in common between all of them and
64:33
you can say something vague well they
64:35
all have anxiety about the future and
64:36
that’s true but the obvious thing they
64:38
have in common is that people have moved
64:40
to this mode of connecting through
64:42
manipulative systems that are designed
64:44
for the benefit of third parties who
64:45
hope to manipulate everybody sneakily
64:47
that seems like the clear thing they all
64:50
have in common
64:51
Brazil recently I mean all the same crap
64:55
that we saw was happening on whatsapp
64:58
which is the big connector down there
65:00
and Facebook I think to their credit tried
65:04
to help a little bit but they couldn’t
65:05
really do it cuz the whole system is
65:07
designed to be manipulative you know
65:08
it's like if you have a car that's
65:12
designed to roll it’s very hard to say
65:14
well we won’t let it roll very much I
65:16
mean whatever it does it’ll be rolling
65:17
if you have a manipulation system and
65:19
that’s what it’s designed for you can
65:21
try to get it to roll more slowly or
65:23
something but all it can really do is
65:24
manipulate that is what these things are
65:26
optimized for that’s what they’re built
65:28
for that’s how they make money
65:29
every penny of the many billions of
65:32
dollars that some of these companies
65:33
have taken in that are totally dependent
65:35
on this and of the big companies the
65:37
only ones really totally dependent are
65:39
Google and Facebook or almost totally
65:41
dependent it all comes from people who
65:43
believe they’ll be able to sneakily
65:44
influence somebody else by paying money
65:46
via these places that is what they do
65:47
there’s just no other way to describe it
65:50
and so you have the typical thing that
65:57
happens is that the algorithms there
66:01
isn’t any information in them that comes
66:03
from like angels or extraterrestrials it
66:06
all has to come from people so people
66:07
input some information and often it’s
66:09
very positive at first you know a
66:11
lot of the starter information that goes
66:13
into social networks ranges from
66:16
extremely positive and constructive
66:18
to just neutral and nothing
66:21
much so there might be people who are
66:23
trying to better themselves maybe
66:24
they’re trying to help each other with
66:26
health information or something like
66:27
that
66:28
then all this information starts they’ll
66:31
say what we’re gonna forward some of
66:33
this information to this person and that
66:34
person we'll try it 10 million times and
66:36
we’ll see if we get a rise from anybody
66:39
that ups their engagement now the people
66:41
who will be engaged quote-unquote
66:43
engaged are the ones who dislike that
66:45
information so all of a sudden you’re
66:47
getting juice from finding exactly the
66:49
horrible people who hate whatever the
66:50
positive people started off with and so
66:53
this is why you see this phenomenon over
66:55
and over again where whenever somebody
66:57
finds a great way to use a social
66:58
network they have this
66:59
initial success and then it’s echoed
67:01
later on by horrible people getting even
67:03
more mileage out of the same stuff so
67:04
you start with an Arab Spring and then
67:06
you get Isis getting even more mileage
67:08
out of the same tools you start with
67:10
black lives matter you get these
67:12
horrible racist these horrible people
67:16
who just are blackening America getting
67:18
even more mileage out of the same tools
67:20
it just keeps on happening and by the
67:24
way you start with Me Too and then you
67:26
get incels and Proud Boys and whatever
67:28
the next stupid thing's gonna be because
67:30
the algorithms are finding these people
67:32
as a matter of course introducing them
67:34
to each other and then putting them in
67:35
feedback loops where they get more and
67:36
more incited without anybody planning it
67:39
there’s no evil person sitting in a
67:41
cubicle intending this or at least I
67:44
would be very surprised to find somebody
67:46
like that I know a lot of the people in
67:49
the different places and I just don’t
67:51
believe it I believe that we backed
67:54
ourselves into this weird corner and
67:56
we’re just not able to admit it and so
67:58
we’re just kind of stuck in this stupid
68:00
thing where we keep on doing this to
68:01
ourselves so what you end up with is
68:06
electorates that are driven you have
68:09
like enough of a percentage of people
68:11
who are driven to be a little cranky and
68:14
paranoid and a little irritated and they
68:17
might have legitimate reasons I’m not
68:18
saying that they’re totally disconnected
68:20
from real life complaints but their way
68:22
of framing it is based on whatever the
68:24
algorithms found could be forwarded to
68:26
them that would irritate them the most
68:27
which is a totally different criteria
68:29
than reality so whatever it is and so if
68:34
it’s in the case of the synagogue
68:37
shooter it was one set of things in
68:40
the case of the pipe bomber guy it was
68:41
another thing in the case of the guy who
68:43
set up the ... but it's all similar it's all
68:45
part of the same brew of stuff that
68:46
algorithms forward now in some cases the
68:51
algorithms might have tweaked the
68:53
messages a bit because the algorithms
68:54
can do things like play with fonts and
68:56
colors and timing and all kinds of
68:57
parameters to see if those have a
68:59
slight effect on how much of a rise they
69:01
can get but typically they come from
69:03
people who are also
69:05
just trying to get as much impact as
69:08
possible and I think what I think what’s
69:11
happened is we’ve created a whole world
69:13
of people who think that it’s honorable
69:18
to be a terribly socially insecure
69:21
nitwit who feels that the world is
69:23
against them and is desperate to get
69:24
attention in any way and if they can get
69:26
that attention that’s the ultimate good
69:28
and the president acts that way a lot of
69:31
people act that way
69:33
that's what Musk was doing and I could
69:36
name many other figures and I think what
69:38
happens is these people become both the
69:40
source of new data that furthers the
69:41
cycle and of course it drives them and
69:44
so that there’s sort of multiple levels
69:48
of evil that result from this the
69:50
obvious one is these horrible people who
69:54
make our world unsafe and make it make
69:57
our world violent and break our hearts
69:59
and just keep on doing it over and over
70:01
again and there's just this awful sense that
70:03
just random people are self-radicalized
70:06
and turning themselves into the heart of
70:08
the most awful version of a human
70:09
imaginable but there aren’t that many of
70:12
them in absolute numbers and as I said
70:14
earlier in terms of absolute amounts of
70:17
violence there’s actually an overall
70:18
decrease in the world despite all this
70:20
horrible stuff with some notable
70:22
exceptions like with Isis in the
70:24
Middle East and so forth
70:25
but overall you know actually that’s
70:27
that’s true however the second evil is
70:31
the one that I think actually threatens
70:33
our overall survival and that is the one
70:35
of gradually making it impossible to
70:38
have a conversation about reality it’s
70:41
really become impossible to have a
70:44
conversation about climate it’s become
70:46
impossible to have a conversation about
70:48
health it’s become impossible to have a
70:51
conversation about poverty it’s become
70:53
impossible to have a conversation about
70:55
refugees it’s become impossible to have
70:58
a conversation about anything real it’s
71:03
only become possible to have
71:05
conversations about what the algorithms
71:07
have found upsets people and on the
71:09
terms of upsetting because that’s the
71:11
only thing that’s allowed to matter
71:15
and that is terribly dark that is
71:19
terribly dark and terribly threatening
71:21
and the scenario I worry about is
71:25
I mean it’s conceivable that some sort
71:30
of repeat of what happened it’s hard for
71:34
me to even say this but some sort of
71:35
repeat of what happened in the late 30s
71:37
in Germany could come about here I can
71:39
imagine that scenario I can imagine it
71:42
vividly because my own grandfather
71:43
waited too late in Vienna and my mother
71:46
was taken as a child and survived the
71:49
concentration camp so I feel it very
71:51
keenly having a daughter myself and yet
71:56
I don’t think that’s the most likely bad
71:58
scenario here I think the more likely
72:00
bad scenario is that we just put up with
72:03
more and more shootings more and more
72:07
absolutely useless horrible people
72:10
becoming successful in one
72:12
theatre or another whether politicians
72:13
or company heads or entertainers or
72:17
whatever and gradually we don’t address
72:21
the climate gradually we don’t address
72:24
where we’re gonna get our fresh water
72:26
from gradually we don’t address where
72:28
we’re gonna get a new antibiotics from
72:30
gradually we don’t wonder how we’re
72:33
gonna stop the spread of viruses vaccine
72:37
paranoia is another one of these stupid
72:39
things that spread through these
72:41
channels gradually we see more and more
72:43
young men everywhere turning themselves
72:45
into the most jerky version of a young
72:47
man sort of various weenie
72:51
supremacy movements under different
72:53
names from you know Gamergate to
72:57
incels to alt-right to Proud Boys to
73:00
whatever this is going to be like this
73:02
endlessly and then gradually one day
73:04
it’s too late and we haven’t faced
73:06
reality and we no longer
73:10
have agriculture we no longer have our
73:12
coastal cities we no longer have a world
73:15
that we can survive in and that is you
73:21
know it’s a kind of a what I worry about
73:23
is a terribly stupid cranky undoing
73:27
for us not a big dramatic one
73:30
it’s neither a whimper nor a bang but
73:33
just sort of a cranky rant that could be
73:38
our end and is that a laugh line I don’t
73:44
know you guys are pretty dark anyway so
73:51
what to do about it
73:53
so here my characterization of the
73:57
problem overlaps strongly with a lot of
74:00
other people’s characterizations of the
74:02
problem mine is perhaps not identical to
74:06
the problem as described by many others
74:08
but there's enough overlap that I think we
74:11
have a shared we meaning many people who
74:13
hope to change us have a shared sense of
74:15
what's gone wrong now the first thing I
74:17
want to say in terms of optimism is wow
74:19
things are better than they used to be if
74:21
I had been giving this talk even a few
74:24
years ago not long at all ago I would
74:28
have been giving the talk as a really
74:30
radical fringe figure who was saying
74:31
things that almost nobody accepted who
74:33
had lost friends over these ideas and
74:36
who was really kind of surviving on the
74:40
basis of my technical abilities in my
74:42
past rather than what I was saying
74:44
presently because it was so unpopular
74:45
lately especially since I would say
74:48
like Brexit and Trump but also just in
74:51
general like studies showing the
74:52
horrible increase in suicides in teen
74:55
girls that scale with their social
74:58
media use all these horrible things that
74:59
have come out now something that's
75:02
really different in Silicon Valley there
75:05
are genuinely substantial movements
75:07
among the people in the companies to try to
75:09
change their act regulators at least in
75:12
Europe are starting to get teeth and
75:13
really look at it seriously the tech
75:17
companies are trying to find a way to
75:19
get out of the manipulation game they
75:22
haven’t necessarily succeeded and not
75:24
all of them are trying but some of them
75:26
are
75:27
and it’s a different world it’s a world
75:29
with a lot of people who are engaged so
75:31
now having presented the problem as
75:34
I see it it’s possible to talk about the
75:38
solution now a lot of folks feel the
75:41
solution should be privacy rights the
75:44
European regulators are really into that
75:46
we had a major conference on that in
75:48
Brussels last week where Tim Cook who
75:52
runs Apple gave a fire-breathing talk
75:53
that kind of sounded like a talk I might
75:55
have given at some point in the past I
75:57
gave a talk there too and I was like wow
76:01
I'm not the radical anymore it's very
76:02
straight in a way in a way I kind of
76:04
mourn the loss of radicalness because
76:08
some part of me likes being the person
76:10
like at this outer edge and I’m not and
76:12
it’s kind of like oh god I’m supposed to
76:14
be the radical but anyway I think
76:20
it’s great that the Europeans are
76:21
pushing for privacy the theory behind
76:24
that is that the harder it is
76:29
for the manipulation machine to get at
76:31
your data the less it can manipulate and
76:33
the more maybe there’s a chance for
76:36
sanity there’s a peculiar race going on
76:39
because the societies in Europe that
76:42
support regulation and have
76:45
regulators with teeth which we really
76:47
don’t have much of in the u.s. right now
76:48
are themselves under siege by the
76:51
the cranky political parties who are
76:54
sometimes called right-wing populist but
76:56
I think should be just called you know
76:59
the crank parties and the cranky
77:02
parties might bring these societies down
77:04
so there's a race can the regulators
77:07
influence the technology in time to
77:09
preserve themselves or will the
77:11
technology destroy their politics before
77:13
they have a chance so
77:16
that’s a race going on right now it’s
77:17
quite dramatic and I wouldn’t know how
77:19
to handicap it now the privacy approach
77:23
is hard because these systems are
77:27
complicated like if I say okay here’s
77:30
click on this button to consent to using
77:32
your data for this like I obviously
77:34
can't read these
77:36
things and even if there's some kind of
77:38
better regulation supporting it it’s
77:40
just nobody understands that even the
77:42
companies themselves don’t understand
77:43
their own data they don’t understand
77:44
their own security they don’t I mean
77:46
like this whole thing is beyond all of
77:48
us nobody's really doing it
77:50
that well everybody’s having data
77:52
breaches and discovering suddenly that
77:54
they were using data they didn’t think
77:56
they were using that’s happened
77:57
repeatedly at Google and Facebook in
77:59
particular so I’ve advocated a different
78:02
approach which is instead of using
78:08
regulators to talk about privacy get
78:11
lawyers and accountants to talk about
78:13
lost value from your data being stolen
78:15
now I have several reasons for that one
78:19
is I don’t think we’ll ever lose our
78:22
accountants and our lawyers I think
78:24
they’re more persistent than our
78:26
regulators that’s one reason and I’m not
78:30
going to do lawyer jokes because the
78:34
whole society's become so
78:35
mean-spirited I don't like to make jokes
78:37
about classes of people even lawyers
78:38
anymore and you know some of my best friends
78:43
are lawyers you know but anyway let
78:52
me give you an example that I like to
78:55
use to explain the economic approach
78:57
here there’s a tool online that I happen
79:01
to use frequently that I really like
79:03
which is automatic translation between
79:05
languages if you want to look at a
79:06
website in another language or send
79:08
somebody now you can go online and there
79:10
are at least two companies that do this
79:12
pretty well now Microsoft and Google you
79:14
enter your text in one language a usable
79:16
translation comes out on the other side
79:18
convenient free great modernity however
79:24
here’s an interesting thing it turns out
79:28
that languages are alive every single
79:30
day there’s a whole world of public
79:31
events all of a sudden today I have to
79:35
be able to talk about the Tree of Life
79:36
shooter and you have to know what I mean
79:38
all of a sudden today I have to be able
79:40
to talk about the MAGA bomber you need
79:42
to know what I mean so every single day
79:44
there all of these new reference points
79:46
that come out lately often horrible ones
79:48
sometimes nice
79:49
ones maybe a new music video and a new
79:53
meme that people like whatever so every
79:56
single day those of us who help maintain
79:59
such systems have to scrape meaning
80:02
steal tens of millions of example phrase
80:04
translations from people who don’t know
80:06
it’s being done to them so there are
80:08
tens of millions of people who are kind
80:10
of tricked into somehow translating this
80:12
phrase or that phrase and Google and
80:14
Microsoft have to grab these things and
80:15
incorporate them to update their systems
80:17
to make them work but at the same time
80:20
the people who are good at translating
80:22
are losing their jobs
80:24
the career prospects for a typical
80:27
language translator have been decimated
80:30
meaning they're a tenth of what they were
80:31
following exactly the pattern of other
80:34
information based work that’s been
80:36
destroyed by the everything must be free
80:39
movement recording musicians
80:41
investigative journalists crucially
80:43
photographers all of these people are
80:46
looking at about a tenth of the career
80:48
prospects that they used to have that’s
80:50
not to say that everything's bleak
80:51
there are examples in each case of
80:54
a few people who find their way and this
80:57
gets to a very interesting technical
80:58
discussion which I won't go into but you get
81:03
a Zipf curve where there are a few
81:03
successful people and then it falls to
81:04
nothing whereas before you
81:06
had a bell curve but I can if anybody
81:08
wants to know more about that I can but
81:09
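An editorial aside the speaker offers to expand on: the contrast between a Zipf (power-law) outcome and a bell-curve outcome can be sketched numerically. This is a hedged illustration, not the speaker's own model; the 10,000-person population, the exponent, and the function names are assumptions made for the example.

```python
# Illustrative sketch (assumption: a pure Zipf law with exponent 1 over
# 10,000 careers) contrasting winner-take-most with an even spread.

def zipf_share(rank, n, s=1.0):
    """Share of the total going to the person at `rank` under a Zipf law."""
    weights = [1 / (r ** s) for r in range(1, n + 1)]
    return (1 / (rank ** s)) / sum(weights)

def even_share(rank, n):
    """Stand-in for the bell-curve world: most people sit near the mean."""
    return 1 / n

n = 10_000
top_1_percent_zipf = sum(zipf_share(r, n) for r in range(1, n // 100 + 1))
top_1_percent_even = sum(even_share(r, n) for r in range(1, n // 100 + 1))

# Under Zipf, the top 1% capture roughly half the total; under the
# even spread, the top 1% capture about 1% of it.
print(f"Zipf: top 1% take {top_1_percent_zipf:.0%} of the total")
print(f"Even: top 1% take {top_1_percent_even:.0%} of the total")
```

The point of the sketch matches the talk: in the Zipf world a handful of people do very well and the curve "falls to nothing" for everyone else, whereas a bell curve supports a broad middle class.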
anyway you have a tiny number of
81:11
successful people but almost everybody
81:13
has lost their careers now wouldn’t it
81:15
make more sense if instead of making
81:20
money by providing free translations in
81:22
order to get other people who are called
81:24
advertisers to manipulate the people who
81:26
need the translations in some sneaky way
81:28
that they don’t understand and make the
81:31
whole world more cranky and less reality
81:32
oriented instead of that what if we went
81:37
to the people providing these phrase
81:39
translations and we just told them you
81:41
know if you could just give us the
81:42
phrase translations we really need then
81:45
our system would work better and we’d
81:47
pay you because then we’d have a better
81:48
system and then if we went to the people
81:50
who need translations and say free isn’t
81:53
really quite working because that
81:55
means we have to get these other
81:56
people to manipulate you to have a
81:58
customer but we’ll make it really cheap
82:00
what about a dime
82:01
a translation or something like that we
82:03
worked out some kind of a system where
82:05
the people who provide the translations
82:07
meet each other because it’s a network
82:08
we can introduce them they form a union
82:11
they collectively bargain with us for a
82:13
reasonable rate so that they can all
82:15
live put their kids through school and
82:17
then we get better working translators
82:20
and yeah you pay a dime you can afford a
82:22
dime and everybody's happier
82:24
now there are a few things about this
82:26
that are really good in my point of view
82:28
one is we no longer have these people
82:30
from the side paying to manipulate
82:32
people everything's become clear two we
82:35
have a whole class of people making a
82:36
living instead of needing to go on the
82:38
dole instead of saying oh we need this
82:40
basic income because everybody is
82:41
worthless three we're being honest instead
82:44
of lying which is a really big deal
82:46
right now we have to lie because we’re
82:49
not telling the people that we're taking
82:50
their data we’re telling them oh you’re
82:52
buggy whips you’re worthless
82:53
but in secret we need you that’s a lie
82:56
and four there's kind of a spiritual
82:59
thing here where we’re telling people
83:01
honestly when they’re still needed like
83:03
to tell people oh actually you’re
83:05
obsolete the robots have taken over your job
83:08
when it’s not true when we still need
83:10
their data there’s something very cruel
83:12
about that it cuts to some sort of issue
83:14
of dignity and human Worth and it really
83:16
bothers me so for all these reasons this
83:18
seems like a better system to me and
83:21
sure we’d have to make accommodations
83:22
for those who can’t afford whatever the
83:24
rate would be for the language
83:25
translation but we can do that we’ve
83:27
almost figured out ways to do that if
83:28
we’re a decent society and we’d be a
83:30
more decent society because we wouldn’t
83:32
have an economy that’s strictly run on
83:35
making people into assholes so that's
83:41
why I advocate the economic approach so
83:44
I know it's bad form but can I refer
83:47
you to a paper to read go look up
83:50
something called Blueprint for a Better
83:52
Digital Society I'm sorry about the
83:54
title I didn’t make it up it’s an
83:56
editor’s fault Adi Ignatius it’s your
83:59
fault Adi and it’s a Harvard Business
84:01
Review recently you can find it online
84:02
very easily Blueprint for a Better Digital
84:05
Society and it's the latest version
84:07
about how to make this thing work and a
84:09
little bit about how to transition to it
84:13
so so that’s the solution I’ve been
84:16
exploring and promoting I think there’s
84:19
room for a lot of solutions another idea
84:22
is people like the Center for Humane
84:25
Technology which is Tristan Harris and
84:27
another group called Common Sense Media
84:28
are trying to educate individuals about
84:31
how to be more aware of how they are
84:33
manipulated and how to make slight
84:35
adjustments to be manipulated a little
84:37
less worth trying remember it’s a sneaky
84:42
machine the whole industry is based on
84:43
fooling you so staying ahead of it is
84:45
gonna be work you can’t just do it once
84:47
and think you’re done it’ll be a
84:48
lifetime effort that’s why I think you
84:49
should just quit the things yeah when I
84:53
say can you please delete all your
84:55
social media accounts surely one of the
84:58
first thoughts and all your minds is
85:00
well that’s ridiculous I mean you’re not
85:02
going to get billions of people to
85:04
suddenly drop these things there’s
85:06
there’s two reasons why you’re correct
85:09
if you have that thought one
85:14
is that you’re addicted this is an
85:16
actual addiction you can’t just go to
85:18
somebody with a gambling addiction and
85:19
say oh just stop you know any more than
85:21
you can do that if they have a heroin
85:23
addiction that’s not how addiction works
85:25
you can't just say no it's a problem it's
85:27
hard
85:27
addiction is hard all of us have
85:29
addictions none of us are perfect but
85:31
this particular one's destroying our
85:33
future it’s really bad it’s not just
85:34
personal we hurt each other with this
85:36
one in an exceptional way so another
85:41
reason is network effect and that means
85:43
everybody already has like all their
85:45
pictures and all their past and all
85:47
their stuff on these properties that
85:49
belong to companies like Facebook and
85:50
for everybody to get off it all at once
85:53
so they can continue to have connections
85:54
with each other is a coordination
85:56
problem that’s essentially impossible at
85:57
scale so that’s that’s a network effect
86:00
problem so why am I asking people to do
86:03
something that can only happen a little
86:05
and the reason why is even if it only
86:08
happens a little it’s incredibly
86:09
important so let me let me draw a
86:12
metaphor to some things that have
86:14
happened in the past we have in the past
86:17
had mass addictions that were tied to
86:24
corrupt commercial
86:25
motives at a large scale one
86:28
example is the cigarette industry
86:32
another example is big alcohol alcoholic
86:36
beverages I could mention others lead
86:39
paint is one I bring up in the book now
86:41
in these cases well actually the lead
86:44
paint wasn't an addiction thing so I'll
86:45
leave out lead paint so let's
86:47
just talk about cigarettes and in the
86:50
case of cigarettes when I was growing up
86:52
it was almost impossible to challenge
86:55
cigarettes
86:56
I you know like cigarettes were manly
86:59
they were cool if you were on the red
87:02
side of America they were the cowboy
87:03
thing if you were on the blue side they
87:06
were the cool beatnik thing everybody
87:07
had a cigarette and you just couldn’t be
87:10
cool without your cigarette but enough
87:12
people finally realize that they could
87:14
get out from under it that at least it
87:16
allowed a conversation the addict will
87:19
defend it if you talk to somebody who’s
87:21
really addicted to cigarettes it’s very
87:22
hard for them to really get a clear view
87:25
of what the cigarette means to society
87:27
what it means to have cigarettes in
87:28
public spaces there was a time in this
87:32
room would have been filled with
87:33
cigarette smoke and we would have been
87:35
gradually killing the students who were
87:36
attending I think I’m coughing in
87:42
sympathy with remembering what that was
87:43
like because it was really horrible
87:48
alcohol Mothers Against Drunk Driving
87:50
was or drunk drivers I forgot which it
87:53
is has been one of the most effective
87:55
political organizations they changed
87:57
laws they changed awareness they changed
87:59
outcomes and saved an enormous number of
88:01
lives despite the fact that once again
88:04
alcohol is cool it’s supposed to be cool
88:06
to drink at a frat party it's supposed to
88:07
be cool to drink at your fancy
88:09
restaurant everybody loves drinking and
88:11
there's this whole world of
88:13
advertising liquor we found a reasonable
88:17
compromise in both cases we don’t throw
88:20
people who drink or smoke cigarettes in
88:23
jail like we’ve done for marijuana for
88:25
years instead we came up with a
88:28
reasonable policy don’t do it in public
88:30
don’t do it behind the wheel it worked
88:32
that was only possible because we had
88:36
enough people who were outside of the
88:38
addiction system to
88:39
have a conversation in this case we
88:42
don’t have that in this case all the
88:44
journalists who should be helping us are
88:46
addicted to Twitter and making fools of
88:47
themselves if you’re a journalist in
88:50
this room you know I’m telling the truth
88:54
the same for politicians same for public
88:57
figures celebrities who might be helpful
88:59
we need to create just a space to have a
89:04
conversation outside of the addiction
89:06
system now you might be thinking oh my
89:08
god I’ll destroy my life if I’m not on
89:10
these things I don’t think it’s true I
89:13
think if you actually drop these things
89:14
you suddenly discover you can have any
89:15
life you want I’m not claiming that I’m
89:18
the most successful writer or public
89:21
speaker but I’m pretty successful I have
89:22
best-selling books I get around I you
89:25
know you hired me to come talk to you
89:29
and I’ve never had an account on any of
89:31
these things and you could say oh but
89:33
you’re an exception in this way well I
89:34
mean how much of an exception can I be I
89:37
play any points against me I’m like this
89:39
weirdo and I still no seriously
89:42
you know I mean I still can do it if I
89:44
can do it probably other people can do
89:45
it too
89:46
I just don’t I think that there’s this
89:48
illusion that your whole life like
89:50
you'll just be erased if
89:52
you’re not on these things but that
89:53
illusion is exactly part of the problem
89:55
that's exactly part of this weird
89:59
existential insecure need for attention
90:05
at any cost bizarre personality
90:08
dysfunction that’s destroying us just
90:10
give it a rest
90:11
now here’s what I would say there was a
90:13
time when if you were young
90:17
especially one of the priorities that
90:20
you felt in your life was to know
90:21
yourself and the only way to know
90:23
yourself is to test yourself and the way
90:25
you test yourself is maybe you’d go
90:26
trekking in the Himalayas or something I
90:30
used to hitchhike into central Mexico
90:32
when I was really young just a really
90:34
young teenager and that’s how I tested
90:36
myself these days I think the
90:39
similar idea would be quitting your
90:41
social media and really deleting it like
90:43
you can’t you can’t like quit Facebook
90:45
and keep Instagram that’s you
90:46
have to actually delete the whole
90:48
thing
90:49
and then it doesn't mean you're
90:52
doing it for your whole life
90:53
delete everything and then stay off
90:56
stuff for six months okay if you’re
90:59
young you can afford it it will not kill
91:01
you and then after six months you will
91:03
have learned and then you make a
91:05
decision in my opinion you should not
91:08
harm your life for the sake of the ideas
91:11
I’ve talked about today if it’s really
91:13
true that your career will be better or
91:15
whatever through using these things then
91:18
you need to follow your truth and do
91:21
what makes you succeed and if it’s
91:22
really true that being a serf to some
91:24
stupid Silicon Valley giant is the thing
91:27
that helps your career okay but you have
91:32
to be the one making that decision and
91:33
if you haven’t tested yourself you don’t
91:36
have standing to even know so I’m not
91:40
telling you what’s right for you but I
91:41
demand that you discover what’s right
91:44
for you that I think is a fair demand
91:47
given the stakes and with that cheerful
91:51
closing I will call it
91:53
[Applause]
92:02
[Music]
92:06
so we have is that is a mic on so do we
92:10
have the question set people well we
92:14
were gonna have cards I don’t know if
92:15
any cards have made their way here’s a
92:17
card cards okay
92:19
I’m actually an unclear on how this
92:22
whole thing works okay well this is it
92:28
alright so normally I would get a bunch
92:32
of cards but but but I haven’t that
92:34
hasn’t happened yet okay lately I’ve
92:36
noticed that I was getting progressively
92:38
more cranky that’s now a technical term
92:41
I think you’ve introduced along with a
92:43
virtual-reality cranky from a lack of
92:46
sleep because of the excessive blue
92:48
light given off by screens ah have you
92:51
factored this effect into your theory oh
92:55
yeah
92:56
well there’s the time and stuff like
92:57
that there's more as soon as I
93:02
put blue filters on my screens I got a
93:03
lot less cranky okay there the the
93:08
problem of blue light keeping you up and
93:11
those are all real problems and in fact
93:13
you might want to just turn colour off
93:15
on your computer definitely turn colour
93:17
off on your phone and all seriousness
93:18
you don’t need it
93:19
for most things I have I have color I
93:23
use a phone but I definitely turn colour off
93:24
and like make those changes if you know
93:28
if you notice something like that yeah
93:32
you can right you can turn off the blue
93:38
light on your computer you can go into a
93:40
setting and you know the best way to do
93:43
it go to the visual accessibility
93:45
settings because they have these high
93:46
contrast settings for people who have
93:49
trouble focusing and they get rid of
93:50
color and become stark contrast as an
93:54
example oh for God’s sakes I have to
94:00
enter this I was going to show you
94:01
what it looks like but I’m not going to
94:02
bother with a code anyway you just you
94:04
can do it every major platform has this
94:06
ability it’s really that and you should
94:08
do it go to Common Sense Media org or to
94:15
Center for humane technologies website
94:17
and both of them have advice on how to
94:21
do things like this and another thing is
94:23
both I’m pretty sure both Windows and
94:26
Mac if it’s a computer have ways to make
94:28
the blue light go away as the evening
94:29
approaches there’s like this kind of
94:31
stuff this is real stuff and you should
94:33
pay attention to it and the technology
94:35
should serve you and not drive you crazy
94:36
but I do have to say this is not an
94:38
existential threat this is this is at
94:41
the level of too much sugar and
94:42
breakfast cereals or something like that
94:43
it is actually a real issue it
94:46
does have an effect on the health of the
94:48
population but it’s not going to destroy
94:49
us this other stuff I’m talking about is
94:51
at another level okay
94:53
so rather than the cards do we have
94:57
cards we do okay great let’s give some
95:02
cards is that your card oh okay great
95:07
okay here’s my card isn’t there a design
95:09
problem for publishing online if you
95:11
know who’s pointing at you how is that
95:14
related to the problem Alan Turing
95:16
faced — Turing; the card says turning — he hatched
95:20
the concept of a machine-like
95:22
personality isn’t that to software what
95:25
listening and compassion is to human
95:27
communication yeah it’s a kind of
95:30
interesting question to me when I read
95:34
Turing’s final notes that the Turing
95:37
test comes up twice it comes up in a
95:39
little monograph he wrote and it comes
95:40
up in a sort of a little note there’s
95:43
two statements of it and in both of them
95:45
to me reading them there’s just this
95:49
profound sadness I feel like this is
95:51
this person who’s just screaming out so
95:55
some of you might I don’t know there’s a
95:56
whole history to this thing that what
95:58
Turing did is he created a metaphor oh boy
96:02
let me try to do this as fast as I can
96:04
Turing did as much as anybody to defeat
96:07
the Nazis in World War Two by
96:10
using one of the first computers that
96:11
ever existed to break a Nazi secret code
96:13
called Enigma and he was considered a
96:18
great war hero however he lived an
96:22
identity that was illegal at that time
96:24
which is that he was gay
96:25
and he was forced by the British
96:27
government after the war to accept a
96:30
bizarre quack treatment for being gay
96:32
which was to overdose on female sexual
96:35
hormones with this bizarre idea that
96:37
female hormones would balance his
96:39
oversexedness which was supposed to be the
96:41
gay it’s like so stupid it’s hard to
96:43
even repeat it and he started developing
96:46
female physiological characteristics as
96:48
a result of that treatment and it he
96:53
committed suicide by a sort of a weird
96:56
poetical thing where he laced an apple
96:58
with cyanide and ate it next to the
96:59
first computer a sort of anti-Eve or
97:03
something and he was a very brilliant
97:05
and poetic man and in the final couple
97:08
of weeks of his life he came up with
97:10
this idea of repurposing an old
97:14
Victorian parlor game that used to be
97:18
this thing we’d have a man and a woman
97:21
behind a curtain or a screen of some
97:26
kind and all they could do is pass
97:30
little messages to a judge and the judge
97:32
would have to tell who’s the man and
97:33
who’s the woman and each of them might
97:35
be trying to fool the judge which is
97:36
kind of a weird if you think about it
97:38
the Victorians were pretty kinky and
97:40
bizarre and and so what you’re doing is
97:45
as with behaviorism and as for the
97:47
internet you’re slicing away all of
97:48
these factors and just turning it into
97:50
like this limited stream of information
97:51
so it’s kind of like tweeting or
97:53
something and that so what Turing said
97:56
is what if you got rid of the woman and
97:58
you had a man in a computer and the
98:00
judge couldn’t tell them apart wouldn’t
98:03
then finally you have to admit that the
98:05
computer should be given rights and given
98:07
stature and be treated well and when you
98:10
read it the way I read it is
98:13
it’s this person saying oh my god I
98:14
figured out how to save the world from
98:16
these people who wanted to destroy
98:18
everybody based on being of the wrong
98:19
identity these people who wanted to kill
98:22
not only gays but of course Jews and
98:24
Gypsies and and and black people and
98:27
these horrible people and I came up with
98:30
this way of defeating them and now
98:31
you’re destroying me for who I am
98:33
and I feel like there’s this kind of
98:37
astonishing sadness in it
98:40
and so that was the
98:43
birth of the idea of artificial
98:44
intelligence and I feel like the way
98:46
it’s remembered is completely unlike
98:48
what it’s like to read the original you
98:50
know I feel like if you look at the have
98:51
you ever read the original Turing
98:53
because if you read the original Turing
98:54
I mean it’s like it’s intense you know
98:58
here’s this person who’s being tortured
98:59
to death it’s like it’s not some kind of
99:02
nerdy thing at all it’s it’s a it’s a
99:05
difficult it’s difficult to read the
99:07
documents and I think it was like this
99:12
crazy I think he knew he was about to
99:15
die and I think he was reaching out for
99:17
some sort of a fantasy of what kind of a
99:20
thing what would it take for people to
99:23
not be cruel what would it take and I
99:26
think in this very dark moment he
99:28
thought maybe giving up humanity
99:30
entirely and we’ll just maybe if we’re
99:32
just machines maybe we won’t do this to
99:35
ourselves and the thing about that of
99:37
course is we’ve turned ourselves sort of
99:40
into machines because we’ve all kind of
99:42
acting like machines to be able to use
99:44
this stuff you’re all sitting there all
99:45
day entering your like little codes to
99:47
get online that you’re sort of turning
99:49
into machines in practice and yet we’ve
99:51
just become more and more cruel like
99:53
that that’s the the ultimate irony is
99:55
that it didn’t help so that’s my take on
99:58
it and this idea that AI is some could
100:02
be some form of compassion I think it’s
100:05
kind of I think it’s really
100:06
just a way of stealing data from people
100:07
who should be paid to translate. AI is
100:11
theft
100:11
to paraphrase. Anyway, okay — AI is
100:23
just a way — look, all we can do with
100:27
computers ever — to be a good
100:30
technologist you have to believe that
100:33
people are sort of mystically better
100:35
than machines otherwise you end up with
100:38
gobbledygook and nonsense you can’t
100:40
design for machines so AI has to be
100:43
understood as a channel for taking data
100:45
from one person to help another
100:47
I take I take the data from the
100:50
translators and I apply it through a
100:52
machine learning scheme or some kind of
100:54
scheme and I can get translations that
100:56
help people in a better way than I could
100:59
without that scheme in between which is
101:01
wonderful
101:02
so it’s technology to help people
101:04
connect in a way that’s more helpful if
101:05
you understand AI that way you elevate
101:08
people and you don’t confuse yourself
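A minimal sketch of that framing — machine "translation" as a channel for data contributed by human translators — might look like the toy lookup below; the phrase pairs and function names are invented for illustration, not Lanier's actual proposal or any real system:

```python
# Toy "translation memory": reuse translations contributed by human
# translators -- a minimal illustration of the idea that AI channels
# data from some people to help other people. The phrase pairs here
# are invented examples.
memory = {
    "good morning": "buenos días",
    "thank you": "gracias",
    "see you tomorrow": "hasta mañana",
}

def translate(phrase, memory):
    """Return a stored human translation, or None if no human has
    contributed one yet (in Lanier's framing, contributors would be
    credited and paid)."""
    return memory.get(phrase.lower().strip())

print(translate("Thank you", memory))  # -> gracias
```

The point of the sketch is that every output traces back to a named human contribution, which is what makes "AI as a channel between people" legible rather than mystical.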
101:11
okay yeah we we we don’t have a lot of
101:13
time and we have a lot of great
101:15
questions so some questions are not
101:16
going to be able to be answered now
101:19
although I want to mention that there
101:21
will be a book signing and book
101:23
purchasing outside after the event is
101:27
over there’ll be two tables please if
101:29
you want to ask me a long question you can’t
101:31
come up to me and ask
101:32
some open-ended giant question while I’m
101:34
signing your book that would take too
101:35
long you really can’t do that buy
101:37
a book the people who are selling the books
101:39
have asked that you buy a book first
101:41
before you have it signed and on that note
101:48
I’ll segue into there’s a couple
101:49
questions that are connected to this how
101:51
about your market solution arguably the
101:55
mess we’re in now comes from the
101:56
monopolistic and manipulative tendencies
101:58
inherent in markets given that the world
102:01
has never known pure markets what
102:03
would keep this one pure oh it’s not
102:05
going to be pure it’s going to be
102:07
annoying and unfair and horrible but the
102:09
thing about it is it won’t be
102:10
existentially horrible
102:12
the thing about markets — what I
102:15
believe about economic philosophies is
102:17
there’s never been one that’s worked out
102:19
in practice and instead just as with
102:21
moral philosophies and theories of how
102:24
we learn and many many other areas where
102:26
we’re trying to deal with very complex
102:27
systems it’s not so much that we can
102:29
seek the perfect answer but we have to
102:31
trade-off between partial answers so to
102:33
me there’s never been a pure market
102:37
there’s never been and I don’t think
102:39
there ever could be but I think what you
102:41
can do is you can get a balance this was
102:43
the the Keynesian approach to economics
102:45
I think is very wise you
102:48
get a balance between reasonable
102:50
oversight and a reasonably
102:53
unfettered market and they’ll go through
102:54
cycles where the market will need help and
102:56
you just you trade off you trade
102:58
and I think that that’s that’s the only
103:01
path we have I think
103:04
being an ideologue for any solution to a
103:07
highly complicated problem is always
103:08
wrong okay just two more then and
103:11
there’s a couple like this as well
103:14
what about the connection force of
103:15
social media e.g. for the feminist
103:18
movement like Me Too these online
103:21
communities raise awareness and create
103:23
supportive communities and then many
103:25
people who rely on social media for
103:27
community because of the demands of
103:28
capitalist jobs yeah yeah it’s just
103:32
that’s all true except that backfires
103:34
and the backfire is worse than the
103:36
original so like what happened the it
103:40
just keeps on happening I mean like
103:41
before Me Too there was a
103:44
problem of diversity in the gaming world
103:47
and a few women in gaming just wanted to
103:51
be able to say one or two things and not
103:52
be totally invisible and and then the
103:54
result of that was this ferocious thing
103:56
called Gamergate that was just this
103:58
total never shut up totally wipe
104:01
everybody else out totally make it
104:02
everything horrible movement and then me
104:05
too has spawned this this other thing
104:09
that’s still rising which is the
104:10
incels and the Proud Boys and all this
104:12
stuff and the problem is that in these
104:15
open systems at first your experience of
104:19
finding mutual support and creating
104:21
social change is authentic it’s real it’s
104:23
just that there’s this machine you’re
104:25
not thinking about behind the scenes
104:26
that’s using the fuel you’re providing
104:29
in the form of the data to irritate
104:31
these other people because it gets even
104:33
more of a rise from them and you’re
104:35
creating this other thing that’s even
104:36
more powerful that’s horrible
104:38
even though it wasn’t your intent and
104:39
that’s the thing that keeps on happening
104:40
over and over again it doesn’t
104:42
invalidate the validity of the good
104:44
stuff that happens first it’s just that
104:46
it always backfires well not always but
104:48
typically and you end up you end up
104:53
being slammed and you don’t even like
104:55
one of the things that’s really bad
104:56
about it is that it’s you know it seems
104:59
like it’s just the fault of the creeps
105:01
who come up where it’s actually kind of
105:02
more the fault of the algorithms that
105:04
introduced the creeps to each other and
105:05
then got them excited in this endless
105:07
cycle of using your good intentions to
105:09
irritate the worst people so I mean
105:12
I know the thing is
105:14
Black Lives Matter was great I think I
105:16
mean I think it’s wonderful and yet the
105:18
reaction to it was horrible and of a
105:21
higher magnitude and I just think we
105:23
have to find unfortunately until we can
105:26
get rid of the advertising model and the
105:28
giant manipulation machine every time
105:30
you use the big platforms for any kind
105:33
of positive social effect it’ll backfire
105:35
and destroy you and it’s it’s a fool’s
105:38
game even though it’s valid at first in
105:41
the long term it’s a fool’s game okay
105:44
this I don’t like saying that I hate
105:45
saying that it breaks my heart this is
105:47
the last question and it’s existential
105:49
I’ll combine the two questions
105:53
here seeing how pernicious social media
105:55
has become by being hijacked toward
105:58
BUMMER and you know BUMMER is another
106:00
technical term you’re using yeah there’s a
106:04
wonderful writer on cyber things named
106:06
Sherry Turkle and she read my book and
106:09
she said oh I love this book but there’s
106:10
just too much tush in it
106:11
and the thing is because there’s like
106:13
BUMMER and there’s the cat’s behind on the
106:15
cover and stuff and the problem is I
106:19
married a woman who likes butt jokes and
106:21
I just can’t I don’t know some of they
106:23
just come I don’t know anyway okay so
106:29
how to how to guard against an immersive
106:32
technology like virtual reality becoming
106:35
even more insidiously BUMMER and then
106:38
how do you know what is real okay oh
106:41
well all right those are small questions
106:43
so the first one I mean I think the way
106:47
to keep virtual reality — virtual reality could
106:49
be super hyper creepy I wrote a book
106:52
about virtual reality that we haven’t mentioned
106:53
it’s called Dawn of the New Everything I
106:55
don’t know if they’ll have it upfront or
106:56
not but I talked a lot about that issue
106:58
so virtual reality could potentially be
107:00
creepy I think the way to tell whether
107:03
something’s getting creepy is whether
107:04
there’s a business model for creepiness
107:06
so if the way it’s making money is that
107:08
there’s somebody to the side who thinks
107:10
they can sneakily alter you or
107:11
manipulate you that’s the creepy engine
107:14
if there isn’t that person and if there
107:15
isn’t that business going on it’s less
107:18
likely to be creepy I think this is
107:19
actually that’s actually a pretty simple
107:21
question to answer I think it boils down
107:23
to incentives I think incentives run
107:25
the world as much or more than anything else
107:27
as for this question of how
107:29
do you know what’s real
107:30
the answer is imperfect what you do is
107:34
you struggle for it you struggle to do
107:37
scientific experiments to publish you
107:39
have to always recognize you can fool
107:40
yourself you have to recognize that
107:43
whole communities of people can fool
107:44
themselves and you just struggle and
107:45
struggle and struggle and you gradually
107:47
start to form a little island in a sea
107:50
of mystery in which you never have total
107:54
confidence but you start to have a
107:56
little confidence so there’s some things
107:58
that we can be confident of now the
108:01
earth is round — though not online — but do we
108:06
know it in an absolute sense no
108:09
you can never know reality absolutely
108:11
but you can know it pretty well and so
108:13
in order to talk about reality you have
108:16
to get used to near
108:21
perfection that is never actual
108:24
perfection and if you’re not comfortable
108:26
with that concept you have no hope of
108:27
getting to reality because that’s the
108:29
nature of reality reality is not
108:31
something you ever know absolutely and
108:32
in fact just to be clear I in one of my
108:35
books I defined reality as the thing
108:37
that can never be
108:38
measured exactly it’s the thing that can
108:40
never be simulated accurately it’s the
108:42
thing that can never be described to
108:44
perfection that is reality whereas
108:46
a simulation can be described
108:48
to perfection I can describe to you a
108:50
video game world or a virtual world to
108:52
perfection I can’t do that with reality
108:54
and the the thing is though that we
108:59
can’t demand absolute knowledge in order
109:01
to have any knowledge at all or else we
109:03
make ourselves into genuine fools we
109:05
have to be able to accept that we can
109:07
have better knowledge than other
109:08
knowledge it’s all an incremental sort
109:11
of eternal improvement project so the
109:16
people who demand absolute proof of
109:18
climate change are fools but they’re
109:21
interesting like I mean some of you
109:23
might have read there was a good history
109:25
published this week about the history of
109:28
the reading wars about how we learn
109:30
reading and there’s this community of
109:32
people who’ve just been absolutely
109:34
unable to accept a lot of scientific
109:37
evidence about how to teach
109:38
kids to read effectively because of an
109:40
ideology and they’re sincere and it’s
109:43
like people it’s really really hard
109:45
accepting reality as your life’s work
109:47
it’s really really really hard it’s it’s
109:50
not it doesn’t come naturally
109:52
necessarily it’s a discipline thank you
109:55
all right
109:56
[Applause]
110:03
[Music]
110:04
[Applause]

Matt Taibbi: The Press Does Not Do Bipartisan Scandals

00:00
hmm so Jamie pointed out this this
00:07
congressman — is that who it is — Jamie
00:11
pointed this out that there’s a
00:12
congressman and he released a series of
00:14
tweets and the first letter of all these
00:17
tweets if you put them all together it
00:18
says Epstein didn’t kill himself or did
00:20
not kill so notice it didn’t he did how
00:25
do you do the apostrophe
00:26
should have gone with did not darting
00:28
here with that evidence of a link
00:30
Rep. Paul Gosar what are the odds
00:33
this guy did this accidentally really
00:35
small right that’s kind of like one of
00:37
those monkeys typing Shakespeare say
00:39
yeah yeah I don’t think I could it could
00:41
work and the thing is he did it
00:43
backwards right so you didn’t see what
00:46
the puzzle was until the last tweet
00:48
because the last one he does is an E I got
00:50
a tweet from someone about 35 minutes
00:52
ago that I don’t know if there’s a bunch
00:54
of people online paying attention to it
00:55
or what but someone alerted me and a few
00:57
other people what it is yeah does he
00:59
have an image of that fucking that crazy
01:01
mask is that in his shit too
01:03
okay he’s got the
01:07
November fifth V mask yes yeah what is
01:10
that mask — V for Vendetta — it was a
01:12
representative of something
01:14
it’s the Guy Fawkes mask yes that’s
01:16
right right yeah so this guy’s uh he’s
01:19
he’s thinking along alternative lines of
01:21
thought but that is really an
01:22
interesting way of saying it alphabet
01:25
tree that’s yeah just making a bunch of
01:27
tweets don’t ever address it just leave
01:29
it there walk away you know Lewis
01:31
Carroll was famous for that was he yeah
01:33
that was one of uh he did a lot of sort
01:35
of tricks with words you read the book
01:38
Gödel, Escher, Bach no
01:39
yeah there’s a whole bunch of stuff in
01:41
there about people who used who put
01:45
puzzles in text mmm you know that’s kind
01:47
of a thing that people did I guess back
01:49
more in the 18th century and before
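As an aside, the acrostic trick described above — the first letter of each tweet, read in posting order, spelling a hidden phrase — can be sketched in a few lines of Python. The example "tweets" below are invented placeholders, not the congressman's actual posts:

```python
# Toy illustration of an acrostic: the first letter of each message,
# read in posting order, spells out a hidden phrase. These example
# "tweets" are placeholders invented for the demo.
tweets = [
    "Every vote counts.",
    "People want the truth.",
    "Some stories vanish fast.",
    "The press moved on.",
    "Evidence went missing.",
    "Investigations stalled out.",
    "Nobody followed up.",
]

def acrostic(messages):
    """Join the first character of each non-empty message."""
    return "".join(m[0] for m in messages if m)

print(acrostic(tweets))  # -> EPSTEIN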
01:51
this Epstein case is probably the most
01:54
blatant example of a public murder of a
01:59
crucial witness I’ve ever seen in my
02:00
entire life or anybody’s ever seen and
02:02
the the the minimal amount of outrage
02:05
about this the no minimal amount of
02:07
cover it’s fuckin fascinating I mean I
02:09
what’s amazing to me just as a you know
02:12
somebody works in the media
02:13
is that this was shaping up to be the
02:15
biggest like news story in history yes
02:18
and the instant he you know he died or
02:23
was died or however you want to call it
02:25
it this story just fell off the face of
02:27
the earth it’s like nobody’s doing
02:30
anything about it and I I don’t
02:32
100% understand that I mean I get it why
02:35
that’s happening but it’s it’s just
02:37
amazing
02:38
well when the woman from ABC what was
02:41
her name Amy — that lady, the one —
02:46
Amy Robach who had the frustrated
02:49
moment that she called it a frustrating
02:51
private moment right what she was
02:53
talking about having the scoop and
02:55
having that story and them squashing it
02:58
right like this this is all stuff that
03:01
everybody used to think was conspiracy
03:04
everybody thought this was stoner talk
03:06
this was you know you don’t I mean like
03:08
this is stuff where people said you’re
03:11
delusional you believe all kinds of
03:12
wacky conspiracies sure but the reality
03:15
is much less complicated well this is
03:16
not possible this is one of those things
03:18
it’s so obvious it’s so in everyone’s
03:21
face well there’s a couple of things
03:24
going on because there there are many
03:26
different ways than this can play out I
03:27
mean you could have a news director who
03:29
just sort of instinctively decides well
03:32
we can’t do that story because I might
03:34
want to have Will and Kate on later or I
03:35
might want to have this politician on
03:37
later and it’s it’s not like anybody
03:39
tells them necessarily that we can’t do
03:42
this but it’s too hot — if you grow
03:44
up in this system and you’ve been in the
03:46
the business for a long time you just
03:49
you have all these things that are
03:51
drilled into you and almost like the
03:52
cellular level about what you can and
03:54
cannot get into and I think there but
03:57
there were some explicit things that
03:59
happen with Epstein to I mean they keep
04:00
there there were a lot of news agencies
04:02
that killed stories about him that you
04:04
know and we’re hearing what some of them
04:05
in Vanity Fair this thing you know so
04:07
yeah it’s bad it’s
04:10
terrible yeah yeah when when I found out
04:13
that Clinton flew no less than 26 times
04:17
on a plane with Epstein I was like dude
04:20
I haven’t flown that many times with my
04:22
mom
04:23
how long did he know Epstein yeah I don’t
04:26
know but I mean they have that many
04:28
flights to have the Secret Service
04:29
people involved I mean that’s incredibly
04:33
bold what was he dealing with just girls
04:36
was Clinton that much of a hound that he
04:40
would go that deep into the well that
04:42
many times 26 times well that’s the
04:45
thing about the Epson story that makes
04:46
no sense to me like I I thought that the
04:48
percentage of people who were out and
04:50
out like perverts who had a serious
04:53
problem like with pedophilia or
04:55
whatever it was was pretty small you
04:57
know yeah but your but they had a lot of
05:00
people coming in and out of this
05:02
compound and and it just seems like it’s
05:04
a it’s a very strange story what were
05:07
they really up to I have I have no idea
05:09
and was was it all a blackmail scheme
05:10
it’s just it’s just so strange well
05:12
seems like the pedophilia aspect of it
05:14
might be directly connected to Epstein
05:16
himself like he might be the one that
05:19
has a problem with girls that are like
05:20
16 and he likes them very young or he did like
05:23
them but with the other guys it could
05:27
just be girls
05:28
it could be yeah yeah I mean that’s why
05:29
it’s so crazy like how could it be that
05:31
these but maybe it’s not but they must
05:34
but they knew who he was
05:35
yeah but they probably didn’t know the
05:37
extent of it probably not yeah up until
05:40
a point up until he was arrested right
05:42
and then they’re like oh well then
05:44
that’s when everybody backed off of him
05:46
right yes yeah I mean I’m not a hundred
05:48
percent yeah I haven’t covered this
05:50
story in depth I’ve only I only really
05:52
got into it a little bit you know because
05:59
it mixes a lot of things that are
06:00
very tough to cover yes you know the
06:03
intelligence world is very tough to
06:04
cover mm-hm
06:05
you know it’s it’s hard to get stories
06:07
out of there that they don’t want you to
06:08
have yeah and this is this is like the
06:11
mother of all stories and you know in
06:13
terms of that and they’re just little
06:15
little bread crumbs here and there that
06:17
whole thing about Acosta
06:18
you know the Vanity Fair quote
06:21
from him is that when
06:24
he looked at the case he didn’t pursue it
06:27
because I was told he belonged to
06:28
intelligence yes what does that mean
06:31
right now whose intelligence you know
06:32
what I mean like what agency but what
06:34
for you
06:35
and then you pair that with things like
06:37
you know I have friends on Wall Street
06:39
who told me I’ve never heard a single
06:41
instance of this guy actually having a
06:43
trade right you know so what was his
06:45
hedge fund doing you know I mean if you
06:47
think about it hedge funds a perfect way
06:48
to do blackmail well because you can
06:51
just have people putting money in and
06:52
out all the time and it would look like
06:55
investment yeah so very strange story
06:58
well Eric Weinstein had a conversation
07:00
with him you know Eric Weinstein with
07:02
Peter Thiel capital right yeah he’s like
07:04
this guy doesn’t know what the fuck he’s
07:05
talking about oh yeah he’s an actor
07:09
right this is nonsense right right that
07:11
was his initial almost instantaneous
07:13
response yeah yeah and and and what real
07:16
clients did he ever have did he ever
07:17
actually trade and where did he get a
07:19
billion dollars or whatever he had yeah
07:21
no it’s half a billion under management
07:23
yeah it’s ridiculous why the guy who
07:25
owns Victoria’s Secret give him a seventy
07:28
million dollar home right in New York
07:30
City like why I mean these are all
07:33
things that would have been really
07:34
interesting to get into you know he didn’t try to
07:37
kill himself — the suicide happened
07:39
to him like in The Wire all right oh
07:41
yeah yeah so unfortunate so
07:45
unfortunately the cameras died so
07:47
unfortunately he sustained an injury that’s
07:49
uh that you usually only get through
07:52
strangulation right yeah murder — he
07:54
fell on the ground and accidentally
07:56
broke his hyoid bone yeah big deal I
08:00
mean it’s so bizarre I can’t stand
08:03
conspiracy theories I’m one of
08:04
these people who doesn’t like
08:06
them but I can’t make this
08:08
story work in a way that isn’t you know
08:11
conspiratorial well that’s the thing
08:13
it’s like it gets to a point where you
08:15
like okay even Michael Shermer who runs
08:17
Skeptic magazine said wait a minute the
08:20
cameras were not working seems like a
08:24
conspiracy fucking when Michael Shermer
08:26
says he that guy doesn’t believe in
08:28
anything
08:28
right right he is fucking he’s down the
08:31
line on virtually every single thing
08:33
that’s ever happened he doesn’t believe
08:35
in any conspiracies well well how do you
08:37
what’s the innocent explanation there
08:39
is none that doesn’t make any sense you
08:41
can’t you can’t spin it in any way to
08:43
make it not a crazy conspiracies
08:46
especially when the brother
08:48
hired a doctor to do an autopsy oh yeah and it
08:51
says basically he was fucking murdered right
08:53
yeah Michael Baden the famous guy from
08:55
the HBO autopsy show right yep
08:57
absolutely
08:58
Oh craziness complete craziness and you
09:01
know it’s an example of the Epstein
09:07
story it’s interesting
because it’s about villains on both
sides of the aisle right this is a
classic is something I’ve written about
before is that the press does not like
to do stories where the problem is
bipartisan yeah right so when you have
an institutional problem when Democrats
and Republicans both share
responsibility for it when you know or
or if it’s an institution that kind of
exists in perpetuity no matter what the
administration is we don’t really like
to do those stories — Fox likes
to do stories about Democrats MSNBC
likes to do stories about Republicans
but the thing that’s kind of you know
09:42
all over the place they don’t like to do
09:43
that story Epstein is you know he’s
09:45
friends with Trump and and with Clinton
09:48
I mean yeah obviously has more friends
09:49
on the Clinton side but still and I
09:51
think that’s this is one of the reasons
09:53
why this story doesn’t have a lot of
09:55
traction in the media because neither
09:57
side really likes the idea of going too
10:00
deeply on it feels like to me well it’s
10:03
but the blatant aspect of it — I
10:06
mean the closest that we have to
10:08
that is the Jamal
Khashoggi murder that’s the closest
thing we have to a murder as blatant as
this one but it’s also so
insanely blatant but now you have
10:19
foreign actors that are involved in it
10:20
and they all disperse and then there’s
10:22
left with this confusion of to who’s
10:24
responsible for it well Saudi Arabia
10:27
that’s another example where you can’t
10:29
really say it’s you know one side’s fault —
both parties have been incredibly
complicit in their cooperation with the
Saudi regime and in you know the
massacres that are going on in Yemen it’s a
classic example of what Noam Chomsky
used to talk about with worthy and
unworthy victims right like if the
Soviet communists did it they were that
was bad but if death squads in El
Salvador killed a priest or a Catholic
priest you know then that that was
something we didn’t write about because
they were our client state Yemen is a
story we don’t write about Syria
is a story we do write about but they’re
11:03
really equivalent stories and yeah the
11:07
but you’re absolutely right the
11:08
khashoggi thing I don’t think either
11:10
party and or either sides media really
11:12
wants to get into that all that deeply
11:14
how much is media shifting now like
11:17
you’ve obviously been a journalist for a
11:19
long time but how much are things
11:21
changing in the light of the Internet
11:24
well a lot and this is what I mean I
11:26
have a new book out now that’s really
11:27
about this right why the business
11:29
has changed what’s it called Hate Inc.
11:31
yeah it’s out now and it’s
11:36
really about how the press the business
11:38
model the press has changed I mean it’s
11:40
something that you talk about a lot you
11:41
hear you on your show all the time
11:43
talking about how news agencies are
11:46
always trying to push narratives on
11:48
people trying to get people wound up and
11:50
upset and that is a conscious business
11:53
strategy that we didn’t have maybe 30
11:56
years ago you know you think about
11:58
Walter Cronkite or what the news was
12:00
like back in the day you had the whole
12:02
family sitting around the table and
12:04
everybody watched it was sort of a
12:06
unifying experience to watch the news
12:08
hmm now you have news for the crazy
12:11
right-wing uncle and then you have news
12:12
for the kid in the Che shirt and they’re
12:15
different channels and they’re trying to
12:16
wind these these people up you know to
12:20
get them upset constantly and stay there
12:22
and a lot of that has to do with the
12:23
Internet because before the internet
12:27
news companies had like a basically free
12:29
way of making money they dominated
12:31
distribution the newspaper was the only
12:33
thing in town that had a you know if you
12:35
wanted to get a want ad it had to be
12:37
through the local newspaper
12:38
now with the internet the internet is
12:40
the distribution system anybody has
12:43
access to it not just the local
12:44
newspaper and so there the easy money is
12:48
gone and we have to chase clicks more
12:49
than we ever had to before we have
12:52
to chase eyeballs more than we have to
12:53
so we’ve had to build new money-making
strategies and a lot of it has to do
with just sort of monetizing anger and
division and all these things and we
13:01
just didn’t do that before and it’s
13:03
had a profound effect on
13:05
the media as a writer have you
13:07
personally experienced this sort of the
13:10
influence where people have tried to
13:12
lean you in the direction of clickbait
13:14
or perhaps maybe alter titles that make
13:18
them a little bit disingenuous in order
13:20
to get people excited I mean
13:21
you know I my editors at Rolling Stone
13:24
are pretty good and they give me a
13:25
lot of leeway to kind of explore
13:27
whatever I want to explore but I
13:29
definitely feel a lot of pressure that I
13:30
didn’t feel before in the business
13:33
because especially in the Trump era and
13:36
and you know I’ve written a lot about
13:37
the Russia story right but you know
13:39
that’s an example of one side’s media
13:43
has one take on it and another side’s
13:45
media has another take on it and if you
13:47
are just the journalist and you and you
13:49
want to just sort of report the facts
13:52
you feel a lot of pressure to fit the
13:53
facts into a narrative that your
13:55
audience is gonna like and I had a lot
13:57
of problem with the Russia story because
13:58
I thought you know I don’t like Donald
14:00
Trump but I’m like I don’t I don’t think
14:02
this guy’s James Bond consorting with
14:04
Russian spies I think he’s corrupt in
14:07
other ways and there was a lot of
14:09
blowback on my side of the business
14:11
because you know people in
14:15
quote-unquote liberal media
14:17
there’s just a lot of pressure
14:18
to have everybody fit into a certain
14:20
narrative and I think that’s really
14:21
unhealthy for the business yeah
14:23
very unhealthy right it’s you know
14:25
because as soon as people can be
14:26
manipulated to conform to that
14:27
narrative then all sorts of stories can
14:29
be shifted oh yeah yeah absolutely and
14:32
and you the the job used to be about
14:35
challenging your audience every now and
14:37
then right like if you think a certain
14:38
thing is true well it’s our job to give
14:40
you the bad news and say that you’re
14:41
wrong about that that used to be what
14:43
the job was to be a journalist now it’s
14:45
the opposite now we have an audience
14:48
we’re gonna tell you exactly what you
14:49
want to hear and we’re
14:51
gonna reinforce what you think and
14:53
that’s very unhealthy a great example of
14:56
this was in the summer of 2016 I was
15:00
covering the campaign I started to hear
15:03
reporters talking about how they didn’t
15:06
want to report poll numbers that showed
15:08
the race was close they thought that
15:10
that was gonna hurt Hillary all right
15:12
like we said in other words we had
15:13
information that the race was close and
15:15
we’re not telling this to audiences
15:17
because
15:18
they wanted to hear that it was gonna be
15:20
a blowout for Hillary right and that
15:23
didn’t help Hillary it didn’t help the
15:25
Democrats to not warn people about this
15:27
right but it was just because if you
15:30
turned on MSNBC or CNN and you heard
15:33
that Trump was within five points or
15:35
whatever it was that was gonna be a
15:37
bummer for that audience so we stayed
15:39
away from it and you know this is the
15:42
kind of thing it’s it’s not politically
15:44
beneficial to anybody it’s just we’re
15:46
just trying to keep people glued to the
15:48
set by telling them what they want to
15:50
hear and that’s not the news that’s not
15:51
that’s not our job you know and it
15:53
drives me crazy
15:55
yeah it should drive you crazy what
15:57
you said about journalism it used
16:00
to be something where you’re challenging
16:02
your reader you’re giving them
16:04
this reality that may be uncomfortable
16:06
but it’s it’s educational and expands
16:09
their view of the world where do
16:11
they get that now they don’t that’s the
16:13
whole problem
16:14
like you can predict exactly
16:17
what each news organization’s
16:21
take is going to be on any issue
16:23
just
16:24
to take an example when
16:28
the business about the ISIS leader
16:30
al-Baghdadi
16:31
being killed hit the news
16:34
instantaneously you knew that the New
16:36
York Times CNN and the Washington Post
16:39
that they were going to write a whole
16:40
bunch of stories about how Trump was
16:42
overplaying the significance of it that
16:44
he you know that he was telling lies
16:48
about it you
16:50
knew they were gonna make the entire
16:51
thing about Trump and then meanwhile Fox
16:54
had a completely different spin about
16:55
how heroic it was but but news audiences
16:58
didn’t have anywhere to go to just
16:59
simply hear who was this person
17:01
why was he important what do the
17:03
people in the region think
17:05
you know what kind of what is this gonna
17:07
mean going forward is it actually
17:09
gonna have any impact you know are we
17:13
gonna have to continually you know is
17:16
there gonna be a new person like this
17:17
every time are we actually
17:19
accomplishing it you don’t get that
17:21
anywhere all you get is Trump is a
17:23
shithead on one side and Trump is a hero
17:25
on the other side that’s that’s not the
17:27
news you know yeah and but the thing is
17:29
it’s like
17:31
the business aspect of it is so weird
17:33
like you have your guys like Hannity or
17:35
you can absolutely predict what that
17:37
guy’s gonna say every single time you
17:38
know what side he’s on and he’s blatant
17:41
about it mhm and when you see someone
17:45
like that you go okay well this is okay
17:47
where this is this is peak bullshit
17:48
right so where where do we go where I
17:51
see both sides where’s the where’s the
17:53
where’s the middle ground where someone
17:55
goes well this is true but you gotta say
17:57
this is honest too and this is this is
17:59
what’s going on over on this side and
18:00
the Republicans have a point here and
18:02
you don’t you don’t
18:04
there’s no mainstream media place where
18:07
you can go for that right now
18:08
no there isn’t and that’s I mean
18:10
this is one of the reasons why shows
18:12
like yours are so popular I mean I think
18:14
there’s a complete loss of trust
18:17
people feel like they’re not being
18:18
honest with them all right and they’re
18:20
not being straight and you know they
18:22
they come to people like you and and a
18:25
lot of other people of independent folks
18:27
who aren’t like the quote-unquote
18:29
mainstream media because it’s not
18:34
really thought it’s not reporting it’s
18:36
not anything if you can predict a
18:37
hundred percent what a person’s going to
18:39
say that’s not thinking that’s not
18:41
reporting that’s not it’s just marketing
18:42
for someone like me that’s so disturbing I’m
18:44
a fuckin comedian and a Cagefighting
18:46
commentator when people are coming to me
18:48
like this is this is the source where
18:51
you go for unbiased representations of
18:53
what’s going on the world that’s crazy
18:55
well I mean I saw your interview with
18:58
Bari Weiss right and you just did the
19:01
simple basics you didn’t go to journalism
19:02
school right no no so she said something
19:06
about how you know oh she’s an Assad
19:09
toady and you said what does that mean
19:12
you just ask the simple basic questions
19:14
right what does that mean where is that
19:16
coming from how do you know that you
19:18
know yeah like journalism isn’t brain
19:21
surgery that’s all it is is asking
19:22
the simple questions that sort of pop to
19:24
mind when you when you’re in a situation
19:26
like where did this happen how do we
19:28
know that that’s true and but there’s a
19:32
whole generation of people in the press
19:33
now who just simply do not do that go
19:36
through the process of just asking
19:37
simple questions like how do I know
19:39
that’s true like after each story you
19:41
report you’re supposed to kind of like
19:43
wipe your memory clean and start over
19:45
so just because somebody was bad the
19:47
last time you covered them doesn’t mean
19:48
that they’re necessarily going to be the
19:50
bad guy this time you cover them all
19:52
right you have to continually test your
19:54
assumptions and ask yourself is this
19:57
true is that true is this true how do we
19:59
know this and we’ve just stopped doing
20:02
that it’s just the mantra of
20:04
pre-written
20:06
takes on things and it’s it’s really
20:09
really bad and you can see why audiences
20:12
are fleeing from this stuff they just
20:14
don’t have the impact they used to well
20:16
it’s really interesting this a lot of
20:17
this is this unpredicted consequence of
20:20
having these open platforms like
20:22
Facebook and like where people are
20:24
getting their news and then the
20:25
algorithm sort of directs them towards
20:28
things that are going to piss them off
20:30
which I don’t even think necessarily was
20:33
initially the plan I think the plan is
20:35
to accelerate engagement right so they
20:37
find out what you’re engaging
20:40
with what stories you’re engaging with
20:41
and then they give you more of that like
20:44
my friend Ari Shaffir actually tried
20:47
this out and what he did was he went on
20:50
YouTube and only looked at puppy videos and
20:54
that’s all he looked at for like weeks
20:56
and then YouTube only started
20:58
recommending puppy videos to him so it’s
21:01
not necessarily that Facebook wants you
21:04
to be outraged but that when you are
21:06
outraged whether it’s over abortion or
21:08
war whatever the subject is you’re going
21:10
to engage more and their algorithm
21:12
favors you engaging more so if you’re
21:14
engaging more about something very
21:16
positive you know if you’re all about
21:17
yoga and meditation your algorithm would
21:20
probably favor yoga and meditation
21:22
because those are the things that you
21:23
engage with but it’s natural for people
21:26
to be pissed off sure to look for things
21:29
that are annoying especially if you’re
21:30
done working and you’re like kinda this
21:31
world sucks what’s going on that sucks
21:33
worse and then you go to your Facebook
21:35
and oh Jesus look at this goddamn border
21:37
crisis right oh Jesus look at this wall
21:39
fucking here’s the problem with these
21:41
goddamn liberals they don’t know sure and
21:42
you engage and then that’s your life and
21:46
then it’s saying oh I know how to
21:47
get him all fired up I’m gonna fucking
21:49
send him some abortion stories whoa
21:51
right and then that’s your feed right
21:53
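The loop they’re describing, engagement in, more of the same out, can be sketched in a few lines. This is a hypothetical toy ranker, not any platform’s actual algorithm; the topic labels and engagement counts are invented for illustration:

```python
from collections import defaultdict

def rank_feed(candidates, engagement_counts):
    """Order candidate posts by how much the user engaged with each topic."""
    return sorted(candidates,
                  key=lambda post: engagement_counts[post["topic"]],
                  reverse=True)

# Invented history: this user clicks outrage far more than puppies or yoga.
engagement = defaultdict(int, {"outrage": 9, "puppies": 2, "yoga": 1})
feed = rank_feed([{"id": 1, "topic": "yoga"},
                  {"id": 2, "topic": "outrage"},
                  {"id": 3, "topic": "puppies"}],
                 engagement)
print([p["topic"] for p in feed])  # → ['outrage', 'puppies', 'yoga']
```

The point of the sketch is that nothing in the ranker cares whether the topic is puppies or outrage; it only amplifies whatever already drew clicks, which is exactly the Ari Shaffir experiment in miniature.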
yeah exactly but the but there’s so many
21:55
economic incentives that go in there
21:57
right they know the
21:58
more that you engage the longer that
22:01
you’re on right the more ads yes you can
22:03
you’re gonna see yeah right so that same
22:05
dynamic that Facebook and the social
22:08
media companies figured out
22:10
which is that if you keep feeding
22:12
somebody something that you
22:14
know has been proven to spin that person
22:16
up and get them wound up that they’re
22:18
gonna they’re gonna come back for more
22:20
of it and they’re gonna keep coming back
22:22
and actually you can expand their desire
22:24
just to see that stuff by by making them
22:27
sort of more angry overall and they will
22:31
they will come back and they will spend
22:33
more and more and more time well the
22:34
news companies figured out the same
22:35
thing and they’re just they’re just
22:36
funneling stuff at you that they know
22:39
you’re gonna you’re gonna just be in an
22:42
endless cycle of sort of an impotent
22:43
mute rage all the time but it’s kind of
22:46
addicting you know and they know that
22:48
and in there and it’s it’s sort of like
22:50
the tobacco companies they know
22:51
it’s a product that’s bad for you
22:53
and they just keep giving it to you
22:55
because you know it makes money for them
22:56
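The attention-for-ads trade discussed throughout here is, at bottom, simple arithmetic: hours of attention times ad slots per hour times revenue per impression. A hypothetical back-of-the-envelope sketch, with every number invented:

```python
def ad_revenue(hours, ads_per_hour, revenue_per_ad):
    # Attention traded for ad slots: more hours watched, more impressions sold.
    return hours * ads_per_hour * revenue_per_ad

# A viewer glued to the set 4 hours a day, 15 ad slots an hour,
# at a made-up $0.002 per impression:
daily = ad_revenue(4, 15, 0.002)
print(f"${daily:.2f} per viewer per day")  # → $0.12 per viewer per day
```

Small per-viewer numbers multiplied across millions of viewers are why keeping people angry and watching is a business strategy rather than an accident.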
yeah and the thing about it is
23:00
all of it is about ads it’s all tied to how many
23:04
clicks they get on ads if they just said
23:06
you can have a social media company but
23:08
you can’t have ads there’s a new federal
23:11
law no more ads on Facebook no more ads
23:13
on YouTube no more ads on Twitter no more
23:16
ads on Instagram good luck right yeah
23:19
they’d all collapse yeah but that
23:22
seems to be what it is it’s like they
23:24
figured out that your data is worth a
23:26
tremendous amount of money and the way
23:29
they can utilize that money is to sell
23:31
advertising mm-hmm yeah no they’ve got
23:33
you coming and going because
23:35
they’re not only selling you ads but
23:38
they’re also collecting the information
23:39
about your habits which they can then
23:41
sell again yeah so it’s a it’s a dual
23:43
revenue stream you know the media
23:46
companies are basically just
23:49
consumer businesses where
23:51
they’re trading attention for ad space
23:53
right so if they can get you to watch
23:55
four hours of television a day they have
23:57
that many ad slots that they can show
23:59
you and they know how much money they’re
24:00
gonna make you know but the the social
24:02
media companies get it two ways they’re
24:04
they they get it by you know attracting
24:06
your eyeballs and then also selling
24:08
selling your habits to the other the
24:10
next set of advertisers
24:11
which you know is very insidious but
24:13
what’s interesting about this is that
24:14
most people don’t think about this as a
24:17
consumer business right like Americans
24:19
these days are very conscious of like
24:20
what they put in their bodies
24:21
you know they won’t eat too many candy
24:23
bars depending on who they are right but
24:25
people at least look at what the
24:26
calories are but they don’t think about
24:28
the news that way or social media like
24:31
that what it puts in their brains and
24:32
it’s also a consumer product yeah it
24:34
really is I’ve gone over that many times
24:37
with people that that’s a diet this is
24:39
your diet you have a mental diet as well
24:41
as you have a physical like food diet
24:43
absolutely of an information diet and a
24:46
lot of people are just eating shit with
24:48
their brain it’s the worst kind of junk
24:50
food it’s like it’s like a cigarette
24:52
sandwich this stuff yeah it’s so fucking
24:54
bad and it’s getting worse it is it is
24:56
getting worse and it’s what’s weird is
24:58
that this is a ten-year-old problem and
25:00
no one saw it coming and it’s kind of
25:02
overtaking politics it’s overtaking the
25:04
social discourse everybody’s wrapped up
25:07
in social media conversations they carry
25:09
them on over to the dinner table and it
25:11
gets people in arguments at work and all
25:14
this stuff no one saw coming
25:17
no one saw this outrage economy from
25:21
you know social media sites from things
25:23
like Facebook no one saw that no one no
25:25
one ever predicted that your data was
25:27
gonna be so valuable no one fucking saw
25:29
that I don’t think anybody I mean I
25:32
think some people in the tech business
25:33
probably saw early on the
25:35
potential for this but you know in terms
25:38
of other other businesses like the news
25:41
media and also politics I mean you have
25:43
to think about the impact of this on
25:45
politics has been enormous
25:47
you know I covered Donald Trump Trump
25:50
really was just all about whatever
25:52
you’re pissed off about I’m right there
25:54
with you you know and people are just
25:56
sort of pissed off about lots of things
25:58
these days because they’re doing this
25:59
all day long you know and if you if you
26:02
can if you can take advantage of that
26:05
then you’re gonna have a lot of success
26:06
and I think I think a lot of people
26:07
haven’t figured that out and some of
26:09
these things are real causes like people
26:11
are upset about real things but it’s
26:14
just yeah you’re absolutely right people
26:16
did not see this coming and they didn’t
26:18
prepare for it it’s just weird that it’s
26:19
one of the biggest sources of income online
26:22
and people didn’t see it coming I mean
26:24
Facebook is generating billions of
26:27
dollars and now yeah potentially
26:29
shifting global politics yeah and you
26:33
know the the whole issue of a couple of
26:36
companies like Facebook having control
26:39
over what you do and do not see is yeah
26:41
it’s an enormous problem that nobody
26:43
nobody really cares about I’ve tried to
26:45
write about it a few times I’ve written
26:47
a couple of features about
26:49
how serious a problem this is look
26:51
if you look at other countries like
26:53
Israel China there are a number of
26:58
countries where you’ve seen this
26:59
pattern of internet platforms
27:02
liaise with the government to decide
27:04
what people can and cannot see and
27:06
they’ll say well we don’t want to see
27:09
you know Palestinian protest movement so
27:11
we don’t want to see you know the
27:13
Venezuelan channel Telesur like we
27:16
want to take that off you think about
27:18
how that could end up happening in the
27:20
United States and it is already a little
27:21
bit happening it’s a little bit but it
27:23
seems to be happening only in the terms
27:25
of like leaning towards the progressive
27:27
side which people are okay with because
27:28
they think especially in the light of
27:30
Donald Trump being in office this is
27:32
acceptable censorship yeah but I
27:34
think they’re wrong about that yeah and
27:36
terribly
27:38
dangerous
27:38
it’s very short-sighted yes in and they
27:41
and I think there’s there’s also this
27:43
thing that happens with people where
27:46
they think well this is never gonna
27:48
happen to me you know like you can do
27:50
that bad thing to this person that I
27:52
don’t like but you know as long as it’s
27:53
never gonna happen to me exactly but
27:55
they’re wrong and history shows it
27:57
always does happen to you you know and
27:58
that’s so we’re giving these companies
28:00
an enormous amount of power to decide
28:02
all kinds of things what we look at what
28:06
what kind of political ideas we can be
28:08
exposed to you know I think it’s very
28:11
very dangerous
28:11
that biased interpretation of what
28:13
something is that was what people talked
28:16
about when the initial Patriot Act was
28:18
enacted when people were like hey this
28:21
might be fine with Obama in office right
28:23
maybe Obama is not going to enact some
28:28
of the worst clauses of this and use it
28:30
on people or there was the NDAA
28:34
right yeah
28:35
where some of the things were just
28:36
completely unconstitutional but don’t
28:38
worry we’re not gonna use those but
28:40
you’re setting these tools aside for
28:43
whatever fucking president we have like
28:45
what if we have a guy who out-trumps
28:46
Trump right I mean we never thought we’d
28:48
have a Trump right what if we have a
28:50
next-level guy post Trump what if
28:52
there’s some sort of catastrophe tragedy
28:56
attack something that really gets people
28:59
fired up and they vote in someone who
29:01
takes it up to another level and then he
29:03
has these tools and then he uses these
29:04
tools on his political enemies which is
29:06
entirely possible well I mean we’ve
29:08
already seen that a little bit I mean
29:10
people don’t want to bring this up I
29:11
mean i but you know a lot of the stories
29:14
that have come out about Trump they’re
29:15
coming from leaks of classified
29:17
information that are coming from those
29:19
war on terror programs that were
29:21
instituted after 9/11 yes this is FISA
29:23
the FISA Amendments Act the NSA
29:26
programs to collect data like they’re
29:27
they’re unmasking people like we have a
29:30
lot of evidence now that there was a
29:32
lawsuit that came out about a
29:33
month ago that showed that the FBI was
29:37
doing something like 60,000 searches a
29:39
month at one point where you
29:42
know they were asking the NSA for the
29:44
ability to unmask names and that that
29:45
sort of thing so we’re I mean these
29:49
tools are incredibly powerful they’re
29:51
incredibly dangerous but people thought
29:52
after 9/11 they were scared so you know
29:55
we want to protect ourselves so that’s
29:57
okay for now
29:58
you know well we’ll pull it back later
30:00
but you never do pull
30:02
it back right no and I mean it always
30:04
ends up being used by somebody in the
30:06
wrong way and I think we’re starting to
30:08
see that that’s going to be a problem
30:09
yeah I’m real concerned about places
30:13
like Google and Facebook altering the
30:16
path of free speech and and leaning
30:20
people in certain directions and
30:21
silencing people that have opposing
30:23
viewpoints and the fact that they think
30:26
that they’re doing this for good because
30:28
this is how they see the world and they
30:30
don’t understand that you have to let
30:32
these ideas play out in the marketplace
30:34
of free speech and free ideas if you
30:36
don’t do that if you don’t do that if
30:38
you don’t let people debate the merits
30:40
the pros the cons what’s wrong what’s
30:42
right if you don’t do that then you
30:44
don’t get real discourse if you don’t
30:45
get real discourse you’re essentially
30:47
you’ve got some sort
30:48
of an intellectual dictatorship going on
30:49
and because it’s a progressive
30:51
dictatorship you think it’s okay because
30:53
it’s people who want everybody to be
30:55
inclusive and you know I mean this is
30:58
this is a weird time for that it’s a
31:00
really weird time for that because as
31:01
you said people are so short-sighted
31:03
they don’t understand that these like
31:05
the First Amendment’s in place for a
31:07
very good reason and set up a long
31:09
fucking time ago because they did the
31:10
math they saw where it was going and
31:12
they were like look we have to have the
31:14
ability to express ourselves we have to
31:15
have the ability to freely express
31:18
thoughts and ideas and challenge people
31:20
that are in a position of power because
31:21
if we don’t we wind up exactly where we
31:24
came from mm-hmm yeah no and and courts
31:27
continually reaffirmed that idea that
31:30
the the the way to deal with bad speech
31:33
was with more speech yes and they did it
31:35
over and over and over again you know we
31:37
we the legal standard for speech you
31:41
know still I think remains that unless
31:45
it’s directly inciting violence you
31:46
can’t prohibit it like you can have
31:47
speech that incites violence generally
31:49
and even the the Supreme Court even
31:51
upheld that you can have speech that’s
31:53
that comes from you know material that
31:55
was stolen illegally that’s okay but we
31:58
had a very very high bar for prohibiting
32:00
speech always and you know the
32:03
libel cases the cases for
32:05
defamation you know that also
32:08
established a very very high standard
32:09
for punishing speech but now all of a
32:12
sudden people have a completely
32:13
different idea but it’s like you know
32:15
forget about the fact that this was a
32:17
fundamental concept in American society
32:19
for you know two hundred and thirty
32:21
years or whatever they just want to change
32:22
it you know without thinking about the
32:25
consequences well that’s where a guy
32:26
like Trump could be almost like it’s
32:30
almost like a Trojan horse in a way like
32:33
if you wanted to play 3d chess which you
32:35
would do you’d get a guy who’s just so
32:37
egregious and so outrageous and then so
32:40
many people oppose him get that guy
32:42
let him get into a position of power and
32:44
then sit back watch the outrage bubble
32:46
and then take advantage of that and
32:48
funnel people into certain directions I
32:50
mean I don’t think that’s what’s
32:51
happening but if I was super fucking
32:54
tinfoil-hatty that’s how I would go
32:57
about it I would say this is what you
32:58
want if you really want to change things
33:00
for your direction
33:01
put someone that opposes it that’s
33:04
disgusting and that way a
33:07
rational intelligent person is never
33:10
gonna side with him
33:11
so they’re gonna side with the people
33:12
that oppose him and then you could sneak
33:14
a lot of shit in that maybe they
33:15
wouldn’t agree with and any other
33:16
circumstance yeah Trump’s election sort
33:19
of like another 9/11 right like you know
33:21
9/11 happened all of a sudden people who
33:23
weren’t in favor of the government being
33:25
able to go through your library records
33:27
or listen to your phone calls and all of
33:29
a sudden they were like oh Jesus I’m so
33:30
freaked out like yeah fine when Trump
33:33
got elected all of a sudden people
33:34
suddenly had very different ideas about
33:36
speech and like they you know hey that
33:39
guy’s so bad you know that maybe we
33:42
should consider banning x y&z yeah and
33:46
I mean if he was conceived as a
33:52
way to discredit the First Amendment and
33:55
some other ideas that would that would
33:57
that would be a brilliant 3d chess move
33:59
yeah super sneaky yeah that’s like China
34:02
level many steps ahead exactly I mean
34:07
what do you think all this goes it seems
34:11
like I mean you obviously just wrote
34:13
a book about it but it seems like this
34:15
is accelerating and it doesn’t seem like
34:18
anyone’s taking a step back and hitting
34:20
the brakes or opting out it seems like
34:23
people are just ramping up the rhetoric
34:25
yeah I mean I think that the
34:27
divisiveness problem is going to get
34:29
worse before it gets better
34:32
the business model of the
34:35
media now is so entrenched that until
34:39
some of these these companies start
34:42
going out of business because they’re
34:43
doing you know they’re losing audience
34:46
because people don’t trust them anymore
34:48
the you know the news is going to keep
34:50
doing what it’s doing and that
34:52
model is gonna become normal for
34:55
news companies I think it or it already
34:57
basically is you know on both the left
34:59
and the right and in terms of you know
35:02
the Internet companies they’re
35:05
consolidating they’re getting more and
35:06
more power all the time and I
35:09
think we’ve already seen that people
35:11
have I think too much tolerance for
35:13
letting letting them make decisions
35:14
about
35:15
what we can and cannot see and I think
35:18
it’s gonna get worse before it gets
35:19
better I don’t know what do you think I
35:20
yeah that’s what I think I mean Facebook
35:22
Twitter all these platforms Twitter has some
35:24
of the most ridiculous reasons for
35:25
banning people one of them is
35:27
deadnaming oh yeah
35:28
so if you call Caitlyn Jenner Bruce
35:30
right okay I like you better when you
35:32
were Bruce banned for life right you
35:34
can’t even say I liked you better when
35:35
you were Bruce banned for life right
35:38
yeah and and and actually that that
35:40
what’s really interesting about that is
35:42
that’s a that’s a core concept that
35:46
we’ve changed completely like all the
35:48
different ways in the past that we
35:49
punished speech we punished the speech
35:52
not the person yes right so if you know
35:55
libel defamation all those things first
35:58
of all they were all done through the
35:59
courts so you had a way to fight back if
36:02
you thought you were unjustly accused of
36:04
having defamed somebody or libeled
36:05
somebody but if they found against you
36:08
the person who got something out of it
36:11
was the person who was directly harmed
36:12
right and again the courts judged that
36:14
and they you know it wasn’t like you
36:17
were banned for life from ever
36:19
speaking again right they just gave a
36:21
bunch of money to a person who might
36:23
have suffered some kind of career injury
36:24
or whatever it was because of that and
36:28
usually there was a retraction or it was
36:30
removed from the press or whatever it
36:31
was but it wasn’t like we were we were
36:33
saying we’re never gonna allow you to be
36:35
heard or seen from again
36:38
we were sort of encouraging
36:39
optimistically people to get better
36:41
right and yeah and to be different right
36:44
now and now we’re not doing that at all
36:45
now we’re just saying you know one strike
36:47
or two strikes whatever you’re gone and
36:49
it’s not like it’s a public thing so you
36:52
can’t sue over it right yeah well that’s
36:54
what’s crazy about it because it is a
36:56
public utility in a way yes it is even
36:59
Jack Dorsey from Twitter admitted as
37:02
much on the podcast and he wishes that
37:04
we would view it that way he’s actually
37:05
proposed two versions of Twitter a
37:08
Twitter with their standard censorship
37:11
in place and then a Wild West Twitter
37:13
mm-hmm but I’m like sign me up right
37:15
yeah on that Wild West Twitter right is
37:17
the problem with like things like gab
37:19
and I’ve gone there a few times and
37:22
watched it and I mean even Milo
37:24
Yiannopoulos has criticized is
37:26
that it’s just like so hate-filled
37:27
because it’s the place where you can go
37:29
and
37:29
fuckin say anything right so the only
37:31
people that it’s attracting are people
37:33
that just want to go there and just
37:34
fucking shoot off cannons of n-bombs and
37:36
call everybody a kike it’s crazy
37:39
I mean it’s and there’s real
37:41
communication there as well there’s
37:43
there’s plenty of that too but the sheer
37:47
number of people that go there just to
37:49
blow off steam because they can’t say
37:51
those things on Twitter or Facebook or
37:53
any other social media platform without
37:55
being banned because of that it becomes
37:57
a channel for it mm-hmm you know and
37:59
it’s like it doesn’t get a chance it
38:01
doesn’t get a chance the concept is
38:03
great the concept is if you’re not doing
38:05
anything illegal we’re not gonna stop
38:07
you’re not doxing anybody you’re not
38:08
threatening anybody’s life we’re not
38:09
gonna stop you go ahead but if you
38:11
do that and you’re the only one that
38:13
does that unfortunately everyone who
38:15
wants to just say fucked up shit goes there
38:18
right and you get a disproportionate
38:19
amount of fucked up shit
38:21
yeah and it’s directly because the fact
38:23
that these places like Twitter or
38:24
Facebook have censored and they make it
38:27
so you are scared to say whatever you
38:29
want to say mm-hmm and so
38:31
even if you have controversial ideas
38:33
that maybe some people would agree with
38:34
and some won’t you get banned for life for
38:37
just controversial ideas even
38:39
controversial ideas that are
38:40
scientifically and biologically factual
38:43
right the transgender issue like if you
38:46
say there’s a woman I brought her up a
38:48
million times Meghan Murphy yes a
38:51
man is never a woman she says they tell
38:53
her to take it down she takes a
38:55
screenshot of it puts that up takes it
38:57
down but takes a screenshot of the
38:59
initial tweet haha look at that banned
39:02
for life right a man is never a woman is
39:04
a fact that is a fact it’s a biological
39:06
fact now if you decide to become a woman
39:09
and we recognize you as a woman in
39:11
society well that’s just common courtesy
39:12
in my eyes like you have a person who
39:14
has this issue they feel like they were
39:16
born in the wrong body okay I get that
39:18
I’m cool with that
39:19
but to make it so that you’re banned
39:21
forever you can call someone a dumb fuck
39:24
an idiot a piece of shit your mother
39:27
should have swallowed you everybody’s
39:28
like yeah the Terms of Service
39:29
seem fine here everything’s good say a
39:32
man is never a woman gone for life right
39:35
yeah Caitlyn Jenner I liked you better
39:37
when you were Bruce done that’s it yeah no
39:40
and and it’s crazy and obviously
39:42
people see that and they just
39:44
get madder and it seems
39:46
illegitimate
39:47
it makes people very very resentful in
39:50
ways that they wouldn’t be otherwise it
39:52
makes there’s no pathway there’s no
39:54
there’s no other thing right there’s no
39:56
free speech platform that’s universally
39:59
accepted like these ones like I said
40:02
like gab or there’s a couple other ones
40:03
out there but no one’s using
40:06
them yeah it’s a very small percentage
40:07
of the people in comparison to something
40:09
like Twitter which is enormous right and
40:11
so because people don’t want to be
40:13
kicked off the platform they’re
40:14
radically changing there is no sense
40:18
right and we’re seeing this a lot also
40:19
with political ideas to like you know
40:22
you know I have a podcast Useful
40:24
Idiots it’s called right we’re like we
40:26
try to talk to people who are kind of
40:28
excluded from mainstream media because
40:30
that’s happening a lot now right like if
40:32
you have the wrong idea about anything
40:35
whether it’s Russia gate or the
40:38
israel-palestine conflict or Syria or
40:40
whatever it is you’ll you will suddenly
40:44
be sort of labeled I mean with Tulsi
40:46
Gabbard for instance they call her an
40:47
assadist right like once you get stuck
40:50
with the term Assadist on Twitter nobody
40:52
wants to associate with you no one
40:54
wants to defend you right they all kind
40:56
of back away and you’re suddenly
40:59
like the kid with lice and people don’t
41:02
want that to happen to them so they stop
41:04
saying X Y & Z yeah right and and they
41:06
just sort of go with with the flow will
41:08
go with the crowd and it causes this
41:10
sort of you know uniform conformist
41:16
discourse that isn’t really about
41:18
anything right people are afraid to talk
41:20
which is crazy yeah right well you’re
41:23
not supposed to talk to someone I
41:25
experience this all the time this
41:27
idea of giving someone a platform like
41:29
look if I have someone on like a ben
41:31
shapiro or something like that you
41:32
shouldn’t give that guy a platform well
41:34
he’s already got a platform should
41:36
wouldn’t be better if I just talk to him
41:37
and find out what his ideas are and ask
41:40
him about those ideas like we had a very
41:41
bizarre conversation about gay people
41:43
where it was basically a full-on
41:46
biblical religious interpretation of gay
41:49
people which to me is always strange
41:52
like okay how do you stand on shellfish
41:54
you know are you
41:56
just as strong on shrimp as you are on
41:59
gay guys right like why is it gay guys
42:02
it’s like the Bible’s pretty clear
42:05
on a bunch of different things that
42:07
don’t seem to fire people up the way
42:10
homosexuality does like why why do you
42:13
care if you had a friend that was eating
42:14
shrimp would you go to his house if he had
42:16
shrimp cocktail no but you wouldn’t go
42:18
to a friend’s house if he was having a
42:20
gay marriage mm-hmm so you won’t
42:23
celebrate gay marriage but you don’t
42:25
mind a guy who’s got a fucking a
42:27
shellfish platter right out at a party
42:30
like that’s in the Bible man right
42:33
you’re not supposed to wear two
42:33
different kinds of cloth you know
42:36
there’s a bunch of shit
42:38
in the Bible where you’re like wow God was
42:40
wrong about that like how confident are
42:42
you right how confident are you that you
42:44
can interpret God’s Word so perfectly
42:46
that you’re like you let the lobster
42:48
slide but all that butt-fucking we got to
42:50
stop that you know like it’s really
42:52
weird but that’s the whole point for you
42:54
to challenge the idea yes yes but but
42:57
the prevailing view now is that even
43:00
having the discussion yes because you
43:03
have a platform I mean I read that thing
43:05
in the Atlantic you know where
43:06
they’re like you give people
43:09
I forget what the phrase was they were
43:11
saying something like I give
43:14
people too many chances too many chances
43:15
to people who had already forfeited the
43:17
right to have them or something
43:18
something along those lines but that was
43:20
silly yeah the guy gave up his hand when he
43:21
said about me that I’m inexhaustible but
43:24
that he likes naps right oh it’s about
43:27
you and now that’s what it is
43:30
you like naps okay so you don’t like
43:32
people that have energy I’m super sorry
43:34
but the the you know I thought that
43:36
piece was really interesting because
43:38
that the whole idea that there are
43:40
people who have forfeited the right to
43:42
communicate forever
43:44
well who decides that I mean it again
43:46
there’s this there’s this intellectual
43:48
snobbism that goes on and you know
43:52
frankly on my side of the media aisle
43:54
where well we’ll decide what what an
43:56
appropriate thought is what’s what’s
43:58
right thinking what’s wrong thinking you
44:01
know what who gets to have a platform
44:03
who doesn’t get to have a platform who
44:04
we who we’re gonna call a monster who
44:06
we’re not I mean I don’t understand
44:09
the arrogance where it comes
44:10
from to decide that some people you know
44:13
and I totally disagree with people like
44:14
you know Alex Jones or Shapiro or you
44:17
know on most things but I don’t think
44:20
that they should be wiped off the face
44:21
of the earth I mean I don’t know
44:23
well it’s interesting to challenge
44:24
people on these weird ideas and find out
44:26
how they come to them and and you will
44:28
get a lot of fence sitters that will
44:30
recognize the flaws in their thinking if
44:32
you let them talk because there’s a lot
44:34
of people that aren’t sure either way
44:35
maybe they haven’t invested a lot of
44:37
time investigating it maybe they really
44:39
don’t know what this guy stands for
44:41
maybe they just read a cartoonish
44:42
version of who he is and then you get to
44:44
hear them talking to go oh well I see
44:46
the flaw in his thinking or oh well he’s
44:49
right about some things and a lot of
44:50
people are right about some things
44:52
they’re wrong about things and they’re
44:54
right about things and the only way you
44:55
can discern that is you communicate with
44:58
them but as soon as you deplatform
44:59
people like forever you’re just gonna
45:01
make a bunch of angry people you’re just
45:03
gonna make a bunch of people that are
45:05
completely distrusting and you’re gonna
45:07
absolutely empower the opponents of your
45:10
ideas but like people that do get to
45:13
when when do they get a chance to have
45:14
their voice well when they vote so the
45:17
more you do this shit the more you
45:18
censor conservatives the more they’re
45:20
gonna vote against liberals this is just
45:22
a fact
45:23
mm-hmm there’s no getting around that
45:24
this is human nature yeah I mean I I
45:26
lived in the former Soviet Union you
45:30
know for 11 years and 100% if you lived
45:36
in Soviet Russia and something was
45:38
published by an official publisher
45:40
people thought it was basically bullshit
45:42
right but if it was in the samizdat if
45:45
it was in the privately circled stuff
45:47
that had been repressed and censored
45:48
people thought that was the coolest
45:50
thing in the world like that that was
45:51
the hot ticket right and you’re
45:53
automatically giving something cachet
45:56
and an added weight by censoring it I
46:01
mean this is just proof it’s just the
46:02
way it works it’s human nature if people
46:04
think that you don’t want them to see
46:05
something they’re gonna run through it
46:07
twice as hard you know so I just don’t
46:09
understand a lot of that instinct I
46:11
think people people have this idea that
46:13
it works that you know the
46:16
deplatforming works but you can’t
46:18
deplatform an idea you know you may be
46:20
able to do it to a person or to yes but
46:23
you eventually
46:23
you have to confront the idea you could
46:25
do it to a few people and it has been
46:27
successful which is one of the reasons
46:28
why people are so emboldened like they
46:30
have successfully deplatformed Milo
46:32
mm-hmm I mean they really have it’s very
46:34
hard to hear him talk anymore you don’t
46:36
he’s not in the public conversation the
46:39
way he used to be right because they
46:41
kicked him off of all these different
46:42
platforms and if you go into why they
46:45
kicked him off these different platforms
46:46
but even if you don’t agree with him and
46:48
I don’t on a lot of things like boy I
46:51
don’t agree with kicking him off those
46:52
platforms if you you listen to what he
46:54
got kicked off for it’s like man I don’t
46:56
know this this doesn’t seem like this
46:58
makes a lot of sense yeah no I mean the
47:01
same thing with Alex Jones
47:02
yeah Alex Alex Jones has said you know
47:05
he’s gone after me a couple of times in
47:07
ways that were pretty funny actually but
47:09
when he was you know kicked off the all
47:12
these platforms you know I wrote a piece
47:14
saying I think people are kind of doing
47:16
an end zone dance a little early on this
47:18
one you know because Jones is a
47:22
classic example of how the system
47:25
the way the system used to work they
47:26
would have punished him for being
47:29
libelous about the Sandy Hook thing
47:31
right because that that would sort of
47:33
fit the classic definition of what was
47:34
what prohibited speech was before but
47:37
anyway he would have lost probably
47:39
a lot and he still might in those
47:41
court cases but to remove him forever I
47:44
think you know it just
47:48
creates a new way of dealing with speech
47:51
that I think is very dangerous you know
47:53
right because the goalposts keep getting
47:54
moved right if you can ban him for that
47:57
then why don’t you ban me for repeating
47:59
the things that I said about Megan
48:01
Murphy right or ban because what I said
48:03
about Bruce Jenner banned this for that
48:06
I mean you it gets you get further and
48:08
further down the line you keep moving
48:09
these goalposts and next thing you know
48:11
you’re in a very rigid tightly
48:13
controlled area where you can
48:15
communicate and you’re suppressed and
48:17
that just it accelerates your desire to
48:21
step out of that boundary and it makes
48:23
you want to say things that maybe you
48:24
wouldn’t even have thought of before and
48:26
also logistically it’s an incredibly
48:28
it’s a it’s an insane thing to even
48:31
think about asking platforms to
48:34
rationally go through all this content I
48:36
talked to somebody who’s
48:37
a pretty high-ranking Facebook executive
48:38
after the Alex Jones thing and he said
48:41
think about what we used to do
48:42
just to keep porn off Facebook and we’re
48:46
dealing with what a couple of billion
48:48
items of content every single day we had
48:50
these really high-tech algorithms that
48:52
we designed to look for flesh tones and
48:54
that’s how the Vietnamese
48:57
running girl photo got taken off
48:59
Facebook because they like automatically
49:01
spotted a naked girl I know and they
49:04
took that down you know he’s like
49:07
the Facebook algo doesn’t know that’s
49:08
an icon of fucking journalism right like
49:10
it just knows it’s a naked girl so you
49:13
say you take that and now you’re gonna
49:15
ask Facebook to make decisions about
49:17
about ideas right like if it’s that hard
49:21
and that expensive for us to go through
49:23
and just just to keep child porn off of
49:27
Facebook think about how crazy it’s
49:29
gonna be when we when we start having
49:31
entry-level people deciding what is and
49:33
is not appropriate political content
49:35
yeah it’s it’s not only gonna be
49:37
impossible to enforce it’s it’s gonna
49:41
they’re gonna make a mess of it and they
49:43
will and they already are you know and I
49:44
think that’s what we’re seeing well
49:46
that’s why Twitter so weird because you
49:48
can get away with shit on Facebook you
49:50
can say things on Facebook like Facebook
49:52
doesn’t have a policy about dead naming
49:54
or Facebook doesn’t have a policy about
49:56
misgendering people but they do have a
49:59
porn policy
50:00
well now Twitter you can have porn right
50:04
I mean then I have to be very careful
50:06
when I give my phone to my kids
50:08
that make sure they don’t open up the
50:09
fucking Twitter app yeah because I
50:11
follow a lot of dirty girls and some of
50:12
them I mean they’re it’s just right
50:15
there there’s no warning bang right in
50:17
your face I mean it’s kind of crazy
50:18
right they have such an open policy when
50:22
it comes to sex which I’m happy they do
50:24
I’m happy not even that I want to see
50:26
porn but I’m happy that their attitude
50:28
is just fine it’s legal do it yeah you
50:32
don’t have to follow those people if you
50:33
don’t like it seems like it’s in the
50:35
American spirit to be I know but but it
50:38
all comes down to for me but but ya know
50:41
the policies are completely
50:43
inconsistent too with Twitter like
50:45
I’ve seen I mean I’ve talked to people
50:46
who have been removed from Twitter for
50:49
saying pretty
50:50
you know pretty borderline things right
50:53
like they’re you know basically pretty
50:55
mild insults or something that would be
50:57
threatening only if you really squinted
50:59
hard you know
51:00
there was a guy from the Ron Paul
51:01
Institute who got taken
51:03
down for instance because he was having
51:04
a fight with some you know guy who was I
51:07
think a Clinton fan I forget what it was
51:09
exactly but you’ll see behavior that’s
51:13
much worse from people of another
51:16
political ilk and they will not be
51:18
removed or they might be a smaller
51:20
profile person they won’t be removed so
51:22
and then what is that all about right
51:24
like if if it’s only a person who has
51:25
20,000 followers or higher we’re gonna
51:27
mean it’s just so you just can’t do it
51:30
there’s just too many layers and anyway
51:33
I’m against it just generally but just
51:35
in terms of logistics it doesn’t make
51:36
any sense I’m against it generally too
51:38
and when I talked to Jack and he was
51:39
explaining to me the problems with
51:41
trying to manage things at scale you
51:44
really kind of get a sense of it like oh
51:46
you guys are dealing with billions and
51:48
billions of humans using these things
51:50
right yeah yeah and and but they’re
51:53
already you know in many countries
51:56
around the world they have armies of
51:59
thousands of people who go through
52:01
content to try to flag this or that kind
52:03
of political content yeah in a niche
52:05
people yeah they have you know
52:07
Germany has I forget what
52:10
the term was they had this um some
52:11
really scary sort of authoritarian word
52:13
for like filtration centers or something
52:15
like that
52:16
you know the Chinese have have armies of
52:20
people I mean I did a story about
52:21
Facebook and how it was you know teaming
52:25
up with groups like the Atlantic
52:27
Council here in the United States
52:28
remember a couple of years ago the
52:31
Senate called in Twitter Facebook and
52:33
Google to Washington and asked them to
52:37
devise strategies for preventing the
52:39
sowing of discord you know so they
52:41
were basically asking them to come up
52:44
with strategies for filtering out fake
52:47
news and then also certain kinds of
52:49
offensive content but you know that is a
52:53
stepping stone to what we’ve we’ve seen
52:55
in other countries I think you know and
52:57
I think it’s really worrisome but but
52:59
nobody seems to care on our side of the
53:01
aisle which is which is very strange
53:02
to me as
53:04
well it’s a censorship
53:06
issue you know and it’s it’s a
53:09
short-sighted thing as you said before
53:10
it’s people and it’s not even there’s
53:14
people that do pretty egregious things
53:16
from the left like the Covington school
53:18
thing when people were saying we got to
53:21
dox these kids and give me their names
53:23
release their names these people are
53:25
still on Twitter to this day right
53:27
talking about kids that just happen to
53:28
have these make America great again hats
53:30
and I have a friend who used to live in
53:32
that area said like no you don’t get it
53:33
like there’s these stands these kids are
53:36
on the high school like field trip
53:38
there’s these stands we could buy these
53:39
hats everywhere these kids bought the
53:41
hats they’re they think they’re being
53:43
funny this guy plays the music and then
53:45
gets in their face you take a photo of it
53:47
it looks like this guy’s standing in
53:49
this Native American guy’s face but then
53:51
you see the whole video it’s no no the
53:53
Native American guy was playing his drum
53:54
walking towards him and then everybody
53:58
sort of the outrage cycle it’s
54:03
just so exhausting and the signaling
54:05
everyone’s signaling how virtuous they
54:07
are everyone’s signaling they’re on the
54:09
right side everyone’s signaling you know
54:11
I want names take these guys down like
54:14
you’re talking about sixteen year old
54:15
kids right it’s so fucking crazy and all
54:18
what is he guilty of smiling
54:20
is that what he’s guilty of
54:22
yeah no he’s got a MAGA hat on I mean
54:24
yeah it’s crazy and the signaling thing
54:27
is crazy and you know for me
54:30
in the news business a lot of people
54:32
that I know went into
54:35
journalism precisely because we didn’t
54:37
want to talk about our political views
54:39
like the whole point of the job is like
54:41
you know we’re just gonna tell you what
54:43
the facts are like not gonna tell you
54:44
about what I’m all about you can’t do
54:46
that anymore everything’s editorialized
54:48
everything is about editorializing and
54:51
signaling this is like what you’re
54:52
saying you’re telling people what your
54:55
stance is on things and that’s that’s
54:58
the opposite of what the job used to be
54:59
and this is again one of the things I’ve
55:01
been trying to focus on is that you know
55:04
what’s exactly what you’re talking about
55:05
people used to go to the news because
55:07
they wanted to find out what happened in
55:08
the world and they can’t do it anymore
55:10
because everything that you turn on
55:11
every kind of content is just
55:14
editorialized content where people are
55:16
sort of telling you
55:17
where they stand on things and you know
55:19
I don’t want to know that I want to
55:20
know what the information is yes it’s so
55:22
hard how does this get resolved because
55:24
we’re dealing with essentially a two
55:26
decade old problem right I mean give or
55:28
take before that before the this the
55:32
social media and before the internet and
55:34
websites this just wasn’t
55:37
what it was you could count on the New
55:38
York Times to give you an unbiased
55:41
version of what’s going on in the world
55:43
I don’t necessarily know that’s true
55:45
anymore no no the Times has kind of
55:47
gone over to this model as well and
55:49
they’re super woke they’ve
55:50
struggled with it there
55:52
was an editorial and I wrote about
55:54
this in the book in the
55:56
summer of 2016 this guy Jim Rutenberg
55:59
wrote this piece that said Trump
56:00
is testing the norms of objectivity that
56:02
was the name of the piece and basically
56:05
what he said is Trump is so bad that we
56:06
have to like rethink what
56:08
objectivity means we have to not only be
56:11
true but true to history’s judgment he
56:14
said and we have to have copious
56:16
coverage and quote aggressive
56:18
coverage so we’re gonna cover Trump a
56:19
lot we’re gonna cover him aggressively
56:21
and we’re gonna show you we’re gonna
56:23
take a stand on this issue rather than
56:26
just tell you what happened right so
56:28
rather than doing the traditional New
56:29
York Times thing of just the facts we’ll
56:32
tell you you sort it out you figure it out
56:34
we’re gonna tell you you know kind of
56:36
what your stance should be and
56:39
you know I think where does where do we
56:41
go from here how does it get resolved I
56:43
don’t know because you know unless the
56:46
the financial incentives change there
56:49
they’re not going to change you know the
56:51
business used to be back when you’re
56:54
talking about with New York Times and
56:55
then there were three networks and they
56:57
were all trying to get the whole
56:58
audience right so they were they were
57:00
they were doing that kind of neutral
57:02
fact-finding mission and it was working
57:04
for them financially now they can’t do
57:06
that because of the internet it’s it’s
57:07
you’re hunting for audiences in little
57:09
groups yeah and they’re just giving you
57:11
hyper politicized stuff because that’s
57:12
the only way they can make money I don’t
57:14
know how we change it I don’t know how
57:15
we go you know how we reverse it
57:18
it’s a problem it’s so interesting
57:20
though because I mean if you looked at
57:24
human interactions and if you looked at
57:27
you know dispensing news and information
57:30
and you follow trends from like the 30s
57:33
to the 40s to the 50s to the 60s to 70s
57:37
you’d be like oh well people are getting
57:38
better at this
57:39
people are getting better whoa whoa what the
57:42
fuck is going on now everything is off
57:45
the rails yes two camps barking at each
57:47
other blatant misinformation on both
57:50
sides blatant distortions of the truth
57:52
blatant editorializing of facts and
57:55
you’re like hey what happened guys yeah
57:57
no it’s it’s it’s crazy and and not not
58:00
that the news didn’t have distortions
58:03
before like you think about you know we
58:07
covered up all sorts of things you know
58:09
massacres in Cambodia the secret bombing
58:11
you know the use of Agent Orange like
58:14
it’s definitely it just didn’t appear in
58:15
the news to the degree it should now
58:18
though you turn on either MSNBC or Fox
58:22
and you’re right you’ll you’ll find
58:25
something that’s just totally full of
58:26
shit within five minutes usually and
58:28
that did not used to be the case you
58:32
know I think individual reporters used
58:35
to take a lot of pride in their work
58:36
you know and it’s different now now now
58:39
when you make mistakes in the business
58:41
you don’t you don’t get bounced out of
58:43
the business in the way you used to and
58:45
that’s that’s really strange like only
58:47
plagiarism still bounces you
58:50
a plagiarism case is pretty yeah
58:52
that’s usually fatal right you’re not
58:54
gonna usually recover from that I mean
58:55
some people have kind of had near
58:57
misses with that and they you
59:00
know I’m not gonna yes but but um but no
59:04
but you think about people who got
59:05
stories like the WMD thing wrong right
59:08
not only do they not get bounced out of
59:10
the business they all got promoted you
59:12
know they’re like the editors of major
59:13
magazines now or you know and and so
59:16
what does that tell people in the
59:18
business well it tells you you know if
59:20
you screw up as long as you screw up
59:21
with a whole bunch of other people it’s
59:22
okay you know which is not good and and
59:25
we used to have a lot of pride about
59:26
that stuff in this business and that we
59:28
now we don’t anymore
59:29
you know and it it’s there isn’t the
59:33
shame connected with with screwing
59:34
something up that there used to be I
59:36
think there’s a real danger with in
59:38
terms of social media especially in not
59:42
complying with the Constitution
59:44
not complying with the First Amendment I
59:46
think there’s a real danger in that and
59:47
I don’t think we recognize that danger
59:49
because I don’t think we saw what social
59:51
media was until it was too late and then
59:53
by the time it was too late we had
59:55
already had these sort of standards in
59:58
place and the people that run it were
60:01
already getting away with enforcing
60:03
their own personal bias their
60:04
ideological bias and this is
60:08
when you’re in this position where
60:10
you go well how does that ever get
60:11
resolved they’re not going to resolve it
60:13
on their own they’re still making ass
60:14
loads of money what if the
60:16
government resolved it well if Trump
60:18
steps in and resolves it looks like he’s
60:19
trying to resolve it to save his own
60:21
political career or to you
60:23
know to help his supporters it’s like
60:26
yeah no and and and no matter what if
60:29
Trump does anything about it
60:30
automatically everyone’s gonna be
60:32
against it right right you know even
60:33
even if it’s um even if there’s some
60:36
sense in there somewhere people won’t
60:37
won’t won’t get behind it but you know I
60:40
do anything about it it’s gonna be a
60:41
correction time there’s gonna be a gap
60:44
time where it’s gonna be like that where
60:46
it’s just gonna flood with people that
60:48
are just like with this newfound freedom
60:51
they’re just gonna go and shit up the
60:53
town you know but I mean but how would
60:55
you how would you fix it now like that’s
60:58
something because it’s not only about
60:59
rules it’s also about culture like
61:01
people have already they’re in this
61:02
pattern of you know not saying the wrong
61:06
thing right and they don’t
61:08
I think we’re in a culture that
61:10
doesn’t even really know how to deal
61:13
with free speech if we actually had it
61:15
in the same way we used to you know no
61:16
one seems to have a forecast like no
61:19
one’s like well the storm is gonna last
61:20
about four years and then say there’s no
61:22
there’s no forecast no no one’s like wow
61:25
some fucking uncharted waters right
61:28
right but if you historically the
61:31
tendency is once you have a tool that
61:34
kind of can be used to keep people in
61:38
line and enforce compliance of ideas
61:41
and then it always ends up worsening and
61:44
becoming more and more dictatorial and
61:46
authoritarian yes again you go back to
61:48
the Soviet example like once they started
61:50
you know really exercising a lot of
61:52
control over the press and literature
61:54
and things like that it didn’t get
61:56
better you
61:57
it just continued becoming more of a you
62:00
know an entrenched thing so
62:02
that’s what I worry about I think we’re
62:04
headed more in that direction yeah
62:06
I think so too
62:07
what I’m really concerned with is on both
62:10
sides when people dig their heels in
62:12
ideologically the other side just gets
62:14
even more convinced they’re correct oh
62:15
yeah yeah and there’s no cross dialogue
62:20
of any kind not anymore
62:23
there and even now I mean it’s it’s
62:26
interesting you had you had Bernie
62:29
Sanders on your show and
62:31
Sanders is one of the few
62:33
politicians left who has this idea that
62:36
we should talk to everybody like there’s
62:38
there are no illegitimate audiences out
62:39
there you know and like you know
62:41
that’s my job as a politician is to try
62:43
to convince you of things but that’s not
62:45
normal in the Democratic Party anymore I
62:47
mean Elizabeth Warren you know has made
62:51
a big thing about not going on Fox and
62:53
about having certain people taken
62:55
off Twitter and yeah and and I think
62:58
that’s increasingly the the sort of line
63:02
of thought in mainstream Democratic
63:04
Party thought now is is that we’re just
63:06
gonna rule out whatever whatever that is
63:09
47% of the electorate we’re just not
63:10
gonna talk to them anymore
63:11
right right yeah I I don’t know how
63:14
that’s that can possibly be a successful
63:16
political strategy or what
63:18
the point is you know yeah no it
63:21
doesn’t make any sense I was reading
63:24
something where people are going after
63:25
Tulsi Gabbard for being on Tucker
63:27
Carlson she’s like I’ll talk to
63:29
everybody and I’m glad she does and by
63:32
the way it’s like it’s hard for her
63:33
because she’s kind of an outside
63:34
candidate it’s hard for her to get time
63:36
on these other networks and so they want
63:39
to punish her for being on Tucker
63:40
Carlson’s show and then they have this you
63:42
know reductionist view of who he is he’s
63:46
a white supremacist like to all she
63:48
supports white supremacists she goes on
63:49
a white supremacist show okay is that
63:51
what he is really what he is and he seems
63:54
to me a lot more than that there’s a
63:56
lot going on there right you guys are
63:58
fucking with life you know you’re
64:00
fucking with the reality of life and
64:03
you’re saying it in these sentences
64:05
you’re printing it out in these
64:06
paragraphs
64:08
and you’re sending it out there
64:09
irresponsibly and it’s just really
64:11
strange that people don’t understand the
64:13
repercussions of that yeah that’s
64:15
something we talked about on our podcast
64:16
all the time is that
64:18
it’s a catch-22 right like you
64:22
don’t invite somebody like Tulsi Gabbard
64:24
onto CNN or MSNBC so they’re kind
64:28
of excluded from the same platform the
64:29
other politicians get so they go to
64:31
other platforms
64:32
all right and then you say oh you went
64:34
on that platform so you’re illegitimate
64:35
yes you know what do you want them to do
64:37
like you know what they do the same
64:39
thing with people who go on RT for
64:40
instance right oh well you’re helping
64:42
the Russians because you went on RT well
64:44
that’s because you didn’t invite
64:46
them on anything I mean yeah people
64:48
are gonna try to talk to anybody they
64:49
can to spread their ideas and that that
64:52
that kind of propaganda thing is is
64:54
pretty constant now and the use of
64:57
terms like white supremacist
64:59
with Tucker Carlson I mean there there
65:01
are a million terms now that you use to
65:03
just kind of throw at people and what
65:05
they’re trying to do is create this ick
65:06
factor around people yeah right like
65:09
once someone gets a label
65:11
associated with them then nobody wants
65:14
to be associated with that person all
65:16
right and then they quickly kind of die out
65:18
of the public scene and that’s I think
65:20
that’s really bad too you know it’s it’s
65:22
like a it’s it’s just an
65:25
anti-intellectual way of dealing with
65:26
things and I and I think it’s it’s not
65:29
good it’s weird that it’s so prevalent
65:31
it’s weird that there’s so few
65:32
proponents of a more you know
65:36
open-minded way of thinking right yeah
65:38
and just to take Gabbard we had
65:42
Tulsi Gabbard on our show too and
65:44
immediately we got accused do you
65:46
love Assad right do you want to bomb
65:49
Syria do you want to keep murdering
65:50
Syrian children no you know she’s a
65:52
presidential candidate and we want to
65:54
talk to her want to hear what she has to say
65:55
but they immediately go to the
65:58
maximalist interpretation of everything
66:00
and then they’re what they’re basically
66:02
saying when they ask you those questions
66:04
are do you want to wear that label too
66:06
because she’s got it already
66:08
so if you have her on again you’re gonna
66:10
you’re gonna have that label and
66:11
people they see that you know and and so
66:14
you know people who have who don’t have
66:16
a big following and who are worried
66:19
about their careers
66:21
and about you know the money and
66:23
advertisers and stuff like that they
66:24
they think twice about you know
66:26
interviewing that person the next time
66:27
yeah and that’s another way to suppress
66:29
speech exactly and again I don’t know
66:32
how you get out of it you know and I
66:36
mean I’ve experienced some blowback I
66:39
guess but it hasn’t worked yet
66:42
right you know I mean it’s not real it’s
66:45
just like it’s just words like okay well
66:48
but yeah and but you’re handling it
66:50
the right way I think people your
66:53
audience is rewarding you for for not
66:56
not bowing to it
66:58
you know and I think that more people if
67:01
they took that example and said I’m not
67:02
gonna listen to what the the pack says
67:05
about this I’m not gonna be afraid of
67:06
being called a name you know fuck that
67:09
I’m gonna talk to who I want to talk to
67:11
and I’m gonna go ahead you know and
67:13
explore whatever ideas I want to explore
67:15
then the this kind of stuff wouldn’t be
67:18
as effective so yeah so easy to do to
67:22
people it’s so easy for them to
67:23
deplatform people yes always and shadow
67:26
banning and all this other weird shit
67:27
that’s going on yeah they’re channeling
67:30
people and and pushing people into these
67:34
areas of their platforms that make them
67:38
less accessible and I know where it
67:40
comes from you know I was I was young
67:42
and politically active once you know you
67:44
you want to change the world you want to
67:46
make it a better place so you’re in
67:48
college and you don’t have any power you
67:50
don’t have any way to input to make
67:54
something into legislation you know what
67:56
I mean yeah so what do you do you you
67:59
know social media gives you the illusion
68:01
that you’re having an impact in the
68:03
world by you know maybe getting somebody
68:05
deplatformed or taken off Twitter or
68:07
something like that it feels like it’s
68:08
political action that yeah but it’s not
68:11
you know what I mean it’s it’s it’s
68:12
something that they that is open to
68:14
people to do but it’s not the same as
68:18
you know getting Congress 60 members
68:21
of the Senate to to raise taxes on a
68:24
corporation that’s been evading them for
68:26
20 years you know what I mean like
68:27
that’s that’s real action this you know
68:31
getting some random person taken off the
68:33
internet is just not change
68:34
you know but but people feel like it is
68:36
and they wanna they want to do the right
68:38
thing so I get it but no it’s it’s not
68:41
you know real political action I
68:43
don’t think no it’s fucking gross yeah
68:49
and it just there’s so much of
68:52
it and there’s so little logic also and
68:56
you this must be a personal thing for
68:59
you but it’s this is the unfunniest time
69:01
in American history
69:03
like yes and no because you’re rewarded for
69:07
stepping outside the box that’s true in
69:09
a big way mm-hmm like I mean Dave
69:12
Chappelle gets attacked but guess what
69:14
he also gets rewarded in a huge way when
69:17
he goes on stage now people go ape shit
69:20
that’s true and part of the reason why
69:22
they go fucking bonkers is because they
69:24
know that this guy doesn’t give a fuck
69:26
and he’s one of the rare ones who
69:28
doesn’t give a fuck so when he goes up
69:29
there you know if he thinks something
69:32
crazy about whatever it is whatever
69:34
protected group or whatever idea that
69:37
he’s not supposed to explore that’s not
69:39
gonna stop him at all he’s gonna tell
69:40
you exactly what he thinks about those
69:42
things regardless of all this woke
69:44
blowback he’s not he doesn’t care right
69:46
so because of that he’s rewarded even
69:48
more and same thing with Bill Burr
69:50
same thing with a lot of comics I
69:51
experienced it with my own jokes sure
69:53
the more controversial bits getting
69:55
people more fired up now they love it
69:57
because everyone’s smothered they’re
70:00
smothered by human resources and
70:02
smothered by office politics and you’re
70:04
smothered by social discourse
70:07
restrictions and people don’t feel like
70:10
they can express themselves any more this
70:12
is true and and people also
70:13
feel like they’re
70:15
being watched all the time yeah things
70:17
like they kind of can’t let it all hang
70:19
out anywhere right and and so that’s
70:21
yeah they they do feel incredibly like
70:24
repressed and under the gun yeah I think
70:26
that that’s that’s true yeah I just I
70:29
feel like it I mean I’m not a comic but
70:31
but I just imagine it must be a more
70:34
challenging environment it’s more
70:35
challenging but more rewarding to my
70:37
friend Ari said it best he said this is
70:39
a great time for comedy because comedies
70:41
dangerous again right that’s true yeah
70:43
that’s true yeah it kind of goes back
70:44
to like a Lenny Bruce era right
70:46
when you know you could kind of
70:48
completely freak people out with a
70:50
couple of words sure yeah
70:53
for good or bad or Pryor yeah well you
70:56
like you saw it with like Louis C.K.
70:58
right Louis C.K.’s under the microscope
71:00
now that joke that he made about
71:02
Parkland is absolutely a Louis C.K. joke
71:06
if you followed him throughout his
71:08
career what was the joke again i’m sorry
71:10
the joke was why am I listening to these
71:12
Parkland survivors why are you
71:13
interesting cuz you pushed some fat kid
71:16
in the way like see you’re laughing
71:18
right like that is a Louis C.K. joke he’s
71:23
saying something fucked up you’re not
71:24
supposed to say throughout his
71:26
goddamn career he’s done that that’s
71:28
what he’s always done but after the you know
71:32
jerking off in front of women all that
71:33
stuff and him coming out and admitting it
71:35
and then taking a bunch of time off now
71:37
he’s a target right now he does
71:38
something like that and they’re like oh
71:40
he’s all right now like no this is what
71:42
he’s always done right he’s always
71:44
taken this sort of contrarian outside
71:47
the box fucked up but hilarious take on
71:51
things and that bit
71:52
unfortunately because it was released by
71:54
someone who made a youtube video of it
71:56
he didn’t get a chance to he was gone
71:58
for ten months and he had only done a
71:59
couple sets when he was fleshing these
72:01
ideas out i guarantee you he would have
72:03
turned that idea into a brilliant bit
72:04
but he never got the chance because it
72:06
was just it was set out there in the
72:08
wild when it was a baby it was mauled
72:10
down by wolves it needed to be nurtured
72:13
right yeah I mean that’s what bits do
72:15
these bits they grow and they
72:17
develop and that was a controversial
72:19
idea that we’re supposed to think that
72:21
someone’s interesting just because they
72:22
survived a tragedy and his take is like
72:24
no no no no you’re not interesting right
72:27
you’re fucking boring you’re annoying
72:28
get off my get off my TV and a lot of us
72:31
have felt that way sure it’s just the way
72:34
he said it was easy to take and put
72:37
you know out of context put it in quotes
72:39
and turn him into an asshole well yeah
72:42
but that’s what comedy is right it’s
72:44
it’s taking the thoughts
72:46
that everybody has and vocalizing
72:49
that forbidden thing in a way
72:51
that people can kind of you know come
72:54
together over right i mean i think that
72:56
was a lot of what Richard Pryor’s
72:57
humor was about like he took a lot of
72:59
the sort of
73:00
uncomfortable race problems right and he
73:05
just kind of put them out there and both
73:07
white people and black people laughed at
73:09
it yeah right like together you know and
73:11
that that was what was good about it yes
73:12
but if you can’t if if people are afraid
73:17
to vocalize those things because they think
73:18
it’s gonna you know ruin their careers
73:20
you know that that makes it more
73:22
interesting right it’s more
73:24
high stakes but if you can navigate
73:25
those waters and get to the promised
73:28
land of the punchline it’s even more
73:29
rewarding right but you just have to
73:31
explain yourself better you have to have
73:33
better points you have to have you have
73:35
to have a better structure to your
73:38
material where the people
73:41
who may find your idea objectionable
73:44
you coax them like hold my hand
73:48
I’m gonna take you through the woods
73:50
we’re gonna be okay right follow me and
73:53
boom isn’t that funny
73:55
right right right you have to navigate
73:56
it skillfully and you have to navigate
73:58
it thoughtfully and you have to really
74:01
have a point you can’t have a half-assed
74:03
point but you can’t have a situation
74:06
where it’s fatal to be off by a little
74:08
bit I know like there was a writer that
74:11
I loved growing up a Soviet writer named
74:13
Isaac Babel Stalin ended up shooting him
74:17
but he gave a speech about I think it
74:20
was in 1936 you know to to a Soviet
74:23
writers collective and he said you know
74:25
people say that we don’t have as much
74:27
freedom as we used to but actually all
74:29
that the
74:31
Communist Party has done is
74:33
prevented us from writing badly the only
74:35
thing that’s outlawed now is writing
74:36
badly right and everybody laughed but he
74:39
was actually saying something pretty
74:41
serious which is that you can’t write
74:42
well unless you can you know screw up
74:44
too you know like on the way to to being
74:48
creative in a good way you have to miss
74:50
yes you know and if missing is not
74:52
allowed and there’s high punishment for
74:55
missing you’re not going to get art yeah
74:57
you’re not gonna get revolution you’re
74:59
not gonna get all these things well and
75:01
in comedy it’s particularly important
75:03
because you have to work it out in front
75:04
of people absolutely yeah no I used to
75:06
sit at a comedy club in
75:08
Manhattan when I was like in college where
75:11
you know they would try out their
75:13
material like on a Wednesday right you
75:16
know early and that was always the most
75:18
interesting time for me well like
75:19
they’re trying stuff out and a lot
75:21
of it wasn’t so good but you know it was
75:23
interesting right and you just can’t
75:26
have a situation where people feel like
75:28
you know one wrong word is gonna ruin
75:30
their career yeah you know yeah I don’t
75:32
know but there’s also people that are
75:34
wolves and they’re trying to take out
75:36
that little baby joke wandering through
75:38
the woods they they want that feeling of
75:41
being able to take someone down right
75:44
and that that’s you know that’s you’re
75:45
getting that now too which is just and
75:47
so now because of that there’s like Yondr
75:49
bags at the Improv where I’m performing
75:51
tonight they use Yondr bags you
75:53
have to put your phone in the bag when you
75:54
go in there so you can’t record things
75:56
Yondr bags yes it’s a company called
75:58
Yondr it’s just so strange it’s like
76:01
all the shows I did with Chappelle he
76:03
uses Yondr bags and the idea is to
76:06
prevent people from from filming and
76:08
recording and you know and then
76:10
eventually putting your stuff out there
76:11
uh-huh
76:12
well you know look I’m kind of all for
76:15
that I mean I’ve seen this with
76:17
politicians on the campaign trail like
76:19
they are so tight now in ways that they
76:21
used to not be well you saw the Donald
76:23
Trump thing Donald Trump Jr. where
76:26
they wanted him to do a Q&A
76:28
and he didn’t
76:30
want to do it so they booed him the
76:32
right wing people uh-huh booing him
76:34
they’re yelling out Q&A Q&A because they
76:37
want to be able to talk to him oh I see
76:40
and say something to him and these are
76:40
people that were like far-right
76:42
far-right people they just didn’t think
76:45
he was being right enough or he was
76:46
playing the game wrong or he wasn’t
76:47
wasn’t letting them complain to him
76:49
right right yeah yeah now that’s bad and
76:52
and and politicians are aware of that
76:56
now and they’re they’re constantly aware
76:57
that they’re on film everywhere and so
77:00
they’re you know a thousand percent less
77:02
interesting because yeah they’re there I
77:04
mean I remember covering the campaign in
77:07
2004 and I was I saw Dennis Kucinich
77:10
give a speech somewhere and he was going
77:13
from I think Maine to New Hampshire and
77:16
I said can I get a ride back to New
77:17
Hampshire he’s like yeah sure so you
77:18
know he takes me on the
77:20
and he like takes his shoes off he’s
77:22
like cracking jokes and everything and
77:24
like eating udon noodles or something
77:26
political candidates would not do that
77:28
now like they’d be afraid to be off the
77:30
record with you right you know right
77:32
right and and they’re afraid to be
77:33
around people and just behave like
77:35
people you know which is not good I
77:38
don’t think it’s the weirdest time ever
77:41
to be a politician because it’s it’s
77:42
basically you’ve got this one guy who
77:45
made it through being hugely flawed
77:49
mm-hmm and just going ah
77:51
the fucking locker room talk and it was
77:53
like well yeah it is locker room talk
77:54
yes and then it works and he gets
77:56
through and he wins and so you’ve got
77:58
him who seems like he’s so greasy like
78:01
nothing sticks to him and then you have
78:04
everyone else who’s terrified of any
78:06
slight misstep yeah totally and and you
78:10
can’t replicate the way Trump does this
78:12
you know Trump Trump is he was born this
78:14
way there’s like a thing going on in his
78:16
head like he is you know pathologically
78:19
driven to behave in a certain way and
78:20
he’s not gonna be cowed by the way you
78:24
know people are socially because he
78:26
just doesn’t think that way
78:27
no no he’s and but that’s no one else is
78:29
gonna behave like that what do you think
78:31
about him and speed what do you think
78:34
what do I think
78:34
does he take speed you mean yeah so did
78:37
you ever see his speech after Super
78:40
Tuesday yeah that’s the one where he was
78:43
slurring he was all ramped
78:46
up he was very I’d just say watch that
78:50
speech you know we’re not supposed to
78:51
draw conclusions about you know what
78:54
might be going on pharmaceutically with
78:55
somebody but I would say just watch
78:57
Donald Trump’s performance after the
78:59
results of the Super Tuesday roll in in
79:02
2016 let’s hear some of that first of all
79:06
Chris Christie is hilarious she’s
79:09
talking about wages I’ve been poor and
79:11
everything’s poor and everything’s doing
79:13
badly but we’re gonna make it she’s been
79:14
there for so long I mean if she hasn’t
79:17
straightened it out by now she’s not gonna
79:19
straighten it out in the next four years
79:21
it’s just gonna become worse and worse
79:22
she wants to make America whole again
79:24
and I’m trying to find what is
79:26
yeah I mean it’s just I’d have to go back
79:30
and look but yeah he went on
79:32
and on also that the Christie factor was
79:34
really funny with that because he’s
79:35
just sitting back there going
79:37
what am i doing what am i doing with my
79:39
life
79:40
look at his face literally you can see
79:42
his brain wander how the fuck did
79:44
this happen I was gonna be the man
79:46
like I was the goddamn president it was
79:50
gonna happen for me I could see it
79:52
happening I saw him in uh in Ames Iowa
79:56
basically standing alone in the park
79:58
waiting for people to try to shake his
80:00
hand you know yeah it was pretty bad
80:01
like you see that and but you do you
80:03
have a theory about Trump and speed yeah
80:05
yeah yeah I think he’s on some stuff
80:07
mm-hmm I think first of all I know so
80:09
many journalists that are on speed I
80:12
know so many people that are on adderall
80:13
and it’s very effective it gives you
80:16
confidence it gives you a delusional
80:18
perspective
80:19
well you get a delusional state of
80:21
confidence mm-hmm it makes people think
80:22
they can do anything it’s basically a
80:24
low-level meth it’s very similar to
80:27
methamphetamine chemically sure and
80:29
people are on it yeah it is tell me what
80:31
it’s like because I haven’t done it yeah
80:33
I mean I’ve done speed too I mean you
80:35
know all those all those drugs are yeah
80:37
they’re like baby baby speed basically
80:39
yeah and you’re absolutely right I think
80:42
it’s not good for a writer
80:44
because writing is one of these things
80:47
where one of the most important things
80:49
is being able to step back and ask
80:52
am I really full of shit here you
80:54
know are my jokes as funny as I think
80:55
they are like right if once that
80:57
mechanism starts to go wrong you know
81:01
you’re really lost yes as a
81:02
writer right because you’re just you’re
81:03
not in front of an audience you’re with
81:05
yourself in front of a computer so I
81:08
don’t think I don’t think speed is a
81:10
great drug I mean you get a lot of stuff
81:12
done so that’s that’s good but you
81:16
know I think there’s a lot of people
81:18
who are on it now and also a lot of it’s
81:20
because kids come up through school and
81:23
they’re on it just you know and they
81:25
they get used to it so I you know I have
81:27
kids I wouldn’t dream of giving
81:29
them any of those drugs you know I think
81:31
it’s crazy yeah I do too did you see you
81:33
saw the I’m sure you saw the Sudafed
81:34
picture right no what was that Trump
81:37
was sitting in his office eating
81:39
it was that famous photo where he’s like I
81:41
love Hispanics where he’s eating a taco
81:43
bowl at Trump Tower and behind him
81:46
there’s an open drawer and in that open
81:47
drawer are boxes of Sudafed and Sudafed
81:51
sort of yeah I mean it gives you a
81:54
low-level buzz and I mean this is
81:59
why I used to have to go to CVS to buy
82:02
this stuff you used to have to give your
82:03
driver’s I guess you still do they have
82:05
to take your driver’s license because
82:06
they want to make sure you’re not
82:07
cooking meth right buying like 10 boxes
82:09
of it at a time and cooking up a batch
82:11
yeah if you’re like in a you know holler
82:13
in Kentucky and you go in and get 20
82:16
boxes of Sudafed people pretty much
82:18
know what you’re doing there
82:19
yeah that’s really funny did he so he
82:21
had a bunch of Sudafed oh yes yeah in
82:23
his drawer and you know there was that one
82:26
reporter what was that guy’s name again
82:29
he wrote a whole series of
82:33
tweets which he eventually wound up
82:34
taking down by the way Jamie I can’t
82:35
find those fucking tweets he wrote a
82:39
series of tweets that there was a very
82:40
specific Duane Reade pharmacy where
82:42
Trump got amphetamines for something
82:46
that was in quotes called metabolic
82:48
disorder Kurt Eichenwald yeah Kurt yeah
82:51
1982 Trump started taking amphetamine
82:53
derivatives abused them was only supposed to
82:55
take it for 25 days stayed on it for
82:57
eight years really now is he full of
82:59
shit
83:00
so yeah Kurt Eichenwald is
83:03
interesting because he’s written some
83:05
really good books about finance he wrote
83:09
a book about Enron he wrote a book about
83:12
Prudential it was really really good
83:15
then when I was starting out writing
83:17
about Wall Street I was like wow these
83:18
books are really incredibly well
83:19
researched but he had some stuff in the
83:23
in 2016 where like that’s an example of
83:28
something as a reporter I see that
83:30
and I’m like where’s that coming from
83:31
you know and because you in journalism
83:35
you can’t really accuse somebody of
83:37
certain things unless it’s backed up to
83:39
the nth degree so right he had a couple
83:41
of things that I thought I you know
83:42
would be concerned about
83:43
he took a leap I don’t know I mean look
83:46
that’s what I’m saying he stepped outside of
83:47
the journalistic boundaries of what you
83:50
can absolutely prove and not prove
83:53
and took a leap and that’s why I think
83:54
he took down the Duane Reade pharmacy tweet he
83:56
didn’t take it down oh it’s still there
83:58
as well there wasn’t okay there it is
84:00
there was another thing about a well
84:03
he’s got the milligrams per day Wow
84:05
where’s this from I don’t know it
84:08
doesn’t show it or anything but I
84:10
believe he either saw a copy of it
84:11
from someone or talked to the doctor drug
84:14
was diethylpropion 75 milligrams a day
84:16
prescription filled Duane Reade on 57th
84:19
Street Manhattan not that I know things
84:20
so you know he’s got the doctor’s name too Dr.
84:24
Joseph Greenberg I countered with
84:27
medical records the White House admitted
84:30
to me only a short time for a diet that he
84:32
took it when he was not located and
84:34
that’s fun he says I countered with
84:36
medical records they cut me off Wow yeah
84:40
I mean you know one thing I will say is
84:42
that when you’re when you’re covering
84:43
stories sometimes you hear things and
84:46
and you know they’re pretty solid but
84:48
it’s not quite reportable
84:50
because the person won’t put their name
84:51
on it
84:52
or you know you’re not a hundred percent
84:55
sure that the document is a real
84:56
document maybe it’s a photocopy and that
84:59
that can be very very tough for
85:00
reporters cuz they know something’s true
85:02
but they can’t write it and
85:05
social media has eliminated a barrier
85:07
that we used to have we used to have to
85:09
go through editors and fact checkers and
85:11
now you know you’re on Twitter or you
85:13
can just kind of you know right right or
85:15
you can hint at something you know and I
85:18
think that’s that’s something you don’t
85:19
want to get into as a reporter too much
85:21
you know yeah that’s a weird use of
85:23
social media right it’s like sort of a
85:25
slippery escape from journalistic rules
85:29
yeah exactly yeah you know or you can
85:32
you can insinuate that somebody did X Y
85:35
& Z or you can you can use terms that
85:38
are a little bit sloppy like you know
85:41
again but it seems like they did admit
85:42
that he took that stuff though ah yes so
85:44
if you have the the white house
85:46
you know spokesperson saying that
85:47
he took it for a short time for a diet
85:49
then that’s a reportable story
85:50
right yeah yeah well I think when people
85:53
get into that shit it’s very hard for
85:55
them to get out of that shit mm-hmm
85:57
that’s the speed train and I’ve
85:59
seen many people hop on it it’s got a
86:01
lot of stops nobody seems to get off
86:04
yeah not with everything
86:06
intact right yeah no it’s uh that’s
86:08
that’s not good and he’s so old he
86:12
doesn’t exercise he eats fast food and
86:14
he gets so much fucking energy and I
86:16
mean people want to think he’s this
86:17
super person you know but maybe he’s on
86:20
speed maybe yeah I mean is he just gonna
86:23
collapse turn over and collapse one
86:25
can go a lot longer on speed than people
86:28
think maybe if you just do it the right
86:30
way but isn’t that kind of the way
86:31
history always works it’s like again not
86:34
to go back to the Russian thing but all
86:36
the various terrible leaders of Russia
86:38
like they all died of natural causes
86:40
when they were 85 right whereas you know
86:42
in a country where people get murdered
86:43
and die of industrial accidents and bad
86:45
health when they’re you know 30 all the
86:47
time right but the worst people in the
86:49
country make it to very old age and you
86:52
know and die and they’re alcoholics and
86:55
maybe that’s the thing right maybe maybe
86:56
you know he has the worst diet in the
87:00
world and maybe he’s on speed and maybe
87:02
it’s also your perception of how you
87:05
interface with the world maybe because
87:07
he’s not this introspective guy that’s
87:09
really worried about how people see him
87:10
and feel about him maybe he doesn’t feel
87:12
you know whatever whether it’s
87:14
sociopathy or whatever it is he doesn’t
87:16
feel the bad feelings they don’t get in
87:19
there yeah and so he doesn’t have the
87:21
the stress impact right right and that’s
87:23
the thing about speed apparently it
87:25
because of the fact that it makes you
87:27
feel delusional and it makes you feel
87:29
like you’re the fucking man like don’t
87:31
worry about what other people think
87:32
they’re losers who cares right right yeah you
87:37
know that was why not buy Greenland why not
87:39
buy Greenland yeah that came out of
87:41
what’s wrong with that we bought Alaska
87:43
well we leased Alaska yeah yeah we
87:45
were supposed to give it back but we we
87:47
didn’t it seems like Greenland would be
87:48
a good place to scoop up especially as
87:50
things get warmer right yeah exactly and
87:52
the fuckin tweet that he made when he
87:54
put up the Trump Tower I promise not to do
87:56
this and have a giant Trump Tower in the
87:58
middle of Greenland I was laughing my
88:00
ass off like love or hate that is
88:03
hilarious his trolling skills are
88:05
top-notch very good they’re they’re
88:07
fantastic oh he knows how to fuck with
88:09
people when he starts calling people
88:10
crazy or gives them a nickname like it’s
88:12
so good because like it sticks yes I
88:17
mean part of me wants
88:19
see a trump button race next year just
88:22
for that reason just because the
88:25
abuse will be unbelievable I mean not that
88:27
I’m encouraging that necessarily but
88:29
this is a spectacle it’s gonna be
88:31
unbelievable you can tell that he he is
88:33
salivating at the idea of it
88:35
Biden to me is like having a flashlight
88:39
with a dying battery and going for a
88:41
long hike in the woods is not going to
88:45
work out it’s not gonna make it
88:48
yeah no he’s he’s so faded he you know