WordPress can now turn blog posts into tweetstorms automatically

Earlier this year, WordPress.com introduced an easier way to post your Twitter threads, also known as tweetstorms, to your blog via the "unroll" option for Twitter embeds. Today, the company is addressing the flip side of tweetstorm publication: it's making it possible to turn an existing WordPress blog post into a tweetstorm with just a couple of clicks.

The new feature will allow you to tweet out every word of your post, as well as the accompanying images and videos, the company says. These will be automatically inserted into the thread where they belong alongside your text.

To use the tweetstorm feature, a WordPress user first clicks the Jetpack icon at the top right of the page, then connects their Twitter account to their WordPress site, if that hasn't been done already.

Jaron Lanier: How the Internet Failed and How to Recreate It

Transcript

[Music]

Welcome, everybody. I'm Nathaniel Deutsch, the director of the Humanities Institute here at UC Santa Cruz, and I want to welcome everyone here to see Jaron Lanier, who will be talking tonight. I also want to welcome everybody who is watching this on the live stream that we have running at the same time. We're very thankful for the support from the Peggy Downes Baskin Humanities Endowment for Interdisciplinary Ethics for supporting this lecture. We're also very thankful to the Andrew W. Mellon Foundation for supporting a year-long series of events that we will be hosting at the Humanities Institute on data and democracy, and Jaron Lanier's talk is going to be launching that series. In addition to the event tonight, we will be hosting in the coming year a series of other events, including Questions That Matter at the Kuumbwa Jazz Center on January 29th, which we invite all of you to; I know some of you have been to some of our past Questions That Matter events. There is also an event that we have been planning for a while, and that has become even more necessary because of the events of recent days, and that is a conversation on anti-Semitism and the internet, which we will be hosting on a date to be announced.

I want to thank... there are many people I could thank, but I'll leave out the names; I've already cleared it with them. I'm just going to thank their units: the Humanities Institute staff, which is as always amazing, and the staff of the Humanities Division's development office, which is also always amazing. So thank you, everyone, for all the work that went into this event. Tonight's program will include a lecture, followed by, again I think, some music, followed by a question-and-answer session and a book signing, and I'll be talking a little bit more about the book signing later. Questions and answers will be facilitated by note cards, and we have some ushers moving around the room; if you would like to ask a question, please raise your hand now and they will hand you cards. You can write out the question, they'll pick it up and give it to me, and I will be facilitating the question and answer that way.

I've had the pleasure of spending the day with Jaron, and I can tell you that he is a fascinating person, and a very generous person as well with his time. He met with some students earlier today and had a conversation with them, which was wonderful for him to do, and tonight he will be giving a lecture. We're lucky to have him here. He's a path-breaking computer scientist, a virtual reality pioneer; if I'm not mistaken, you coined the phrase "virtual reality." He's a composer and artist and author who writes on numerous topics, including technology, the social impact of technology, the philosophy of consciousness and information, internet politics, and the future of humanism. One of the things that we believe in so strongly at the Humanities Institute is that conversations about technology cannot simply be left to computer scientists (no offense to any computer scientists in the room, we love you too), but we also think that it is critical to have people who work in the humanities involved in those conversations, and this is part of why we are doing this tonight. He is the author of best-selling and award-winning books including You Are Not a Gadget: A Manifesto and Who Owns the Future?, and most recently he is the author of Ten Arguments for Deleting Your Social Media Accounts Right Now. His lecture tonight is entitled "How the Internet Failed and How to Recreate It." Please join me in welcoming Jaron Lanier.

[Applause]
Hey, how are you all? Any students here? Is this all... is this the adult crowd? Okay, good, good, excellent. I'm going to start with some music, because some of what I have to talk about is not the most cheerful stuff, because our times aren't universally cheerful lately, and music is how I survive anyway. Have any of you heard me play this thing? Okay.

[Music]

[Applause]
You all know what this is? You all know what that is, right? Yeah. It's called a khaen. It's from Laos. It's arguably the origin of digital information: if you look at it, it's got a parallel set of objects that are either off or on, and there are 16 of them in this one, a 16-bit number. They go back many thousands of years; they appear to be older than the abacus. In ancient times they were traded across the Silk Route from Asia and were known to the ancient Greeks and Romans. The Romans made their own copy, which was called a hydraulis, and it was a giant, egotistical Roman version that was so big it had to be run on steam. It was operated by teams of slave boys because, despite Hephaestus's best efforts, they didn't have computer AI yet, and the slave boys couldn't quite operate all the planks that open and close the holes in sync, and so they developed this crossbar system. We know about it because there's a surviving hydraulis, believe it or not.

That automation evolved along with the hydraulis in two directions. It turned into the medieval pipe organ, and there were player mechanisms on the earliest pipe organs, experimentally. It also turned into a family of string instruments that had various assists, like the early pre-clavichord instruments that eventually evolved into the piano. The notion of automating these things was always present, so there were always attempts to make player pianos. Around Mozart's time somebody made a non-deterministic player piano, which meant it didn't play exactly the same thing twice; Mozart was inspired by that and made some music that included dice rolls. But another person who was inspired was a guy named Jacquard, who used a similar mechanism to make a programmable loom, which in turn inspired somebody named Charles Babbage to make a programmable calculator, and his collaborator Ada to articulate a lot of ideas about software for the first time, and what it meant to be a programmer. And then in turn that all inspired a doomed fellow named Alan Turing to formalize the whole thing and invent the modern computer. So there's a direct line; this is it, this is the origin of digital information. Now of course it's not the only line, and if I were paid to be a historian I wouldn't have told you that story with such authority, and yet I'm not.
So this is a charming tale; it's a happy place to begin. It's a reminder that inventions can bring delight and joy, and it's part of why I'm a technologist. But unfortunately we have some matters to discuss here that are not quite so happy. We live in a world that has been darkening lately. It's not just a historical lensing effect where it feels worse than ever; it's bad in a new way. There's something weird going on, and I want to begin by trying to distinguish what's going on with our present moment of darkness as compared to earlier times, because this is tricky. It's almost impossible, I think, not to be embedded in one's moment in time; it's almost impossible not to have illusions due to where you're situated, right? And so I don't claim to have perfected the art of absolute objectivity at all. I'm struggling, and I'm sure that I don't have it quite right, but I want to share with you my attempts up to this point.
Now, the first thing to say is that by many extremely crucial measures we're living in spectacularly good times. We're the beneficiaries of a steady improvement in the average standard of living in the world. We've seen a lowering of most kinds of violence; we've seen an improvement in health in most ways and for most people. It's actually kind of remarkable; in many ways these are really good times, and those trend lines go way back over centuries. We've seen steady improvement; society has kind of gotten its act together, and we've been able to hold on to a few memories about things that didn't work, so we've tried new things. We've developed relatively more humane societies and relatively better science and better public health, and it's amazing, it's wonderful. It's something that's a precious gift to us from earlier generations that we should be unendingly grateful for, and I always keep that in mind. I always keep in mind that, just in our modern human-made world, just the fact that you can walk into a building and it doesn't collapse on us is a tribute to the people who made it, and the people who funded them and regulated them, and the people that taught them. There's this whole edifice of love that's apparent all the time that we can forget about, and during times that feel dark one of the antidotes is gratitude. Just in these simple things I feel extraordinary gratitude, and it reminds me of how, overall, there's been so much success in the project of science and technology. It's so easy to lose sight of that. And yet there is something really screwy going on that seems to me to be fairly distinct from previous problems. It's a new, sneaky problem we've brought upon ourselves, and we have yet to fully invent our way out of it.
So what exactly is going on? I think at the most fundamental level we've created a way of managing information among ourselves that detaches us from reality. I think that is the most serious problem. If the only problem was that our technology makes us at times more batty, more irritable, paranoid, more mean-spirited, more separated, more lonely; if that kind of problem was what we were talking about, that would be important, it would be serious, it would be important to address it. But what really scares me about the present moment is that I fear we've lost the ability to have a societal conversation about actual reality: about things like climate change, the need to have adequate food and water for peak population, which is coming, the need to deal with changes in the profile of diseases that are coming. So many issues are real. They're not just fantasy issues; they're existential, real issues, climate above all. And the question is: are we still able to have a conversation about reality or not? That becomes the existential question of the moment, and so far the way we've been running things has been pulling us away from reality. That scares me, and I think that's the core darkness that we have to address. We can survive everything else, but we cannot survive if we fail to address that.

Now, in the title of this lecture I promised a little bit of history, how the internet got screwed up or something like that, so I'll tell you a bit about that. But I want to focus more on trying to characterize this issue a little more tightly, and on trying to explain at least my thoughts on how to remedy it, and maybe some other people's thoughts too, to try to give you a bit of a sense of it.
Now, to begin with, one of the infuriating aspects of our current problem is that it was well foreseen in advance. That's the thing about it: nobody can claim that they were surprised, and I can point to many folks who were talking about this in advance. As good a starting place as any is E.M. Forster's story "The Machine Stops." Who here has read it? Okay, well, a few people. Terrifying, right? All right. "The Machine Stops" was written, I believe, in 1907; is that right? It might have been '09, but you know, a century and a decade ago or so. And it foresees a world remarkably like ours. This was written well before Turing, well before any of this stuff, I mean before there was computation, and it describes a world of people in front of their screens, interacting, social networking, doing search, and getting lost in a bunch of stupidity. Finally, when the machine experiences a crash, there's this calamity on Earth, and people have become so dependent on it that the loss of this machine becomes a calamity in itself. At the very end of the book people are crawling out from their screens and looking at the real world and saying, oh my god, the sun. It's a really amazing piece, because it's possibly the most prescient thing that's ever been written at all. It was written in part as a response to the techie utopianism of the day; it was a response to writers like H.G. Wells, saying: wait a second, these are still going to be people; we have to think about what this will mean to people. It's often the case that the first arriver on a scene has a clearer view and can have this kind of lucidity that later people find very difficult to achieve, and I think something like that happened very long ago.
But then, honestly, we could talk about Turing's last writings just before his suicide, where he was realizing that even though he played as great a role as anyone in defeating fascism, he hadn't defeated fascism at all, because here he was being destroyed for his identity. You all know the story of Turing by now; it's not obscure anymore, there was a movie and everything. For a long time I would speak to computer science classes and nobody knew about Turing's death at all, which is a scandal, but at this point I think everyone knows. And if you read his final writings, you read, in a way, an inner glow of somebody who does have some kind of a faith and some kind of a stronger center, but also this kind of sense of defeat. And by the way, it's within that context that he invented artificial intelligence, that he invented the Turing test, and this notion of this person, this non-person, who could transcend sexuality and be just this pristine, abstract, platonic being, and escape oppression, perhaps.
But anyway, so we have that. In the immediate early generation of computer scientists we had Norbert Wiener. Who here has read Norbert Wiener? I don't see a single young person's hand up. If you're young, if you're a student, and you haven't read any of these people, would you please correct that and read them? Seriously, you'll be so happy if you take this advice and actually read these people. So Norbert Wiener was one of the very first computer scientists, first generation, and he wrote books that were incredibly prescient about this. He wrote a book called The Human Use of Human Beings, and he pointed out that if you could attach a computer to input and output devices interacting with a person, you could get algorithms that would enact adaptive behavioral technologies to take control of the person, and he viewed this as an extraordinary moral failure that had to be avoided. And as a thought experiment at the end of the book, he says: well, you could imagine some kind of global system where everybody would have devices on them, attached to such algorithms, that would be manipulating them in ways they couldn't quite follow, and this would bring humanity to a disastrous end; but of course this is only a thought experiment, no such thing is feasible, because there wouldn't be enough bandwidth on the radio waves, and all this. You know, he then explained why it couldn't be done. And of course we built exactly the thing he warned about.
I could give many other examples. I worked on it myself: in '92 I wrote an essay describing how little AI bots could create fake social perception in order to confuse people and throw elections. Big deal. Lots of people were prescient about this; this wasn't a surprise. We knew, and that's the thing that's so depressing. There was a lot of good cautionary science fiction, there were a lot of good cautionary essays, there were good cautionary technical writings, and we ignored all of it. We ignored it all. How could that have happened? So, I would rather tell the story about how everybody was surprised; and a lot of the people who are entrepreneurs in Silicon Valley were surprised, but only because they don't like reading. Don't be like them.
So the social history of how everything screwed up is a reasonable way to talk about the particular way in which it screwed up, so I'm going to give it a try. The first thing to say is that in the generation of media technologists and artists and viewers from immediately before computation went pop, in like the '60s into the '70s into the '80s, some of the personality dysfunctions and some of the craziness were already apparent. We started to see this notion that anybody could be a celebrity, and people became obsessed with this idea that maybe I could be one, and maybe there's something wrong with me if I'm not: this kind of mass-media insecurity-obsession thing.

It's hard to trace the moment when this personality dysfunction really hit the mainstream and really started to darken the world. We were talking earlier, actually, about what moment to choose. I was thinking actually of the assassination of John Lennon, because here you had somebody who basically just wanted to be famous for being able to be a killer, a random killer, and that was a little new. If you look at crappy, evil people earlier, sure, there were some who wanted to be famous, I don't know, Bonnie and Clyde or something like that, but there are a few different things about them. One thing is that they were also stealing money; there was a kind of way in which they were, I don't know, part of a system; they had peers; they weren't typically total loners. The most typical profile of a really evil person before was actually a hyper-conformist: the typical Nazi was actually somebody who didn't want to stand out, who was just going with the flow and fully internalized the social milieu around them, because it felt normal. That's been a much more typical way that people have behaved appallingly in history. This sort of weird loner celebrity-seeker thing, I'm sure it existed before, but it started to become prominent.

I want to say something I've never said publicly before, but it's just been gnawing at me for many years. I'm old enough to have had some contact, back in the day, with both Marshall McLuhan and Andy Warhol, who were two figures who had a kind of loose way of talking about this early, but they didn't condemn it. They just stood aloof and said: oh, we're super smart and wise for being able to see this happening. And what they should have done is condemn it. It's actually really been bothering me; I've never said that before. I feel it should be said, because once again the first people on the scene sometimes have a kind of vision, and they should be judgmental about it, the way E.M. Forster was, and I feel like they maybe failed us morally at that point, because they saw it better than a lot of other people, maybe better than anybody at that time. Anyway, that's maybe not useful to say now, but at some point it has to be said.
Let's fast-forward a little bit. Computation starts to get cheap enough that it's starting to creep out of the lab; this is the early 1980s, and here we hit another juncture. There was this thing that happened, oh man, I was right there for it: it was the birth of the open, free software idea. There was a friend of mine named Richard Stallman. Any chance Richard's here? No, I guess not. Anyway, you never know. Anyway, Richard had this horrible... like, one day he just started saying, oh my god, my girlfriend's been killed, my lover's been killed. I said, oh my god, that's horrible. But what it really was was the software system he'd been working on for this kind of computer. What had happened is it had gone into a commercial mode with the companies; it was, I think, all about the Lisp machine, which probably nobody remembers anymore, a sort of early attempt to make an AI-specialized computer. And he was upset. He sort of melded his anger about this with a kind of anti-capitalist feeling and said: no, software must be free, it must be just a thing that's distributed, it can't be property, property is theft. And it really spoke to a lot of people. It melded with these other ideas that were going on at the time, and so it became this kind of feeling, I would say sort of a leftist feeling, that was profound and remains to this day. A lot of times, if somebody wants to do something useful with tech, they'll have to put in the word "open source"; lately they also have to put in "blockchain." So very typically it's open source, it's got blockchain, and then, you know, it's good.
So there was this other thing going on, which was this feeling that the purpose of computers was to hide, and that deserves a little bit of explanation. America has always had this divide, this red-blue divide or whatever; remember, it used to be a north-south divide, and we fought one of history's horrible wars over it once, the Civil War. So people on what we'd now call the red side of the divide were very upset. There was a Democratic president named Jimmy Carter, whom a few people other than me in the room might be old enough to remember, and there was a period when there was an Arab oil embargo, and we had long lines at gas stations, and he imposed a 55-mile-an-hour speed limit on the freeways, which a lot of people really hated, because they wanted to drive fast. And so this thing sprang up called CB radio. CB radios were these little analog radios you'd install in your car, and you'd create a false persona, a handle, and then you'd warn other people about where the police were hiding so that you could all drive fast collectively by sharing information. It was all anonymous; it could never be traced. And this thing was huge; it had as high a profile at the time as Twitter does today, probably. There were songs celebrating it; it was a really big deal.

But then on the left side of America, on the blue side, people also wanted to hide, and in that case there were two things going on. One is that the draft hadn't quite died down; it was still the Vietnam era, and that was just terrifying, because people didn't really believe in that war, and the idea of being drafted into this horribly violent war that appeared to have no good purpose just absolutely broke people's hearts and terrified people. So they wanted to hide, and a lot of people did. And then there was marijuana and the drug laws, and a lot of people really were hiding from those as well. So you basically had both red and blue America feeling like the number-one priority for freedom, for goodness, is to be able to hide from the government. So encryption and hiding and fake personas became this celebrated thing.
So in this milieu there was this idea that online networking, which didn't really exist yet (I mean, we had networks, but they were all very specialized and isolated; there wasn't a broad internet yet), there was this idea that everything would be free and open, everything would be anonymous, and it'd just be like this giant, black, weird place where you never knew anything, but you were also free and nobody could find you. Hmm. Okay, so that was the starting idea. Then there were a few other things that fed into it. Another thing was that there was a famous rock band called the Grateful Dead that encouraged people to tape their songs and didn't care about privacy and all this. There are all these different factors.
Now, all this was going on, and then simultaneously this other thing happened, which is that we started to have the figure of the glorified, practically superhuman tech entrepreneur. This was in the '80s, and these were figures like Steve Jobs and Bill Gates, people we still remember; of course Bill is still with us. And they were just worshipped; they were the coolest people ever. Well, around here in California people hated Bill, but they loved Steve. And there was this kind of interesting problem, which is that we didn't just like our tech entrepreneurs; we made them into sort of superhuman figures. The phrase "dent the universe" is associated with Jobs. It's this notion that there's this kind of messianic superpower to create the flow of reality, to direct the future, because you are the tech entrepreneur, and computation is reality, and the way we set these architectures will create future societies, and that'll ultimately change the shape of the universe once we get even greater powers over physics. There was just no end to the fantastical thinking. We were at the birth point for every form of absolute godlike power, you know, immortality and shape-shifting and every crazy thing. I was a little bit of that, I'm sorry to say; I kind of got a little off; I was pretty intense in the '80s myself. But anyway, there was this feeling that the entrepreneur just had more cosmic power than the average person.
Okay, so now here you have a dilemma that had been kind of sneaking up, and nobody had really faced it. On the one hand, everything's supposed to be free, everything's supposed to be anonymous, everything is supposed to be like this completely open thing; but on the other hand, we love our entrepreneurs, we worship our entrepreneurs, the entrepreneurs are inventing reality. So it should be clear that there's a bit of a potential conflict here: everything must be free, but we worship entrepreneurs. How do we do it? How do we do it? How do we do it? And so a set of compromises was created over the years that ended up giving us the worst of both sides of that, I would say. The story is long and interesting, but I'll give you just a few highlights.
One thing that happened is that when we finally got around to actually creating the internet, we decided it had to be super bare-bones. It would represent machines, because without having a number representing a machine you can't have an internet, but it wouldn't represent people. It didn't intrinsically have accounts built in for humans; it had no storage for humans built in; it had no transactions; it had no authentication; it had no persistence of information guaranteed; it had no historical function. It was super bare-bones: this thing connects with that thing, and that's all it did. And the reason why was that we were supposed to leave room for future entrepreneurs, those whom we worshipped.
You know, if I were about to say that the internet, as you know, was invented by Al Gore, some of you would laugh, and that's because it was a laugh line for a while; he was a Democrat, he was a vice president and before that a senator from Tennessee, and he was accused of overclaiming that he'd invented the internet on a TV show, which didn't happen. However, I think he should claim it; I think he did invent it. He didn't invent it technologically, not at all; all of the underlying stuff, which is called a packet-switched network, and a few other elements, existed in lots of instances from before. But he had this idea of throwing some government money into it to bribe everybody to become interoperable, so there'd just be one damn network and people could actually connect. That really was him, and he deserves credit for having done that, unless you think it was a terrible idea. But when that was happening, I remember having conversations about it, like: by creating this thing in such an incredibly bare-bones way, we are creating gifts of hundreds of billions of dollars for persons unknown, who will be required to fill in these missing things that everybody knows have to be filled in.
And then a little while later this other thing happened, which is that Tim Berners-Lee, who's great, came up with the World Wide Web protocol. And here he did this thing. Up to that point, all of the ideas for how to create shareable media experiences online, which are called hypertext (that's after Ted Nelson, who had come up with the first network design back in 1960; the "HT" in HTTP is his, for hypertext), had a core tenet: any time one thing on the internet pointed at something else, that other thing had to know it was being pointed at, so that there were two-way links. You always knew who was pointing at you, and the reason for that is that that way you could preserve context, provenance, history. You could create chains of payment, where if people mashed up stuff from somebody else, and that person had mashed up stuff from somebody else, you could make payments that would propagate back to pay everybody who contributed. So if you wanted to have an economy of information, you could; the information wouldn't be dropped. But Tim just had one-way links: you could point at somebody and they'd have no idea that they were being pointed at. And the reason for that is that actually doing two-way links is genuinely a pain in the butt; it's just more work. If you do one-way links, the whole thing can spread a lot faster; anybody can do it; it's just a much easier system. And that embedded in it this idea of virality, or meme-ness, where whatever can spread the fastest is what wins. So it was a quantity-over-quality thing, in my view. That was another thing that happened.
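To make that design difference concrete, here is a minimal Python sketch (an illustration added for this edition, not part of the talk; the class and method names are hypothetical). With one-way links the target keeps no record of its referrers, while a two-way design also stores backlinks, which is what would make provenance, context, and payment chains possible, at the cost of extra bookkeeping on every link:

# Minimal sketch contrasting one-way (web-style) links with two-way
# (Nelson-style) links. Names are hypothetical, for illustration only.

class OneWayWeb:
    """Web-style linking: the target never learns who points at it."""
    def __init__(self):
        self.links = {}                      # source -> set of targets

    def link(self, source, target):
        self.links.setdefault(source, set()).add(target)
        # Nothing is recorded on the target's side, so provenance is lost.

class TwoWayWeb:
    """Hypertext in the earlier sense: every link is also registered with the target."""
    def __init__(self):
        self.links = {}                      # source -> set of targets
        self.backlinks = {}                  # target -> set of sources

    def link(self, source, target):
        self.links.setdefault(source, set()).add(target)
        self.backlinks.setdefault(target, set()).add(source)

    def who_points_at(self, target):
        # Backlinks are what would let you reconstruct context, provenance,
        # and, in principle, chains of attribution or payment.
        return self.backlinks.get(target, set())

if __name__ == "__main__":
    web = TwoWayWeb()
    web.link("essay.html", "source-article.html")
    print(web.who_points_at("source-article.html"))   # {'essay.html'}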
Another thing that happened didn't come from Silicon Valley. In the late '80s, people on Wall Street started to use automated trading; the first flash crash from out-of-control trading algorithms was in '89. And they figured out something very basic, although E.M. Forster had described exactly this problem so much earlier, which is that if you had a bigger computer than everybody else, and it was more central, getting more information, you could calculate ahead of everybody and gain an information advantage; and in economics, information advantage is everything. So if you had just a little bit more information than everybody else, you could just turn that into money. It wasn't really a new insight, but it hadn't actually been implemented before then. Shortly after that, a company called Walmart realized they could apply that not just to financial instruments, to investments, but to the real world, and they created a software model of their supply chain and dominated it. They could go to anybody who was involved somewhere in giving them products and figure out what their bottom line was, so they could negotiate everybody down; they knew who everybody's competitor was; they went into every negotiation with superior information, and they built this giant retail empire on information superiority. That all happened before anybody in Silicon Valley started doing it.
Okay, now fast-forward to the birth of Google. So you have these super-bright kids, Sergey and Larry; some of the students I talked to today on campus here remind me of what they were like at the same age: super bright, super optimistic, idealistic, actually focused. And they were backed into a corner, in my view. On the one hand, the whole hacker community, the whole tech community, would have just slammed them if they did anything other than make everything free; but on the other hand, everybody wanted them to be the next Steve Jobs, the next Bill Gates. That was practically a hunger: we want our next star. And the only way to combine the two things was the advertising model. The advertising model would say: you'll get everything for free; as far as you're concerned, your experience is that you just ask for what you want and we give it to you. Now, the problem with that is that because it's an advertising thing, you're actually being observed, your information is being taken, you're being watched, and there's a true customer, this other person off to the side, whom at first you were always aware of, because you could see their little ads, you know, like your local dentist or whatever. It was cute at first; it was harmless at first.
And unfortunately... if they had come up with this thing after, I don't know, Moore's law had ended and computers were as fast as they were ever going to get, and we'd established a whole regulatory and ethical substrate for computation and everything, maybe it could have worked. But instead they did it in a period where there was still a whole lot of Moore's law to happen. So all the computers got faster and faster, cheaper and cheaper, more and more plentiful, more and more storage, more connection; the algorithms got better and better; machine learning kind of started to work a little better; a lot of these algorithms kind of figured it out; we had enough computation to do experiments and get all kinds of things working that hadn't worked before, all kinds of little machine vision things (I sold them a machine vision company, actually), and the whole thing kind of accelerated, and what started out as an advertising model turned into something very different. And so here we get into a description of at least my perception of the state that we're in right now.
So I mentioned earlier that Norbert Wiener had described what he viewed as a potentially horrible outcome for the future of computation, where you'd have a computer in real time observing a person with sensors and providing stimulus to that person in some form, with displays or other effectors, and implementing behavior-modification feedback loops in order to influence the person; and if that was done globally, it would detach humanity from reality and bring our species to an end. That was the fear back in the '50s. Now, unfortunately, this innocent little advertising model, which was supposed to address both the desire to have everything be this Wild West open thing and the desire to have entrepreneurs despite everything being free, landed us right in that pocket. That's exactly where we went.
Now, I should say a bit about behaviorism, because that's another historical thread that led to where we are. Behaviorism is a discipline of reducing the number of variables in the training of an organism so that you can control them rigorously and reproduce effects. So, let's say, if you're whispering into your horse's ear while you're training your horse, that's not behaviorism. If you're whispering into your kid's ear, even if you do offer some treats once in a while to reward behavior, that's not behaviorism; it has elements of it. But hardcore behaviorism reduces the variables and says: look, what we want to do is isolate. We want to say, here's this organism, it's in a box; sometimes they're called Skinner boxes, remembering B.F. Skinner, one of the famous behaviorists. And we want to say: if the creature (person, human, whatever) does a certain thing you want, you give the person a treat; if it does something you don't want, you give them a punishment; typically maybe candy and an electric shock. The timing and the occurrence of these things is guided by an algorithm, and by refining the algorithm you discover how to change behavior patterns.
This science of studying behavior, behaviorism, yielded surprises, really interesting surprises, very early on. The first celebrity behaviorist was probably Pavlov. You've all heard of Pavlov, I'm sure, and he demonstrated famously that he could get a dog to salivate upon hearing a bell, whereas previously the dog had salivated upon being given food and hearing the bell. So he was able to create a purely symbolic-seeming stimulus to replace the original concrete one. That's quite important, because in many areas today where behavior is modified and addictions are created, there are only abstract stimuli. This is true, for instance, for gambling; modern gambling is based on this. So are little games like Candy Crush, where there are pictures of candy instead of real candy. Now, I have no doubt that someday there'll be some Facebook or Google hovercraft, you know, a drone over your head, that drops real candy and electric shocks on your head, but for the moment we're in this symbolic realm that Pavlov uncovered.
Another amazing result is that you might think, naively, that simply providing punishment and reward as reliably and as immediately as possible would be the most effective way to change behavior patterns, but actually that's not true. It turns out that adding an element of randomness makes the algorithms more effective. Now, just to state the obvious, nobody really understands the brain as yet, but it appears that the brain is constantly in a natural state of seeking patterns, of trying to understand the world. So if you provide a slightly randomized feedback pattern, it doesn't confuse or repel the brain; instead it draws the brain in. The brain decides there must be something more to understand, there must be something more, and gradually you're drawn in more and more and more. And so this is why the randomness of when you win at gambling is actually part of the addiction algorithm; that's part of what makes it happen.
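To illustrate the idea, here is a tiny Python sketch (added for this edition, not from the talk; the probabilities and trial counts are arbitrary) comparing a predictable reward schedule with a variable-ratio one that pays out at the same average rate. The two deliver roughly the same number of rewards in the long run; the unpredictable one is the kind behaviorists found most compelling, and it is the structure that gambling shares with the feedback loops described here.

# Sketch of a fixed versus a variable-ratio reward schedule.
import random

def fixed_schedule(presses, every_n=5):
    """Reward arrives predictably on every Nth action."""
    return [i % every_n == 0 for i in range(1, presses + 1)]

def variable_ratio_schedule(presses, p=0.2, seed=42):
    """Reward arrives unpredictably, at the same average rate (p = 1/5)."""
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(presses)]

if __name__ == "__main__":
    fixed = fixed_schedule(100)
    variable = variable_ratio_schedule(100)
    # Roughly the same long-run payout, very different subjective experience:
    # the unpredictable schedule keeps the pattern-seeking brain guessing.
    print(sum(fixed), sum(variable))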
Now, in the case of social media, what happens is that the reward is when you get retweeted, or you go viral, something like that. The term of art in Silicon Valley companies is usually a "dopamine hit," which is not an entirely accurate description, but it's the one that's most commonly used for when you have a quick rise of a positive reward. But just as the gambler becomes addicted to the whole cycle, where they're losing more often than they win, a Twitter addict gets addicted to the whole cycle where they're most often being punished by other people who are tweeting, and they only get a win once in a while, right? It's the same algorithm. And indeed, one of the side effects... so, in the trade, the terminology we use is "engagement." We have algorithms that drive engagement, and we hire zillions of people with recent PhDs from psych departments; there's a whole program, a program called Persuasive Technology at Stanford, where you can go get a PhD in this, and then you get hired by some tech company to drive engagement. But it's really just a sanitized word for addiction. So we drive addiction using a variety of these algorithms, and we can study them more than the classical behaviorists did, because we can study a hundred million instances at once, and we can put out a hundred million variations on all kinds of people and correlate them with data for all those people, and then cycle and cycle and cycle. The algorithms can find new pockets of efficacy; they can tweak themselves until they work better, and we don't even know why. They're far ahead of any ability we have to really keep up with them and try to interpret exactly why some things work better than other things.
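Mechanically, the "hundred million variations" loop described here behaves like large-scale A/B testing or a multi-armed bandit. The following epsilon-greedy sketch is an illustration added for this edition (the variant names and response rates are invented, and this is not any company's actual code): it simply shifts traffic toward whichever variant gets the most reaction, with no model of why that variant works.

# Toy epsilon-greedy "engagement" optimizer over two invented variants.
import random

VARIANTS = {"calm_headline": 0.02, "outrage_headline": 0.05}   # invented reaction rates

def optimize_engagement(trials=10000, epsilon=0.1, seed=0):
    """Mostly exploit the best-performing variant so far, occasionally explore."""
    rng = random.Random(seed)
    shown = {v: 0 for v in VARIANTS}
    clicks = {v: 0 for v in VARIANTS}
    for _ in range(trials):
        if rng.random() < epsilon or not any(shown.values()):
            choice = rng.choice(list(VARIANTS))                              # explore
        else:
            choice = max(shown, key=lambda v: clicks[v] / shown[v] if shown[v] else 0.0)  # exploit
        shown[choice] += 1
        if rng.random() < VARIANTS[choice]:
            clicks[choice] += 1
    return shown

if __name__ == "__main__":
    # Most impressions end up on whichever variant produces more reaction.
    print(optimize_engagement())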
Now, even so, it's important to get this right: the effect is, in a way, not that dramatic. Facebook, for instance, has published research bragging that it can make people sad without their realizing that they were made sad by Facebook. Now, by the way, you might wonder why Facebook would publish that; wouldn't they want to hide that fact? It sounds pretty bad. But you have to remember that you're not the customer of Facebook. The customer is the person off to the side. We've created a world in which any time two people connect online, it's financed by a third person who believes they can manipulate the first two. So to the degree Facebook can convince that third party, that mysterious other who's hoping to have influence, that they can have some mystical, magical, unbounded, sneaky form of influence, then Facebook makes more money. That's why they published it. And I've been at events where this stuff is sold by the various tech companies, and there's no end to the brags and the exaggerations when it comes to telling the true customers what their powers are; it's very different from their public stance.
But at any rate, the darkness of this all is that when you use this technique to addict people (and we haven't even gotten to the final stage of influencing their behavior patterns; we're still just at the first stage of getting them addicted), you create personality dysfunctions associated with addiction, because it is a form of behavioral addiction. If any of you have ever dealt with somebody who's a gambling addict, the technical qualities of gambling addiction are similar to the technical qualities of social media addiction. Now, I was just saying before that we have to get this right and understand the degree of awfulness here, because it's actually kind of slight, but very consistent and distributed. A gambling addiction can be really ruinous; somebody can destroy their life and their family. A social media addiction can be ruinous, as we've seen from unfortunate events in just the last few days. But more often there's a statistical distribution where a percentage of people are slightly affected and have their personalities slightly changed. So what will happen is that some percentage (in some of the studies I've seen published, maybe it's like 5 percent) show something like a 3 percent change in personality, or something like that, and this is over hundreds of millions of people, or even over billions. So it's a very slight, very distributed statistical effect on people, with just a few who are really dramatically affected. But the problem with that is that it compounds, like compound interest: a slight effect that's persistent, consistent, repeated, starts to darken the whole society.
So let's talk a little bit about the addictive personality that's brought out by these things. The way I characterize it is: it becomes paranoid, insecure, a little sadistic; it becomes cranky. Now, why those qualities? I have a hypothesis about this, and here I'm hypothesizing a little ahead of experimental results in science, so I want to make that clear: this is a conjecture, not something that I can cite direct evidence for. But all the components of it are well studied, so it's just putting together things that are known, and I think it should therefore be worthy of public discussion.
You can very roughly bundle emotional responses from people into two kinds of bins, one we'll call positive and the other we'll call negative. The positive ones are things like affection, trust, optimism about a person, belief in a person, faith in a person, comfort with a person, relaxing around a person, all that kind of stuff: the qualities you want to feel in yourself when you're dating somebody, let's say. The negative ones are things like fear, anger, jealousy, rage, feeling aggrieved, feeling a need for revenge, just all this stuff. Now, in the negative bin, a lot of these emotions are similar to another bin that's been described over many years, which is the startle responses, or the fight-or-flight responses. And the thing about these negative ones is that they rise quickly and they take a while to fall. So you can become scared really fast, you can become angry really fast, and the related positive emotions tend to rise more slowly but can drop quickly; they have the reverse time profile. It takes a long time to build trust, but you can lose trust very quickly. It takes a long time to become relaxed, compared to how quickly you can become startled, scared, nervous, and on edge. Now, this isn't universally true; there are some fast-rising positive emotions (I just talked about the dopamine hits earlier, so that's an exception), but overall the fast-rising ones are more often the negative ones.
Now, these algorithms that are measuring you all the time, in order to adapt the customized feeds that you see, and the designs of the ads that you see, and just everything about your experience: they're watching you, watching you, watching you, in a zillion ways, expanding all the time. Now they're following your voice tone and trying to discern things about your emotions based on pure correlation, without necessarily much theory behind it; they're watching your emotions as you move; they're watching your eyes, your smile; and of course they're watching what you click on, what you type, all that. And the thing is, if you have an emotional response that's faster, the algorithms are going to pick up on it faster, because they're trying to get as much speed as possible; they're rather like high-frequency trading algorithms in that sense. We intrinsically, in Silicon Valley, try to make things that respond quickly and act quickly, and so if you have a system that's responding to the fast-rising emotions, you'll tend to catch more of the negative ones; you'll tend to catch more of the startled emotions.
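As a toy illustration of that conjecture (added for this edition; the time constants and threshold are invented numbers, not measurements), suppose a feedback system acts on whichever emotional signal crosses a detection threshold first. If negative, startle-like responses rise faster than positive ones, the fast-rising signal wins the race nearly every time, even when both would eventually reach the same level.

# Toy race between a fast-rising and a slow-rising signal.
import math

def first_to_threshold(tau_negative=0.5, tau_positive=5.0, threshold=0.6, window=2.0, step=0.01):
    """Report which exponential-rise signal crosses the threshold first within the window."""
    t = 0.0
    while t < window:
        negative = 1 - math.exp(-t / tau_negative)   # fast-rising, startle-like response
        positive = 1 - math.exp(-t / tau_positive)   # slow-rising response (e.g. trust)
        if negative >= threshold:
            return "negative"
        if positive >= threshold:
            return "positive"
        t += step
    return "none"

if __name__ == "__main__":
    # A speed-hungry feedback loop registers the fast-rising signal first.
    print(first_to_threshold())   # -> 'negative'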
Now here's the thing. If you look at the literature and ask the broad question (if we accept this idea of binning emotions into positive and negative feedback emotions): as far as behavior change goes, is positive or negative more influential on human behavior? The answer you'll get is a really complex patchwork. Behaviorism has been around for a long time, so there are a lot of studies you can read; it's hard to know exactly how high-quality all the research is, especially the older stuff, but in general you can find lots of examples of positive feedback working better than negative, or vice versa, and it's all very situational. A lot of it is very subtle, depending on how things are framed for people, all kinds of stuff. But overall, what I perceive from the literature is approximate parity between positive and negative. But if you ask which emotions the algorithms will pick up on when they're trying to get the fastest possible feedback, it's unquestionably true that the negative ones are faster. All right, so what you see is the algorithm suddenly flagging: oh my god, I got a rise out of that person, let's do some more of that, because we're engaging that person. And that stuff tends to be the stuff that makes them angry, paranoid, revengeful, insecure, nervous, jealous, all these things. And so what you see is this feedback cycle where a certain kind of dysfunctional personality trait is brought out more and more, and people with similar dysfunctional personalities are introduced to each other by the systems.
So what does this personality look like? Well, the addiction personality online: I'll name three people who have recently displayed it rather blatantly. One is the president, whom I'm just not going to bother to name because I'm sick of the idiot. The second is Kanye. The third is Elon Musk. Three people all displaying somewhat overlapping, in my view, personality distortions. Now, I've had slight contact with two of the above three; I'll let you guess which two they are. Well, no, I'll say one of them: it's Trump. I've met Trump a few times over a very long period of time. I've never known him well; I've never had a real conversation with him. But I will say that in the '80s and '90s he didn't seem like somebody who was desperate for you to like him; he didn't seem like somebody who was nervous about what you thought about him; he didn't seem like somebody who was itching for a fight; he didn't seem like somebody who was looking for trouble and thought it would help him. He really just didn't seem like that at all. I think he was still a con man, but he was kind of like a happy con man; it was like a different persona.

And remember how I said before that the gambling addict is addicted to the whole cycle, where they lose a lot before they win? I think in the same way the Twitter addict is addicted to a cycle where they bring a lot of wrath upon themselves and have to deal with a lot of negative feedback before they get positive feedback, or, you know, there's a mix; it's very much like the losing and winning in gambling. And so I think what's happened is he's gotten himself into this state where he's like this really nervous narcissist, and this is kind of weird: this personality of the person who's asking, do they really like me? I think he likes me. This kind of weird, nervous, narcissistic, insecure person has not been a typical authoritarian personality in the past, and yet it's working now, and I suspect the reason why is that a lot of the followers who respond to it see themselves in that insecurity, which is really strange. I mean, if you think about this, in the past the celebrity figure or the leader typically wanted to display a personality that was kind of invulnerable and aloof and self-sufficient, uncaring about whether they're liked or not, and yet that's not what's going on here. It's really strange.

And then there's this issue of lashing out. It's as if, because you know that there's a certain amount of punishment that goes with that reward, you actually seek out some of the punishment, because that's actually a part of your addiction. So if you're a gambling addict, you actually make some stupid bets; it's true, it's just what happens. So you have Elon Musk calling this guy who tried to rescue kids in a cave in Thailand a pedophile, out of nowhere. All right, same thing: Twitter addiction, dysfunctional personality. Kanye, I'm not even, you know... but basically you have people who are kind of degrading themselves and making themselves into fools, but in a funny way, in the current environment, there's a whole world of addicted fans who actually relate to it, see themselves in it, and it works. It works for the first time in history, and it's really strange; it's a really weird moment.
56:20
talking about the problem of losing
56:22
touch with reality
56:23
now as you heard I have a book called
56:28
ten arguments for deleting your social
56:29
media accounts right now and it goes
56:31
through a lot of reasons to delete your
56:34
social media of which the closest to my
56:37
heart is actually the final one which is
56:39
a spiritual one it’s about how I think
56:41
that Silicon Valley is kind of creating
56:44
a new religion to replace old religions
56:47
and even atheism with this new faith
56:49
about AI and the superiority of tech and
56:54
how we’re creating the future and all
56:55
this and I feel that that religion
56:57
is an inferior one and people are being
56:59
drawn into it through practice so that
57:00
that tenth argument is the one I care
57:02
most about but what I want to focus on
57:04
here is the existential argument which
57:06
is the loss of reality so the problem we
57:11
have here is that we’ve created so many
57:15
addicts so many people who are on edge
57:17
that they perceive essentially politics
57:24
before they perceive nature they
57:26
perceive the world of human
57:31
recriminations before they perceive
57:33
actual physical reality. Now, I presented a
57:36
theory, it’s in various of my books,
57:39
called the pack switch, which I will
57:42
recount to you now that’s a way of
57:44
thinking about this it goes like this
57:48
there’s some species that are
57:51
intrinsically social like a lot of ants
57:54
there’s some species that tend to be
57:57
solitary like a lot of octopuses some of
58:01
my favorite animals there are some
58:04
species that can switch that can be
58:07
either solitary or social depending on
58:11
circumstances
58:13
and a famous one that we refer to in
58:16
mythology and in our storytelling is the
58:18
wolf you could have a wolf pack or you
58:21
can have a lone wolf same wolves
58:24
different social structures different
58:26
different epistemology I would say when
58:30
you’re a lone wolf you’re responsible
58:33
for your own survival you have to pay
58:36
attention to your environment where will
58:38
you find water where will you find prey
58:40
how do you avoid being attacked where do
58:43
you find shelter how do you survive bad
58:44
weather you are attached to reality like
58:47
a scientist or like an artist you are
58:50
naturalist when you’re in a wolf pack
58:54
different story now you have to worry
58:57
about your peers they’re competing with
58:59
you you have to worry about those above
59:01
you in the pack, will they trash you, can
59:04
you get their station, you have to
59:06
piss on those below you because you have
59:08
to maintain your status but you have to
59:11
unify with all your fellow pack members
59:13
to oppose those other packs over there
59:15
the other so all of a sudden social
59:19
perception and politics has replaced
59:22
naturalism politics versus naturalism
59:25
those are the epistemologies of the lone
59:28
wolf and the wolf pack people are also
59:33
variable in exactly this way we can
59:36
function as individuals or we can
59:38
function as members of a pack now what
59:43
happens is exactly what I am at least
59:45
hypothesizing happens with wolves it’s a
59:47
kind of interesting interaction
59:48
interacting with scientists who actually
59:50
study wolves because I haven’t actually
59:52
spent that much time with wolves just a
59:53
little bit so they’re people who know a
59:54
lot more about wolves and let’s just say
59:57
my little portrayal is overly simplified
59:59
but just I mean I’m it’s like a little
60:03
cartoon but I hope it functions to
60:04
communicate so when we are thinking as
60:11
individuals we have a chance to be
60:13
naturalist so we have a chance to be
60:14
scientists and artists we have a chance
60:16
to perceive reality uniquely from our
60:20
own unique perspective a diverse
60:22
perspective as compared to everyone else’s,
60:23
one that
60:24
we can then share. When we join into a
60:28
pack mentality we perceive politics so
60:32
what happens on social media is because
60:34
the algorithms are trying to get a rise
60:37
out of you to up your engagement and
60:39
make you ripe for receiving behavior
60:42
modification you’re constantly being
60:44
pricked with little social anxiety rage
60:50
irritations all these little things all
60:53
these little status worries is my life
60:55
as good as that person’s life am i
60:57
lonely relative to all these people what
60:59
do they think of me am i smart enough am
61:02
i getting enough attention for this why
61:03
didn’t people care about the last thing
61:05
I did online blah blah blah blah blah
61:06
and there’s just like it’s not that any
61:08
of these things by themselves are
61:10
necessarily that serious but
61:11
cumulatively what they’re doing is
61:13
they’re shifting your mindset and
61:16
suddenly you’re thinking like a pack
61:18
creature, the pack switch is set,
61:20
and you’re thinking politically and when
61:23
you think politically you lose
61:25
naturalism you know I think both modes
61:29
of being have a place I think being I
61:32
think if people exclusively all the time
61:34
stayed in the lone setting that would be
61:37
bad for society that would be bad for
61:39
relationships would be bad for families
61:42
and so on however there needs to be a
61:45
balance there needs to be a healthy way
61:47
of going back and forth between them and
61:49
not getting lost in one or the other and
61:52
so the hypothesis I’d put forward is
61:54
that we’re giving people so many little
61:57
anxiety-producing bits of feedback that
61:59
we’re getting them into this pack
62:00
mentality where they’ve become hyper
62:04
political without maybe even quite
62:06
realizing it and losing touch with
62:08
reality. Now, when I say losing touch with
62:10
reality that demands some evidence
62:14
because you might say well are we less
62:16
in touch than in the past
62:18
so remember at the start after the music
62:23
I gave you what I consider to be sort of
62:27
a positive framing and a lot of good
62:29
news absolute poverty has been reduced
62:32
absolute levels of violence have been
62:33
reduced absolute levels of disease have
62:35
been reduced and so
62:36
there are many ways in which we’re
62:38
bettering ourselves but there’s this
62:40
other thing going on which is bad enough
62:44
that it might be the undoing of all of
62:47
that and that is this loss of reality
62:50
now here’s what I want to point out I I
62:53
travel around a fair amount and I
62:55
visited places that would appear on the
62:58
surface to have very little in common
63:00
I’ll mention some of them Brazil Sweden
63:03
Turkey Hungary the United States what do
63:08
they all have in common what they have
63:10
in common is the rise not just we
63:14
sometimes characterized it as right-wing
63:16
populist politics I don’t think that’s
63:20
quite right I think what we actually are
63:23
seeing is the rise of cranky paranoid
63:30
unreal politics I think that’s a better
63:34
characterization and it’s really
63:36
remarkable how it’s all happened at
63:38
about the same time and it’s happened in
63:40
some poor parts of the world too it’s
63:41
not even. It’s like, so, you could
63:44
say, well, it’s something about aging
63:45
populations, all the cranky old people, I
63:47
have, you know, freshmen will tell me that
63:50
to get our minor, but you know there are
63:52
countries that are very young that have
63:54
that problem, Turkey, Brazil. It’s like, oh,
63:56
it’s diverse countries it’s that we
63:59
can’t have democracies unless they’re
64:01
they’re ethnically monolithic or
64:03
something brazil’s diverse oh it’s it’s
64:08
inequality we can’t have the problem is
64:11
that societies are just losing their
64:14
social safety net well you know Sweden
64:17
Germany not really they might have
64:19
anxiety about actually you know it’s
64:22
it’s not so all these places are really
64:25
different they have different histories
64:26
and yeah they’ve all had similar
64:29
dysfunctions and so you have to say well
64:31
what’s in common between all of them and
64:33
you can say something vague well they
64:35
all have anxiety about the future and
64:36
this that’s true but the obviously they
64:38
have in common is that people have moved
64:40
to this mode of connecting through
64:42
manipulative systems that are designed
64:44
for the benefit of third parties who
64:45
hope to manipulate everybody sneakily
64:47
that seems like the clear thing they all
64:50
have in common
64:51
Brazil recently I mean all the same crap
64:55
that we saw was happening on whatsapp
64:58
which is the big connector down there
65:00
and Facebook I think to their credit try
65:04
to help a little bit but they couldn’t
65:05
really do it cuz the whole system is
65:07
designed to be manipulative you know
65:08
it’s if if if you have a car – that’s
65:12
designed to roll it’s very hard to say
65:14
well we won’t let it roll very much I
65:16
mean whatever it does it’ll be rolling
65:17
if you have a manipulation system and
65:19
that’s what it’s designed for you can
65:21
try to get it to roll more slowly or
65:23
something but all it can really do is
65:24
manipulate that is what these things are
65:26
optimized for that’s what they’re built
65:28
for that’s how they make money
65:29
every penny of the many billions of
65:32
dollars that some of these companies
65:33
have taken in that are totally dependent
65:35
on this and of the big companies the
65:37
only ones really totally dependent are
65:39
Google and Facebook, or almost totally
65:41
dependent. It all comes from people who
65:43
believe they’ll be able to sneakily
65:44
influence somebody else by paying money
65:46
via these places that is what they do
65:47
there’s just no other way to describe it
65:50
and so you have the typical thing that
65:57
happens is that the algorithms there
66:01
isn’t any information in them that comes
66:03
from like angels or extraterrestrials it
66:06
all has to come from people so people
66:07
input some information and often it’s
66:09
very positive at first. You know, a
66:11
lot of the starter information that goes
66:13
into social networks ranges from
66:16
extremely positive and
66:18
constructive to just neutral and nothing
66:21
much so there might be people who are
66:23
trying to better themselves maybe
66:24
they’re trying to help each other with
66:26
health information or something like
66:27
that
66:28
then all this information starts they’ll
66:31
say what we’re gonna forward some of
66:33
this information to this person in that
66:34
person we’ll try a 10 million times and
66:36
we’ll see if we get a rise from anybody
66:39
that ups their engagement now the people
66:41
who will be engaged quote-unquote
66:43
engaged are the ones who dislike that
66:45
information so all of a sudden you’re
66:47
getting juice from finding exactly the
66:49
horrible people who hate whatever the
66:50
positive people started off with and so
66:53
this is why you see this phenomenon over
66:55
and over again where whenever somebody
66:57
finds a great way to use a social
66:58
network they have this
66:59
initial success and then it’s echoed
67:01
later on by horrible people getting even
67:03
more mileage out of the same stuff. So
67:04
you start with an Arab Spring and then
67:06
you get Isis getting even more mileage
67:08
out of the same tools you start with
67:10
black lives matter you get these
67:12
horrible racists, these horrible people
67:16
who are just blackening America, getting
67:18
even more mileage out of the same tools
67:20
it just keeps on happening and by the
67:24
way you start with me too and then you
67:26
get incels and Proud Boys and whatever
67:28
the next stupid things gonna be because
67:30
the algorithms are finding these people
67:32
as a matter of course introducing them
67:34
to each other and then putting them in
67:35
feedback loops where they get more and
67:36
more incited without anybody planning it
67:39
there’s no evil person sitting in a
67:41
cubicle intending this I or at least I
67:44
would be very surprised to find somebody
67:46
like that I know a lot of the people in
67:49
the different places and I just don’t
67:51
believe it I believe that we backed
67:54
ourselves into this weird corner and
67:56
we’re just not able to admit it and so
67:58
we’re just kind of stuck in this stupid
68:00
thing where we keep on doing this to
68:01
ourselves so what you end up with is
68:06
electorates that are driven you have
68:09
like enough of a percentage of people
68:11
who are driven to be a little cranky and
68:14
paranoid and a little irritated and they
68:17
might have legitimate reasons I’m not
68:18
saying that they’re totally disconnected
68:20
from real life complaints but their way
68:22
of framing it is based on whatever the
68:24
algorithms found could be forwarded to
68:26
them that would irritate them the most
68:27
which is a totally different criterion
68:29
than reality. So whatever it is, if
68:34
it’s in the case of the synagogue
68:37
shooter it was one set of things, in
68:40
the case of the pipe bomber guy it was
68:41
another thing, in the case of the guy who
68:43
set up the... but it’s all similar, it’s all
68:45
part of the same brew of stuff that
68:46
algorithms forward now in some cases the
68:51
algorithms might have tweaked the
68:53
messages a bit because the algorithms
68:54
can do things like play with fonts and
68:56
colors and timing and all kinds of
68:57
parameters, to try to see if those have a
68:59
slight effect on how much of a rise they
69:01
can get, but typically they come from
69:03
people who are also
69:05
just trying to get as much impact as
69:08
possible and I think what I think what’s
69:11
happened is we’ve created a whole world
69:13
of people who think that it’s honorable
69:18
to be a terribly socially insecure
69:21
nitwit who feels that the world is
69:23
against them and it’s desperate to get
69:24
attention in any way and if they can get
69:26
that attention that’s the ultimate good
69:28
and the president acts that way a lot of
69:31
people act that way
69:33
that’s what Musk was doing, and I could name
69:36
many other figures. And I think what
69:38
happens is these people become both the
69:40
source of new data that furthers the
69:41
cycle and of course it drives them and
69:44
so that there’s sort of multiple levels
69:48
of evil that result from this the
69:50
obvious one is these horrible people who
69:54
make our world unsafe and make it make
69:57
our world violent and break our hearts
69:59
and just keep on doing it over and over
70:01
again, and there’s just this sense that
70:03
just random people are self-radicalizing
70:06
and turning themselves into
70:08
the most awful version of a human
70:09
imaginable but there aren’t that many of
70:12
them in absolute numbers, and as I said
70:14
earlier, in terms of absolute amounts of
70:17
violence there’s actually an overall
70:18
decrease in the world despite all this
70:20
horrible stuff with some notable
70:22
exceptions like with ISIS in the
70:24
Middle East and so forth
70:25
but overall you know actually that’s
70:27
that’s true however the second evil is
70:31
the one that I think actually threatens
70:33
our overall survival and that is the one
70:35
of gradually making it impossible to
70:38
have a conversation about reality it’s
70:41
really become impossible to have a
70:44
conversation about climate it’s become
70:46
impossible to have a conversation about
70:48
health it’s become impossible to have a
70:51
conversation about poverty it’s become
70:53
impossible to have a conversation about
70:55
refugees it’s become impossible to have
70:58
a conversation about anything real it’s
71:03
only become possible to have
71:05
conversations about what the algorithms
71:07
have found upsets people and on the
71:09
terms of upsetting because that’s the
71:11
only thing that’s allowed to matter
71:15
and that is terribly dark that is
71:19
terribly dark and terribly threatening
71:21
and what I the scenario I worry about is
71:25
I mean it’s conceivable that some sort
71:30
of repeat of what happened it’s hard for
71:34
me to even say this but some sort of
71:35
repeat of what happened in the late 30s
71:37
in Germany could come about here I can
71:39
imagine that scenario I can imagine it
71:42
vividly because my own grandfather
71:43
waited too late in Vienna and my mother
71:46
was taken as a child and survived the
71:49
concentration camp so I feel it’s very
71:51
keenly having a daughter myself and yet
71:56
I don’t think that’s the most likely bad
71:58
scenario here I think the more likely
72:00
bad scenario is that we just put up with
72:03
more and more shootings more and more
72:07
absolutely useless horrible people
72:10
becoming successful and one in one
72:12
theatre or another whether politicians
72:13
or company heads or entertainers or
72:17
whatever and gradually we don’t address
72:21
the climate gradually we don’t address
72:24
where we’re gonna get our fresh water
72:26
from gradually we don’t address where
72:28
we’re gonna get a new antibiotics from
72:30
gradually we don’t wonder how we’re
72:33
gonna stop the spread of viruses vaccine
72:37
paranoia is another one of these stupid
72:39
things that spread through these
72:41
channels gradually we see more and more
72:43
young men everywhere turning themselves
72:45
into the most jerky version of a young
72:47
man, sort of various weenie
72:51
supremacy movements under different
72:53
names, from, you know, Gamergate to
72:57
incels to alt-right to Proud Boys to
73:00
whatever this is going to be, like this,
73:02
endlessly and then gradually one day
73:04
it’s too late and we haven’t faced
73:06
reality and that and we’re we no longer
73:10
have agriculture we no longer have our
73:12
coastal cities we no longer have a world
73:15
that we can survive in. And that is, you
73:21
know, kind of what I worry about:
73:23
a terribly stupid, cranky undoing
73:27
for us, not a big dramatic one;
73:30
it’s neither a whimper nor a bang but
73:33
just sort of a cranky rant that could be
73:38
our end and is that a laugh line I don’t
73:44
know you guys are pretty dark anyway so
73:51
what to do about it
73:53
so here there my characterization of the
73:57
problem overlaps strongly with a lot of
74:00
other people’s characterizations of the
74:02
problem mine is perhaps not identical to
74:06
the problem as described by many others
74:08
but there’s enough overlap that I think we
74:11
have a shared, we meaning many people who
74:13
hope to change things, have a shared sense of
74:15
what’s gone wrong. Now, the first thing I
74:17
want to say in terms of optimism is: wow,
74:19
is that better than how things used to be. If
74:21
I had been giving this talk even a few
74:24
years ago not long at all ago I would
74:28
have been giving the talk as a really
74:30
radical fringe figure who was saying
74:31
things that almost nobody accepted who
74:33
had lost friends over these ideas and
74:36
who was really kind of surviving on the
74:40
basis of my technical abilities in my
74:42
past rather than what I was saying
74:44
presently because it was so unpopular
74:45
In the last while, especially since, I would say,
74:48
like Brexit, Trump, but also just in
74:51
general, like studies showing the
74:52
horrible increase in suicides in teen
74:55
girls that scales with their social
74:58
media use, all these horrible things that
74:59
have come out. Now, something that’s
75:02
really different in Silicon Valley there
75:05
are genuinely substantial movements
75:07
among the people the companies to try to
75:09
change their act regulators at least in
75:12
Europe are starting to get teeth and
75:13
really look at it seriously the tech
75:17
companies are trying to find a way to
75:19
get out of the manipulation game they
75:22
haven’t necessarily succeeded and not
75:24
all of them are trying but some of them
75:26
are
75:27
and it’s a different world it’s a world
75:29
with a lot of people who are engaged so
75:31
now having presented the problem as
75:34
I see it it’s possible to talk about the
75:38
solution now a lot of folks feel the
75:41
solution should be privacy rights the
75:44
European regulators are really into that
75:46
we had a major conference on that in
75:48
Brussels last week where Tim Cook who
75:52
runs Apple gave a fire-breathing talk
75:53
that kind of sounded like a talk I might
75:55
have given at some point in the past I
75:57
gave a talk there too and I was like wow
76:01
I’m not the radical anymore, it’s very
76:02
straight in a way. In a way I kind of
76:04
mourn the loss of radicalness because
76:08
some part of me likes being the person
76:10
like at this outer edge and I’m not and
76:12
it’s kind of like oh god I’m supposed to
76:14
be the radical but anyway I am I think
76:20
it’s great that the Europeans are
76:21
pushing for privacy the theory behind
76:24
that is that the more the harder it is
76:29
for the manipulation machine to get at
76:31
your data the less it can manipulate and
76:33
the more maybe there’s a chance for
76:36
sanity there’s a peculiar race going on
76:39
because the societies in Europe that
76:42
support regulation and have
76:45
regulators with teeth which we really
76:47
don’t have much of in the u.s. right now
76:48
are themselves under siege by these the
76:51
the cranky political parties who are
76:54
sometimes called right-wing populist but
76:56
I think should be just called you know
76:59
the crank parties and the the cranky
77:02
parties might bring these societies down
77:04
so there’s a race can the regulator’s
77:07
influence the technology in time to
77:09
preserve themselves or will the
77:11
technology destroy their politics before
77:13
they have a chance it’s a really so
77:16
that’s a race going on right now it’s
77:17
quite dramatic and I wouldn’t know how
77:19
to handicap it now the privacy approach
77:23
is hard because these systems are
77:27
complicated. Like, if I say, okay, here,
77:30
click on this button to consent to using
77:32
your data for this, like, even I obviously
77:34
can’t read the whole
77:36
thing, and even if there’s some kind of
77:38
better regulation supporting it it’s
77:40
just nobody understands that even the
77:42
companies themselves don’t understand
77:43
their own data they don’t understand
77:44
their own security they don’t I mean
77:46
like this whole thing is beyond all of
77:48
us nobody’s nobody’s really doing it
77:50
that well everybody’s having data
77:52
breaches and discovering suddenly that
77:54
they were using data they didn’t think
77:56
they were using that’s happened
77:57
repeatedly at Google and Facebook in
77:59
particular so I’ve advocated a different
78:02
approach which is instead of using
78:08
regulators to talk about privacy get
78:11
lawyers and accountants to talk about
78:13
lost value from your data being stolen
78:15
now I have several reasons for that one
78:19
is I don’t think we’ll ever lose our
78:22
accountants and our lawyers I think
78:24
they’re more persistent than our
78:26
regulators that’s one reason and I’m not
78:30
going to do lawyer jokes, because of how
78:34
mean-spirited society’s become;
78:35
I don’t like to make jokes
78:37
about classes of people, even lawyers,
78:38
anymore, but, you know, some of my best friends
78:43
really are, you know them... but anyway, let
78:52
me give you an example that I like to
78:55
use to explain the economic approach
78:57
here there’s a tool online that I happen
79:01
to use frequently that I really like
79:03
which is automatic translation between
79:05
languages if you want to look at a
79:06
website in another language or send
79:08
somebody now you can go online and there
79:10
at least two companies that do this
79:12
pretty well now, Microsoft and Google: you can
79:14
enter your text in one language, a usable
79:16
translation comes out on the other side
79:18
convenient free great modernity however
79:24
here’s an interesting thing it turns out
79:28
that languages are alive every single
79:30
day there’s a whole world of public
79:31
events all of a sudden today I have to
79:35
be able to talk about the Tree of Life
79:36
shooter and you have to know what I mean
79:38
all of a sudden today I have to be able
79:40
to talk about the MAGA bomb mailer, and you need
79:42
to know what I mean. So every single day
79:44
there all of these new reference points
79:46
that come out lately often horrible ones
79:48
sometimes nicer
79:49
ones, maybe a new music video or a new
79:53
meme that people like whatever so every
79:56
single day those of us who help maintain
79:59
such systems have to scrape meaning
80:02
steal tens of millions of example phrase
80:04
translations from people who don’t know
80:06
it’s being done to them so there are
80:08
tens of millions of people who are kind
80:10
of tricked into somehow translating this
80:12
phrase or that phrase in Google and
80:14
Microsoft have to grab these things and
80:15
incorporate them to update their systems
80:17
to make them work but at the same time
80:20
the people who are good at translating
80:22
are losing their jobs
80:24
the career prospects for a typical
80:27
language translator have been decimated
80:30
meaning they’re a tenth of what they were,
80:31
following exactly the pattern of other
80:34
information based work that’s been
80:36
destroyed by the everything must be free
80:39
movement recording musicians
80:41
investigative journalists crucially
80:43
photographers all of these people are
80:46
looking at about a tenth of the career
80:48
prospects that they used to have that’s
80:50
not to say that everything’s bleak;
80:51
there are examples in each case of
80:54
a few people who find their way and this
80:57
gets to a very interesting technical
80:58
discussion, which I won’t go into here, but you get
81:01
a Zipf curve, where there are a few
81:03
successful people and then it falls to
81:04
nothing, whereas before you
81:06
had a bell curve; if anybody
81:08
wants to know more about that I can explain. But
81:09
anyway, you have a tiny number of
81:11
successful people, but almost everybody
81:13
has lost their careers (a rough sketch of that contrast follows below).
81:15
Now, wouldn’t it make more sense if, instead of making
81:20
money by providing free translations in
81:22
order to get other people who are called
81:24
advertisers to manipulate the people who
81:26
need the translations in some sneaky way
81:28
that they don’t understand and make the
81:31
whole world more cranky and less reality
81:32
oriented instead of that what if we went
81:37
to the people providing this phrase
81:39
translations and we just told them you
81:41
know if you could just give us the
81:42
phrase translations we really need then
81:45
our system would work better and we’d
81:47
pay you because then we’d have a better
81:48
system and then if we went to the people
81:50
who need translations and say free isn’t
81:53
really quite working because that way we
81:55
that means we have to get these other
81:56
people to manipulate you to have a
81:58
customer but we’ll make it really cheap
82:00
what about a dime
82:01
a translation, or something like that? We
82:03
worked out some kind of a system where
82:05
the people who provide the translations
82:07
meet each other because it’s a network
82:08
we can introduce them they form a union
82:11
they collectively bargain with us for a
82:13
reasonable rate so that they can all
82:15
live put their kids through school and
82:17
then we get better working translators
82:20
and yeah you pay a dime you can afford a
82:22
dime, and everybody’s happier.
82:24
Now, there are a few things about this
82:26
that are really good in my point of view
82:28
one is we no longer have these people
82:30
from the side paying to manipulate
82:32
people, everything’s become clear. Two: we
82:35
have a whole class of people making a
82:36
living instead of needing to go on the
82:38
dole instead of saying oh we need this
82:40
basic income because everybody is
82:41
worthless. Three: we’re being honest instead
82:44
of lying which is a really big deal
82:46
right now we have to lie because we’re
82:49
not telling the people that were taking
82:50
their data we’re telling them oh you’re
82:52
buggy whips you’re worthless
82:53
but in secret we need you that’s a lie
82:56
and four: there’s kind of a spiritual
82:59
thing here where we’re telling people
83:01
honestly when they’re still needed like
83:03
to tell people oh actually you’re
83:05
obsolete the robots taken over your job
83:08
when it’s not true when we still need
83:10
their data there’s something very cruel
83:12
about that it cuts to some sort of issue
83:14
of dignity and human Worth and it really
83:16
bothers me so for all these reasons this
83:18
seems like a better system to me and
83:21
sure we’d have to make accommodations
83:22
for those who can’t afford whatever the
83:24
rate would be for the language
83:25
translation but we can do that we’ve
83:27
almost figured out ways to do that if
83:28
we’re a decent society and we’d be a
83:30
more decent society because we wouldn’t
83:32
have an economy that’s strictly run on
83:35
making people into assholes so so that’s
83:41
why I advocate the economic approach so
83:44
I know it’s bad form but it can I refer
83:47
you to a paper to read go look up
83:50
something called blueprint for a better
83:52
digital society I’m sorry about the
83:54
title I didn’t make it up it’s an
83:56
editor’s fault Adi Ignatius it’s your
83:59
fault, Adi. It’s in Harvard Business
84:01
Review recently; you can find it online
84:02
very easily blueprint for better digital
84:05
society and it’s the latest version
84:07
about how to make this thing work and a
84:09
little bit about how to transition to it
84:13
so so that’s the solution I’ve been
84:16
exploring and promoting I think there’s
84:19
room for a lot of solutions another idea
84:22
is people like the Center for Humane
84:25
technology which is Tristan Harris in
84:27
another group called Common Sense Media
84:28
are trying to educate individuals about
84:31
how to be more aware of how they are
84:33
manipulated and how to make slight
84:35
adjustments to be manipulated a little
84:37
less worth trying remember it’s a sneaky
84:42
machine the whole industry is based on
84:43
fooling you so staying ahead of it is
84:45
gonna be work you can’t just do it once
84:47
and think you’re done it’ll be a
84:48
lifetime effort that’s why I think you
84:49
should just quit the things yeah when I
84:53
say can you please delete all your
84:55
social media accounts surely one of the
84:58
first thoughts and all your minds is
85:00
well that’s ridiculous I mean you’re not
85:02
going to get billions of people to
85:04
suddenly drop these things there’s
85:06
there’s two reasons why you’re correct
85:09
if you have that that that thought one
85:14
is that you’re addicted this is an
85:16
actual addiction you can’t just go to
85:18
somebody with a gambling addiction and
85:19
say, oh, just stop, you know, any more than
85:21
you can do that if they have a heroin
85:23
addiction that’s not how addiction works
85:25
you can’t just say no, it’s a problem, it’s
85:27
hard
85:27
addiction is hard all of us have
85:29
addictions none of us are perfect but
85:31
this particular ones destroying our
85:33
future it’s really bad it’s not just
85:34
personal we hurt each other with this
85:36
one in an exceptional way so another
85:41
reason is network effect and that means
85:43
everybody already has like all their
85:45
pictures and all their past and all
85:47
their stuff on these properties that
85:49
belong to companies like Facebook and
85:50
for everybody to get off it all at once
85:53
they can continue to have connections
85:54
with each other is a coordination
85:56
problem that’s essentially impossible at
85:57
scale so that’s that’s a network effect
86:00
problem so why am I asking people to do
86:03
something that can only happen a little
86:05
and the reason why is even if it only
86:08
happens a little it’s incredibly
86:09
important so let me let me draw a
86:12
metaphor to some things that have
86:14
happened in the past we have in the past
86:17
had mass addictions that were tied to
86:24
corrupt
86:25
commercial motives at a large scale. One
86:28
example is the cigarette industry
86:32
another example is big alcohol alcoholic
86:36
beverages I could mention others lead
86:39
paint is when I bring up in the book now
86:41
in these cases well actually the lead
86:44
paint wasn’t an addiction thing, so I’ll
86:45
leave out lead paint. So let’s
86:47
just talk about cigarettes and in the
86:50
case of cigarettes when I was growing up
86:52
it was almost impossible to challenge
86:55
cigarettes
86:56
I you know like cigarettes were manly
86:59
they were cool if you were on the red
87:02
side of America they were the cowboy
87:03
thing if you were on the blue side they
87:06
were the cool beatnik thing everybody
87:07
had a cigarette and you just couldn’t be
87:10
cool without your cigarette but enough
87:12
people finally realize that they could
87:14
get out from under it that at least it
87:16
allowed a conversation the addict will
87:19
defend it if you talk to somebody who’s
87:21
really addicted to cigarettes it’s very
87:22
hard for them to really get a clear view
87:25
of what the cigarette means to society
87:27
what it means to have cigarette in
87:28
public spaces there was a time in this
87:32
room would have been filled with
87:33
cigarette smoke and we would have been
87:35
gradually killing the students who were
87:36
attending I think I’m coughing in
87:42
sympathy with remembering what that was
87:43
like because it was really horrible
87:48
alcohol Mothers Against Drunk Driving
87:50
was or drunk drivers I forgot which it
87:53
is has been one of the most effective
87:55
political organizations they changed
87:57
laws they changed awareness they changed
87:59
outcomes and saved an enormous number of
88:01
lives despite the fact that once again
88:04
alcohol is cool it’s supposed to be cool
88:06
to drink at a frat party it supposed to
88:07
be cool to drink at your fancy
88:09
restaurant everybody loves drinking and
88:11
there’s this whole world event of
88:13
advertising liquor we found a reasonable
88:17
compromise in both cases we don’t throw
88:20
people who drink or smoke cigarettes in
88:23
jail like we’ve done for marijuana for
88:25
years instead we came up with a
88:28
reasonable policy don’t do it in public
88:30
don’t do it behind the wheel it worked
88:32
that was only possible because we had
88:36
enough people who were outside of the
88:38
addiction system to
88:39
have a conversation. In this case we
88:42
don’t have that in this case all the
88:44
journalists who should be helping us are
88:46
addicted to Twitter and making fools of
88:47
themselves if you’re a journalist in
88:50
this room you know I’m telling the truth
88:54
the same for politicians same for public
88:57
figures celebrities who might be helpful
88:59
we need to create just a space to have a
89:04
conversation outside of the addiction
89:06
system now you might be thinking oh my
89:08
god I’ll destroy my life if I’m not on
89:10
these things I don’t think it’s true I
89:13
think if you actually drop these things
89:14
you suddenly discover you can have any
89:15
life you want I’m not claiming that I’m
89:18
the most successful writer or public
89:21
speaker but I’m pretty successful I have
89:22
best-selling books I get around I you
89:25
know you hired me to come talk to you
89:29
and I’ve never had an account on any of
89:31
these things and you could say oh but
89:33
you’re an exception in this way well I
89:34
mean, how much of an exception can I be?
89:37
Play any points against me, I’m like this
89:39
weirdo, and I still... no, seriously,
89:42
you know, I mean, I still can do it, and if I
89:44
can do it probably other people can do
89:45
it too
89:46
I just don’t I think that there’s this
89:48
illusion that your whole life like
89:50
they’ll be you’ll just be erased if
89:52
you’re not on these things but that
89:53
illusion is exactly part of the problem
89:55
that’s that’s exactly part of this weird
89:59
existential insecure need for attention
90:05
at any cost bizarre personality
90:08
dysfunction that’s destroying us just
90:10
give it a rest
90:11
now here’s what I would say there was a
90:13
time it’s when if you were young
90:17
especially one of the priorities that
90:20
you felt in your life was to know
90:21
yourself and the only way to know
90:23
yourself is to test yourself and the way
90:25
you test yourself is maybe you’d go
90:26
trekking in the Himalayas or something I
90:30
used to hitchhike into central Mexico
90:32
when I was really young just a really
90:34
young teenager and that’s how I tested
90:36
myself these days I think the the
90:39
similar idea would be quitting your
90:41
social media and really deleting it like
90:43
you can’t you can’t like quit Facebook
90:45
and keep Instagram that’s you
90:46
have to actually delete the whole
90:48
thing
90:49
and and then it doesn’t mean you’re
90:52
doing it for your whole life
90:53
delete everything and then stay off
90:56
stuff for six months okay if you’re
90:59
young you can afford it it will not kill
91:01
you and then after six months you will
91:03
have learned and then you make a
91:05
decision in my opinion you should not
91:08
harm your life for the sake of the ideas
91:11
I’ve talked about today if it’s really
91:13
true that your career will be better or
91:15
whatever through using these things then
91:18
you need to follow your truth and do
91:21
what makes you succeed and if it’s
91:22
really true that being a serf to some
91:24
stupid Silicon Valley giant is the thing
91:27
that helps your career okay but you have
91:32
to be the one making that decision and
91:33
if you haven’t tested yourself you don’t
91:36
have standing to even know so I’m not
91:40
telling you what’s right for you but I
91:41
demand that you discover what’s right
91:44
for you that I think is a fair demand
91:47
given the stakes and with that cheerful
91:51
closing I will call it
91:53
[Applause]
92:02
[Music]
92:06
so we have is that is a mic on so do we
92:10
have the question set people well we
92:14
were gonna have cards I don’t know if
92:15
any cards have made their way here’s a
92:17
card cards okay
92:19
I’m actually an unclear on how this
92:22
whole thing works okay well this is it
92:28
alright so normally I would get a bunch
92:32
of cards but but but I haven’t that
92:34
hasn’t happened yet okay lately I’ve
92:36
noticed that I was getting progressively
92:38
more cranky that’s now a technical term
92:41
I think you’ve introduced along with a
92:43
virtual-reality cranky from a lack of
92:46
sleep because of the excessive blue
92:48
light given off by screens ah have you
92:51
factored this effect into your theory oh
92:55
yeah
92:56
well there’s the time and stuff like
92:57
that there’s more as soon as soon as I
93:02
put blue filters on my screens I got a
93:03
lot less cranky okay there the the
93:08
problem of blue light keeping you up and
93:11
those are all real problems and in fact
93:13
you might want to just turn colour off
93:15
on your computer definitely turn colour
93:17
off on your phone. In all seriousness,
93:18
you don’t need it
93:19
for most things. I have color, I
93:23
use a phone, but I definitely turn color off
93:24
and, like, make those changes, if you know,
93:28
if you notice something like that yeah
93:32
you can right you can turn off the blue
93:38
light on your computer you can go into a
93:40
setting and you know the best way to do
93:43
it go to the visual accessibility
93:45
settings because they have these high
93:46
contrast settings for people who have
93:49
trouble focusing and they get rid of
93:50
color and become stark contrast, as an
93:54
example oh for God’s sakes I have to
94:00
enter this – I was going to show you
94:01
what it looks like but I’m not going to
94:02
bother with a code anyway you just you
94:04
can do it every major platform has this
94:06
ability it’s really that and you should
94:08
do it. Go to commonsensemedia.org or to the
94:15
Center for Humane Technology’s website,
94:17
and both of them have advice on how to
94:21
do things like this and another thing is
94:23
both I’m pretty sure both Windows and
94:26
Mac if it’s a computer have ways to make
94:28
the blue light go away as the evening
94:29
approaches there’s like this kind of
94:31
stuff this is real stuff and you should
94:33
pay attention to it and the technology
94:35
should serve you and not drive you crazy
94:36
but I do have to say this is not an
94:38
existential threat this is this is at
94:41
the level of too much sugar and
94:42
breakfast cereals or something like that
94:43
it is actually a real issue it’s it’s it
94:46
does have an effect on the health of the
94:48
population but it’s not going to destroy
94:49
us this other stuff I’m talking about is
94:51
at another level okay
94:53
so rather than the cards do we have
94:57
cards we do okay great let’s give some
95:02
cards is that your card oh okay great
95:07
okay here’s my card isn’t there a design
95:09
problem for publishing online if you
95:11
know who’s pointing at you how is that
95:14
related to the problem Alan Turing
95:16
faced? Turing, it says Turing, he hatched
95:20
the concept of a machine-like
95:22
personality; isn’t that to software what
95:25
listening and compassion is to human
95:27
communication? Yeah, it’s a kind of
95:30
interesting question to me when I read
95:34
Turing’s final notes that the Turing
95:37
test comes up twice it comes up in a
95:39
little monograph he wrote and it comes
95:40
up in a sort of a little note there’s
95:43
two statements of it and in both of them
95:45
to me reading them there’s just this
95:49
profound sadness I feel like this is
95:51
this person who’s just screaming out so
95:55
some of you might I don’t know there’s a
95:56
whole history to this thing. What
95:58
Turing did is he created a metaphor. Oh boy,
96:02
let me try to do this as fast as I can
96:04
Turing did as much as anybody to defeat
96:07
the Nazis in World War Two by
96:10
using one of the first computers that
96:11
ever existed to break a Nazi secret code
96:13
called enigma and he he was considered a
96:18
great war hero however he lived an
96:22
identity that was illegal at that time
96:24
which is that he was gay
96:25
and he was forced by the British
96:27
government after the war to accept a
96:30
bizarre crack treatment for being gay
96:32
which was to overdose on female sexual
96:35
hormones with this bizarre idea that
96:37
female hormones would balance his over
96:39
sexiness which was supposed to be the
96:41
gay it’s like so stupid it’s hard to
96:43
even repeat it and he started developing
96:46
female physiological characteristics as
96:48
a result of that treatment and it he
96:53
committed suicide by a sort of a weird
96:56
poetical thing where he laced an apple
96:58
with cyanide and ate it next to the
96:59
first computer, sort of an anti-Eve or
97:03
something and he was a very brilliant
97:05
and poetic man and in the final couple
97:08
of weeks of his life he came up with
97:10
this idea of repurposing an old
97:14
Victorian parlor game that used to be
97:18
this thing we’d have a man and a woman
97:21
behind a curtain or a screen of some
97:26
kind and all they could do is pass
97:30
little messages to a judge and the judge
97:32
would have to tell who’s the man and
97:33
who’s the woman and each of them might
97:35
be trying to fool the judge which is
97:36
kind of a weird if you think about it
97:38
the Victorians were pretty kinky and
97:40
bizarre and and so what you’re doing is
97:45
as with behaviorism and as for the
97:47
internet you’re slicing away all of
97:48
these factors and just turning it into
97:50
like this limited stream of information
97:51
so it’s kind of like tweeting or
97:53
something and that so what Turing said
97:56
is what if you got rid of the woman and
97:58
you had a man in a computer and the
98:00
judge couldn’t tell them apart wouldn’t
98:03
then finally you have to admit that the
98:05
computer should be given rights and given
98:07
stature and be treated as such? And when you
98:10
read it I don’t the way I read it is
98:13
it’s this person saying oh my god I
98:14
figured out how to save the world from
98:16
these people who wanted to destroy
98:18
everybody based on being of the wrong
98:19
identity these people who wanted to kill
98:22
not only gays but of course Jews and
98:24
Gypsies and and and black people and
98:27
these horrible people and I came up with
98:30
this way of defeating them and now
98:31
you’re destroying me for who I am
98:33
and I feel like there’s this kind of
98:37
astonishing sadness in it, in the way
98:40
it turns. And so that was the
98:43
birth of the idea of artificial
98:44
intelligence and I feel like the way
98:46
it’s remembered is completely unlike
98:48
what it’s like to read the original you
98:50
know I feel like if you look at the have
98:51
you ever read the original Turing
98:53
because if you read the original Turing
98:54
I mean it’s like it’s intense you know
98:58
here’s this person who’s being tortured
98:59
to death it’s like it’s not some kind of
99:02
nerdy thing at all it’s it’s a it’s a
99:05
difficult it’s difficult to read the
99:07
documents and I think it was like this
99:12
crazy I think he knew he was about to
99:15
die and I think he was reaching out for
99:17
some sort of a fantasy of what kind of a
99:20
thing what would it take for people to
99:23
not be cruel what would it take and I
99:26
think in this very dark moment he
99:28
thought maybe giving up humanity
99:30
entirely and we’ll just maybe if we’re
99:32
just machines maybe we won’t do this to
99:35
ourselves and the thing about that of
99:37
course is we’ve turned ourselves sort of
99:40
into machines because we’ve all kind of
99:42
acting like machines to be able to use
99:44
this stuff you’re all sitting there all
99:45
day entering your like little codes to
99:47
get online that you’re sort of turning
99:49
into machines in practice and yet we’ve
99:51
just become more and more cruel, like,
99:53
that’s the ultimate irony, is
99:55
that it didn’t help so that’s my take on
99:58
it and this idea that AI is some could
100:02
be some form of compassion I think it’s
100:05
kind of I think it’s really
100:06
just a way of stealing data from people
100:07
who should be paid to translate AI is
100:11
theft
100:11
to paraphrase. Anyway, okay, AI is
100:23
just a way look all all we can do with
100:27
computers ever look to be a good
100:30
technologist you have to believe that
100:33
people are sort of mystically better
100:35
than machines otherwise you end up with
100:38
gobbledygook and nonsense you can’t
100:40
design for machines so AI has to be
100:43
understood as a channel for taking data
100:45
from one person to help another
100:47
I take I take the data from the
100:50
translators and I apply it through a
100:52
machine learning scheme or some kind of
100:54
scheme and I can get translations that
100:56
help people in a better way than I could
100:59
without that scheme in between which is
101:01
wonderful
101:02
so it’s technology to help people
101:04
connect in a way that’s more helpful if
101:05
you understand AI that way you elevate
101:08
people and you don’t confuse yourself
101:11
okay yeah we we we don’t have a lot of
101:13
time and we have a lot of great
101:15
questions so some questions are not
101:16
going to be able to be answered now
101:19
although I want to mention that there
101:21
will be a book signing and book
101:23
purchasing outside after the event is
101:27
over there’ll be two tables please if
101:29
you want to ask me long questions, you can’t
101:31
go up through it, come up to me and ask,
101:32
like, some open-ended giant question while I’m
101:34
signing your book; that would take too
101:35
long, you really can’t do that. Buy
101:37
a book; the people who are selling the books
101:39
have asked that you buy a book first
101:41
before you have it signed. And on that note
101:48
I’ll segue into there’s a couple
101:49
questions that are connected to this how
101:51
about your market solution arguably the
101:55
mess we’re in now comes from the
101:56
monopolistic and manipulative tendencies
101:58
inherent in markets given that the world
102:01
has never known pure markets, what
102:03
would keep this one pure oh it’s not
102:05
going to be pure it’s going to be
102:07
annoying and unfair and horrible but the
102:09
thing about it is it won’t be
102:10
existentially horrible
102:12
The thing about markets, so, what I
102:15
believe about economic philosophies is
102:17
there’s never been one that’s worked out
102:19
in practice, and instead, just as with
102:21
moral philosophies and theories of how
102:24
we learn and many many other areas where
102:26
we’re trying to deal with very complex
102:27
systems it’s not so much that we can
102:29
seek the perfect answer but we have to
102:31
trade-off between partial answers so to
102:33
me there’s never been a pure market
102:37
there’s never been and I don’t think
102:39
there ever could be but I think what you
102:41
can do is you can get a balance this was
102:43
the Keynesian approach to economics, which
102:45
I think is very wise: you
102:48
get a balance between reasonable
102:50
oversight and a reasonably
102:53
unfettered market, and they’ll go through
102:54
cycles where the market will need help, and
102:56
you just trade off, you trade,
102:58
and I think that that’s the only
103:01
path we have. I think
103:04
being an ideologue for any solution to a
103:07
highly complicated problem is always
103:08
wrong okay just two more then and
103:11
there’s a couple like this as well
103:14
what about the connective force of
103:15
social media, e.g. for the feminist
103:18
movement, like Me Too? These online
103:21
communities raise awareness and create
103:23
supportive communities and then many
103:25
people who rely on social media for
103:27
community because of the demands of
103:28
capitalist jobs yeah yeah it’s just
103:32
that’s all true except that backfires
103:34
and the backfire is worse than the
103:36
original so like what happened the it
103:40
just keeps on happening I mean like
103:41
before me too there were there was a
103:44
problem of diversity in the gaming world
103:47
and a few women in gaming just wanted to
103:51
be able to say one or two things and not
103:52
be totally invisible and and then the
103:54
result of that was this ferocious thing
103:56
called gamergate that was just this
103:58
total never shut up totally wipe
104:01
everybody else out totally make it
104:02
everything horrible movement and then me
104:05
too has spawned this this other thing
104:09
that’s still rising, which is the
104:10
incels and the Proud Boys and all this
104:12
stuff and the problem is that in these
104:15
open systems at first your experience of
104:19
finding mutual support and creating
104:21
social change is authentic, it’s real, it’s
104:23
just that there’s this machine you’re
104:25
not thinking about behind the scenes
104:26
that’s using the fuel you’re providing
104:29
in the form of the data to irritate
104:31
these other people because it gets even
104:33
more of a rise from them and you’re
104:35
creating this other thing that’s even
104:36
more powerful that’s horrible
104:38
even though it wasn’t your intent and
104:39
that’s the thing that keeps on happening
104:40
over and over again it doesn’t
104:42
invalidate the validity of the good
104:44
stuff that happens first it’s just that
104:46
it always backfires well not always but
104:48
typically and you end up you end up
104:53
being slammed and you don’t even like
104:55
one of the things that’s really bad
104:56
about it is that it’s you know it seems
104:59
like it’s just the fault of the creeps
105:01
who come up where it’s actually kind of
105:02
more the fault of the algorithms that
105:04
introduced the creeps to each other and
105:05
then got them excited in this endless
105:07
cycle of using your good intentions to
105:09
irritate the worst people so I mean
105:12
I know, the thing is, you know,
105:14
Black Lives Matter was great, I think, I
105:16
mean I think it’s wonderful and yet the
105:18
reaction to it was horrible and of a
105:21
higher magnitude and I just think we
105:23
have to find unfortunately until we can
105:26
get rid of the advertising model and the
105:28
giant manipulation machine every time
105:30
you use the big platforms for any kind
105:33
of positive social effect it’ll backfire
105:35
and destroy you and it’s it’s a fool’s
105:38
game even though it’s valid at first in
105:41
the long term it’s a fool’s game okay
105:44
this I don’t like saying that I hate
105:45
saying that it breaks my heart this is
105:47
the last question and it’s existential
105:49
one. I’ll combine the two questions
105:53
here seeing how pernicious social media
105:55
has become by being hijacked toward
105:58
BUMMER, and you know BUMMER is another
106:00
technical term you’re using. Yeah, there’s a
106:04
wonderful writer on cyber things I’m
106:06
sherry Turkle and she read my book and
106:09
she said oh I love this book but there’s
106:10
just too much touching it
106:11
and the thing because it there’s like
106:13
bummer and there’s a cat’s behind on the
106:15
cover stuff and I the problem is I
106:19
married a woman who likes butt jokes and
106:21
I just can’t I don’t know some of they
106:23
just come I don’t know anyway okay so
106:29
how to how to guard against an immersive
106:32
technology like virtual reality becoming
106:35
even more insidiously bummer and then
106:38
how do you know what is real okay oh
106:41
well all right those are small questions
106:43
so the first one I mean I think the way
106:47
to keep, for virtual reality... virtual reality could
106:49
be super hyper creepy. I wrote a book
106:52
about virtual reality that we haven’t mentioned,
106:53
it’s called dawn of the new everything I
106:55
don’t know if they’ll have it upfront or
106:56
not but I talked a lot about that issue
106:58
so virtual reality could potentially be
107:00
creepy I think the way to tell whether
107:03
something’s getting creepy is whether
107:04
there’s a business model for creepiness
107:06
so if the way it’s making money is that
107:08
there’s somebody to the side who thinks
107:10
they can sneakily alter you or
107:11
manipulate you that’s the creepy engine
107:14
if there isn’t that person and if there
107:15
isn’t that business going on it’s less
107:18
likely to be creepy I think this is
107:19
actually that’s actually a pretty simple
107:21
question to answer I think it boils down
107:23
to incentives I think incentives run
107:25
the world as much or more than anything else,
107:27
as far as this question of how to be how
107:29
do you know what’s real
107:30
the answer is imperfect what you do is
107:34
you struggle for it you struggle to do
107:37
scientific experiments to publish you
107:39
have to always recognize you can fool
107:40
yourself you have to recognize that
107:43
whole communities of people can fool
107:44
themselves and you just struggle and
107:45
struggle and struggle and you gradually
107:47
start to form a little island in a sea
107:50
of mystery in which you never have total
107:54
confidence but you start to have a
107:56
little confidence so there’s some things
107:58
that we can be confident of now the
108:01
earth is round not on line but do we
108:06
know it in an absolute absolute sense no
108:09
you can never know reality absolutely
108:11
but you can know it pretty well and so
108:13
in order to talk about reality you have
108:16
to be, you have to get used to near
108:21
perfection that is never actual
108:24
perfection and if you’re not comfortable
108:26
with that concept you have no hope of
108:27
getting to reality because that’s the
108:29
nature of reality reality is not
108:31
something you ever know absolutely and
108:32
in fact just to be clear I in one of my
108:35
books I defined reality as the thing
108:37
that can never be
108:38
measured exactly it’s the thing that can
108:40
never be simulated accurately it’s the
108:42
thing that can never be described to
108:44
perfection that is reality but at this
108:46
because the simulation can be described
108:48
to perfection I can describe to you a
108:50
video game world or a virtual world to
108:52
perfection I can’t do that with reality
108:54
and the the thing is though that we
108:59
can’t demand absolute knowledge in order
109:01
to have any knowledge at all or else we
109:03
make ourselves into genuine fools we
109:05
have to be able to accept that we can
109:07
have better knowledge than other
109:08
knowledge it’s all an incremental sort
109:11
of eternal improvement project so the
109:16
people who demand absolute proof of
109:18
climate change are fools, but they're
109:21
interesting like I mean some of you
109:23
might have read there was a good history
109:25
published this week about the history of
109:28
the reading wars about how we learn
109:30
reading and there’s this community of
109:32
people who’ve just been absolutely
109:34
unable to accept a load of scientific
109:37
evidence about how to teach
109:38
kids to read effectively because of an
109:40
ideology, and they're sincere, and it's
109:43
like, people, it's really really hard
109:45
accepting reality as your life's work
109:47
it’s really really really hard it’s it’s
109:50
not it doesn’t come naturally
109:52
necessarily it’s a discipline thank you
109:55
all right
109:56
[Applause]
110:03
[Music]
110:04
[Applause]

Joe Rogan Experience #1107 – Sam Harris & Maajid Nawaz

26:52
say so he had made moves in this debate
that I considered intellectually
dishonest and and I mean he because he’s
playing a game and this is not a real
conversation
this is a formal academic
style debate where you know his job is
not to leave his view open to influence
by the other
side's arguments; he's making a
case, and I didn't know it at the time
but he felt unnaturally constrained by
the format of the debate he had to argue
that Islam is a religion of peace and
some of the moves he made there I
thought were dishonest, and so I said, Maajid,
I remember this more or less
verbatim because we've talked about it anyway
and since transcribed it into a book, but I
said, Maajid, you know, everyone in this
room recognizes that you have the
hardest job in the world and we’re all
very glad that you’re doing it you have
to somehow convince the next generation
of Muslims that Islam really is a
religion of peace and the jihad is just
an inner spiritual struggle
and that the
martyrs don’t get 72 virgins in paradise
and all the rest and so my question for
you is is this do you really believe
that this is the case now or do you do
you think that pretending that is that
is the case is the method by which you
will make it the case that if you just
pretend long enough and hard enough
it’ll become so
and the extra line here
was and can you just be honest with us
my final sentence was, and you
know, you know, we're not, we're not
televised now, can you just be honest
with us here, and so I responded
immediately and said are you calling me
a liar and so now there’s like 70 we
have 70 people and I’m like into my
second gin and tonic and and and he’s
given me the the sort of you know
Middle Eastern stare down across the table, so
he repeats it, I said, no, no, I'm asking,
just here,
where there's no cameras, can you
just be honest with us and I said are
you calling me a liar and it didn’t go
too well at all the entire everyone on
the table kind of went quiet and and I
didn’t know who this guy was I never met
him and and I should have known who he
was and and then I think somebody very
tactfully changed the conversation and
just completely veered off this and I’d
never I never spoke to him again for
another, what was it, a couple of years, a
couple years, we never crossed paths
since. The reason I bring this up is
that I was one of those guys that didn’t
want to entertain a conversation with
Sam based upon the defensiveness
when it
came to this topic and and I think that
actually it’s important to say that to
people that because you asked him a
question about the Charles Murray
situation a lot of people rather than
actually wanting to engage with someone
on the substance of their ideas
that I
think in the climate we’re in today that
they’re engaging with people based upon
their on their feelings
and those
feelings are valid of course everyone
has the right to their feelings but
we’ve got to try as hard as we can to
detach those feelings from because
that’s clearly not what you know if the
principle of charity means you lend the
person that you’re speaking to the best
possible interpretation of what they’re
saying and and allow them to clarify
what they mean as opposed to you putting
into their mouths
what what they mean
and telling them what they mean I learnt
that you know because then two years
later he reaches out to me and he says I
think we can try again you know are you
willing to have a conversation with me
and and I hadn’t originally remembered
it was the same guy so that’s fine I got
my foot in the door just because he
30:12
didn’t know who I was and then we had
30:14
this conversation which it’s a lesson
30:16
for me because we had this conversation
30:17
it’s it’s it’s called Islam in the
30:19
future of tolerance it’s it’s become a
30:21
book right published by Harvard
30:22
University Press we had this
30:24
conversation that became a book that’s
30:25
been made into a film which I think any
30:27
couple of weeks now we'll hear some news on
30:28
that, yeah, I don't know when it's
30:30
coming out. With that, we did a
30:31
lecture tour of Australia and the people
30:34
who organized that made a documentary
30:36
that week, but therein lies the lesson,
30:39
to your question, and that is that I am
30:41
somebody that didn’t engage with him on
30:44
the substance of his question but
30:45
actually fired a misfire an emotional
30:49
misfire, on what he was really
30:51
questioning, on his motives for asking
30:54
the question
30:55
rather than actually addressing
30:56
addressing the points he was making and
30:57
I think that when I because I didn’t
30:59
remember who he was I then started the
31:02
conversation anew without the memory of
31:05
my original judgment on him hmm and the
31:07
conversation went really well
31:08
so we’ve got some he’ll be able to
31:10
divorce ourselves on that background
31:11
that can’t happen I mean it can be done
31:13
it’s just it takes people of strong
31:15
character to try to like abandon all
31:19
preconceived notions from the past
31:21
conversation just start fresh
31:22
yeah unfortunately this example of a
31:25
kind of a signal success has has caused
31:29
me to in the end kind of misspend a
31:33
lot of energy just assuming that’s it I
31:35
can’t keep thinking I keep walking into
31:37
another situation thinking this is
31:39
possible is that why you deleted Twitter
31:43
so you haven’t deleted your account
31:45
no I’m still on Twitter but I I will
31:47
based on this recent episode I is a damn
31:51
fascinated by people and their struggles
31:54
with social media with like detaching
31:57
from it reattaching from it getting
31:59
addicted to it I mean I know so many
32:01
people that will look at their Twitter
32:03
at like 1 o’clock in the morning before
32:05
they go to bed and something pisses them
32:07
off and then they can’t sleep yeah oh
32:08
yeah really common I was not I don’t
32:13
consider myself someone who had a a real
32:17
pathology with it I was you know I have
32:19
I don’t know
32:20
6,000 tweets or 7,000 tweets over the
32:23
course of many years so I’m not I was
32:26
not tweeting that much I was not even
32:28
looking that much I was I was fairly
32:31
disengaged and I’ve never used Facebook
32:33
as I’ve never I just used Facebook as
32:35
kind of a publishing channel I never
32:37
engaged with comments but I was looking
32:40
enough, and it was clearly
32:44
making me a worse person, I mean it was,
32:46
I was reacting to stuff that I
32:48
didn’t need to react to and it was
32:50
amplifying certain tendencies and
32:52
voices which need not have been
32:55
amplified and in this in this last case
32:57
it just turned a it just created a huge
33:02
kind of explosion in my life I was in
33:05
the middle of a vacation which I
33:06
basically torpedoed the
33:08
because of what I saw on Twitter and it
33:10
was just it was like the perfect
33:12
infomercial for why you don’t want to be
33:15
he told you our vacation how well so I’m
33:18
in the middle of it like the first
33:19
vacation taken with my family for in a
33:21
very long time was at least a year and
33:23
Wow and what you do so I you know we’re
33:26
on Hawaii and just like I’m supposed to
33:29
put everything down to be the best
33:30
father and husband I can be right and
33:32
that was my intention that’s what was
33:34
happening it happened for a good solid
33:37
24 hours and then I pick up my phone and
33:42
I see that that Reza Aslan and Glenn
33:45
Greenwald
33:46
and Ezra Klein had all attacked me in
33:48
the space of an hour oh no it goes out
33:51
to millions of people. Is this over that,
33:53
what was Ezra asking about, the
33:54
Charles Murray thing? Yeah, yeah. Well, the
33:55
truth is I can't even see what, I didn't
33:58
look at what Greenwald had done he was
34:00
circulating somebody’s video about me
34:03
how I’m I think I’m a racist in that
34:05
video Reza Aslan blocks me so I can’t
34:08
even see it, if he attacks me by name
34:10
he blocks me so I can't even see
34:12
what he said, but so I just saw the
34:15
aftermath of that you know lots of stuff
34:17
you know lots of notifications coming to
34:19
me with both of us tagged and then Ezra
34:22
published this message I suppose I
34:25
should back up however painfully to
34:27
describe what happened here but so I had
34:29
charles murray on my podcast a year ago
34:31
and charles murray is this this social
34:33
scientist who published the bell curve
34:36
back in the 90s which it was a a book
34:39
about IQ and and success in in western
34:43
societies like our own and it’s a book
34:46
where he worries a lot about the
34:48
cognitive stratification of society we
34:50
have a society that is selecting more
34:52
and more for a narrow band of talents
34:54
that are very well fairly well captured
34:57
by what we call IQ and there is a kind
35:00
of winner-take-all situation where
35:02
people are really you know 500 years ago
35:04
if you had a a very high IQ and you’re
35:08
just pushing a plough next to your
35:09
neighbor you had no real advantage but
35:12
now you can start a hedge fund or you
35:14
can start a software company and we’re
35:15
seeing the this this real shocking
35:19
disparity and
35:20
in good fortune really so he wrote this
35:25
book it had a chapter on race which
35:28
talked about the disparities between
35:30
racial groups, observed disparities,
35:35
right, and the claim about the source of
35:40
those disparities was by even the
35:42
standards of the time but certainly the
35:44
standards of today an incredibly tepid
35:48
mealy-mouthed just hand-waving it was
35:50
not this you know here comes the Third
35:53
Reich declaration of white supremacy it
35:56
was undoubtedly there are environmental
36:01
and genetic reasons for this and we
36:03
don’t understand them you know it was
36:05
like to think that is one or the other
36:08
we’re not in a position to know what the
36:10
mixes of influences now and that is
36:14
virtually any honest scientists take on
36:17
the matter and certainly today and it’s
36:22
only become more so but that went off
36:25
like a nuclear bomb I mean that was just
36:26
so that was such a, I mean, it's the
36:31
most, I saw it at the time, I never read the
36:35
book, I just thought this had to be just
36:36
racist, because Murray was vilified for
36:39
it, and he's been vilified ever since,
36:42
and ever since, you know, I'd ignored him,
36:44
and he's been deplatformed and was
36:46
assaulted recently, yeah, so that's what
36:47
happened so he went to Middlebury to
36:49
give a talk you know 20-some odd years
36:52
25 years after he wrote this book oh by
36:55
the way he’s also listed by the Southern
36:56
Poverty Law Center oh and so that that
36:58
that’s what contributed to the D
37:00
platforming and the violent protests
37:02
against him at Middlebury what’s crazy
37:04
is the whole thing is a propaganda for
37:06
the superiority of the Asian race and
37:08
everyone’s talking about white supremacy
37:12
Asians are the ones far and above I
37:14
mean that’s basically what his book
37:16
proved and you know they’re suing
37:18
Harvard now there’s a group of Asian
37:20
students that are suing Harvard because
37:21
they’re discriminated against because
37:23
they’re required to have higher scores
37:24
because they’re assumed to be smarter so
37:27
their standards for Asian students
37:29
entering into Harvard is higher than
37:31
white people
37:32
Wow, yes, well, Asian privilege is a big
37:34
problem yeah your grandfather was
37:37
working on the railroads in California
37:39
as an indentured servant and all that
37:42
privilege trickled down there’s
37:43
obviously a lot of factors that lead to
37:45
an IQ hierarchy, but to ignore what those
37:48
are, to ignore it completely, to dismiss
37:51
all of it, yes, exactly, only ideology, and
37:54
this idea that you cannot look at
37:55
statistics you cannot look at facts and
37:58
in your conversation with ezra’s charles
38:00
that sort of as a recline rather that’s
38:01
what I got is that this is this is an
38:04
ideological issue and that you you it’s
38:08
almost like an impossible subject to
38:10
breach like you can’t even discuss the
38:12
fact that certain races demonstrate low
38:16
IQ and then let’s look at what could be
38:19
the cause of those even discussing that
38:21
somehow or another is so inherently racist
38:23
that it must be ignored or must be
38:25
silenced and that you you must first
38:28
concentrate on all the various
38:30
injustices that have been done to those
38:31
people who have this lower IQ yeah well
38:34
let me just take a couple of minutes to
38:35
close the various doors to hell that are
38:37
now ajar based on what we’ve just said
38:40
on your holiday and you get it yeah so
38:43
we’ll just take a little more context so
38:45
yeah as you said Charles Murray went to
38:47
Middlebury College and was D platformed
38:48
and he was not only deplatformed, so the
38:50
usual deplatforming, with the students
38:52
turning their backs to the speaker and
38:53
shouting and not letting anything happen, but
38:56
the professor who invited him who was a
38:58
liberal professor who wanted to
39:00
essentially debate him she was attacked
39:01
when they’re leaving the hall they both
39:04
get physically attacked by a crowd of
39:07
students charles was was not hurt his
39:10
host this female professor got a
39:13
concussion and a neck injury that that
39:15
still persists and this is now more than
39:17
a year later so it’s like sure that she
39:18
was a registered devil arm by this no
39:21
doubt and and they’re driving out in an
39:23
SUV, and I mean someone pulls
39:25
a stop sign out of the sidewalk that
39:28
still has the concrete ball on the end
39:29
of it, and this SUV gets smashed
39:31
with this, you know, concrete-laden stop
39:33
sign I mean this was this is happening
39:35
at one of the most liberal privileged
39:38
colleges on earth it’s nuts so anyway
39:42
that was the thing that put Murray on my
39:44
radar after
39:45
all these many years of my ignoring him
39:46
and I had actually I felt guilty because
39:48
I had declined to be a part of at least
39:51
one project because his name was
39:53
attached right because I just thought
39:55
that this guy is radioactive he’s got
39:57
some white supremacist agenda I had
39:59
believed the the the the lies about him
40:02
and then I saw this I thought okay well
40:04
maybe he’s the canary in the coal mine
40:06
or certainly one of the Canaries in the
40:08
coal mine that I had ignored where the
40:09
as you say there’s a certain topics are
40:12
considered so politically fraught that
40:15
you cannot discuss them no matter what
40:17
is true like it’s just a you know there
40:20
has to be a firewall between your
40:23
conversation about reality and these
40:25
sorts of facts and so you know he so
40:30
he’s been you know suffering from having
40:32
transgressed that boundary and so I had
40:35
him on the on the podcast being fairly
40:39
agnostic about his his actual social
40:42
policy commitments and his political
40:44
concerns and just wanting to talk about
40:48
you know, the facts insofar as we could
40:51
touch them. I had zero interest
40:52
in intelligence as measured by IQ
40:57
although I mean it’s an interesting
40:58
subject but I hadn’t you know I hadn’t
41:00
spent much time focused on that and I
41:02
had truly zero interest in establishing
41:06
differences between populations with
41:08
respect to intelligence or anything else
41:10
but I see what’s coming I see the fact
41:13
that that the the more we understand
41:15
ourselves genetically and
41:17
environmentally the more we will if we
41:20
go looking or even if we’re not looking
41:21
we will discover differences between
41:23
groups and the endgame for us as a
41:27
species is not to deny that those
41:29
differences exist or could possibly
41:30
exist it’s to deny that they have real
41:35
political implications. I mean, whatever the
41:37
political implications, what we need is a
41:40
commitment to equality across the
41:44
board and a commitment to treating
41:46
individuals as individuals, there's
41:48
nobody who is the average; the average of a
41:50
population is meaningless with respect
41:53
to you and that will always be so and
41:57
and whatever you know and whatever
42:00
diversity of talents there is
42:01
statistically in various populations we
42:04
want societies that simply don’t care
42:09
politically about that I mean that’s
42:12
just it’s just not what we its they are
42:17
our political tolerance of one another
42:19
in support of one another is not
42:21
predicated on denying individual
42:24
differences or even statistical
42:26
differences across groups it can’t be
42:28
because we know that there are people
42:29
walking around like you know Elon Musk
42:33
who gets out of bed every morning and
42:35
does the work of like 4,000 people right
42:37
and people who just are struggling to
42:41
work at Starbucks and hold down a job
42:42
and our political system I mean we don’t
42:48
say one person is more valuable
42:50
politically and socially than another
42:52
even though one person is capable of
42:54
doing massive things that that many most
42:57
other people aren’t it’s you know when
43:00
it comes time to to write laws and
43:03
create institutions that protect you
43:05
that that support human flourishing we
43:08
we have to engineer tides that raise all
43:11
the boats and so you know and and you
43:13
know they’re legitimate debates about
43:15
the social policies that will do that
43:17
but and they’re legitimate debates about
43:19
facts so we can debate scientific fact
43:21
and and you know the the results of you
43:25
know psychometric testing or or
43:27
behavioral genetics that are relevant to
43:29
this question of intelligence and we can
43:31
have a good faith debate about the data
43:33
and then we can have a good faith debate
43:35
about social policy that should follow
43:37
from the data but what’s happening on
43:39
the left now is on either at either of
43:42
those tiers of conversation there are
43:46
just straight-up allegations of racism
43:49
that hit you the moment you touch
43:51
a certain fact. Can I say
43:53
that what he just summarized, when
43:55
I've heard it, sounds to me as being more
43:59
humane than the implications of the
44:03
argument of the left who are opposing
44:05
what Sam has just said, because if you
44:08
think about it the implications of their
44:10
argument would be
44:11
they’re what they want to deny the facts
44:12
because they’re scared that those facts
44:15
would from which there would be derived
44:19
a policy that would reflect those facts
44:21
and other words in their minds they are
44:24
marrying those two they are marrying the
44:26
notion that if in statistical observance
44:28
there are variances in IQs between
44:31
groups in their minds that means the
44:33
policy should follow from that so it’s
44:36
why they’re resisting what he’s saying
44:38
whereas what he’s saying is there is no
44:41
connection between what the policy
44:42
should be and what the facts may be
44:43
because the kind of world we want to
44:45
live in should aspire to equality
44:48
regardless of what the science is saying
44:49
because one is policy and one is science
44:52
I freely agree with you on that but I
44:54
don’t think that’s necessarily exactly
44:55
what they’re saying well I think what
44:57
they’re saying is what they’re doing is
44:59
they almost feel so guilty that any
45:01
discussion whatsoever about race can’t
45:03
be held unless you repeatedly bring up
45:06
all the instances of racism and
45:08
oppression and discrimination that
45:11
that group has suffered from, it's like,
45:13
you can't, it doesn't exist as a
45:15
statistical island, you have to bring
45:18
everything in together if you don’t do
45:20
that
45:21
that’s where their protest comes from
45:22
and I think that was one of the things
45:24
that I got from your conversation with
45:25
Ezra Klein he wasn’t willing to just
45:27
discuss what’s the implication of these
45:29
issues and completely dismiss this this
45:32
fact that Asian people score far better
45:36
there it’s not there’s nothing but it’s
45:38
always fair that by conceding on the
45:41
data it’s almost as if they fear that
45:43
the implication must necessarily follow
45:45
that the policy will also be supremacist
45:47
in that way hmm I wonder I honestly
45:50
think that what we talked about before
45:51
is a big part of it this is ideological
45:53
an idea sport and that they’re just
45:55
volleying back I don’t think they’re
45:57
willing to take I think one of the real
45:59
strengths of character that you
46:01
demonstrate in a debate or any
46:03
discussion of facts is when uncomfortable
46:05
truths rear their ugly head that are
46:07
counter to your or your personal
46:09
position you have to be able to go you
46:11
got a really good point you’ve got a
46:13
good point there’s something to that I
46:14
see what you’re saying okay this is what
46:16
my concern would be and this would be a
46:17
rational real conversation this is what
46:20
I would worry about and then you would
46:21
I’m sure say absolutely I would worry
46:24
about that as well and then you would
46:25
have this sort of a discussion I didn’t
46:27
get that from that conversation you had
46:29
I got ping pong I got I got this
46:32
rallying back and forth of ideas rather
46:35
than two human beings not digging their
46:38
heels into the sand just trying to look
46:41
at the ideas and look at the statistics
46:43
and look at these studies for what they
46:44
are and look at charles murray and what
46:46
he’s gone through and should we be able
46:49
to examine these statistical anomalies
46:51
should be able to examine athletic
46:53
superiority should we be able to examine
46:56
the superiority that Asians show in
46:58
mathematics and a lot of the sciences,
46:59
should we be able to, or should
47:01
we just dig our heads in the sand, should
47:03
we just let things sort themselves out
47:04
and quietly ignore all the reality yeah
47:07
I don’t know what so I should say that I
47:10
am I certainly understand people’s fear
47:13
that if you could that anyone who would
47:16
go looking for racial difference is very
47:19
likely motivate and motivated by
47:20
something unethical or unsavory right so
47:22
like like you could imagine you know
47:24
white supremacists being being super
47:29
enamored of the possibility that
47:31
this is the case. Yes, and they are. Yes, and
47:33
do they look at the Asians too? So,
47:38
that's like, I get that, right, and there
47:41
is there’s some things that and this was
47:43
the question I had for charles murray on
47:44
them on that podcast i said like why pay
47:47
attention to any of this what is the
47:48
upside in the in the infinity of
47:52
interesting problems we can tackle
47:54
scientifically why focus on population
47:56
differences and you know frankly i
47:58
didn’t get a great answer from him i
48:00
mean yesterday his answer his answer is
48:03
well I think the best version of his
48:06
answer which I agree with but still it
48:08
may not justify certain certain uses of
48:11
attention it’s just that if you there’s
48:15
this massive bias that basically we’re
48:19
all working with a blank slate you know
48:22
genetically and therefore any difference
48:25
you see among people is a matter of
48:28
environment and so so then you have
48:31
people who have privileged environments
48:34
and people who have environments that
48:36
that
48:37
where they’re massively under-resourced
48:40
and so therefore any differential
48:44
representation at, you know, the
48:46
higher echelons of success and
48:48
achievement and power in our society you
48:51
know if there’s 13% African Americans in
48:54
the u.s. if you look at the top doctors
48:56
in hospitals or the top academics or the
49:01
you know the Oscar winners or whatever
49:03
you know whatever you want to look for
49:04
for for achievement if there are less
49:07
than 13% African Americans in any one of
49:10
those bins that has to be the result of
49:13
racism or systemic racism that is the
49:18
left the leftward bias at this moment
49:20
and it and so it is with Jews for
49:22
anti-semitism, so it is with women, you know,
49:25
there should be an equal representation
49:26
of women you know computer software
49:29
engineers at Google, and any
49:34
disparity there must be the result of
49:36
either just inequitable resources for
49:42
you know kids in schools or somewhere
49:44
along the way, or some kind of selection
49:48
pressure from the top that you know we
49:50
do you know we don’t like women in at
49:51
Google or blacks at the Oscars and so
49:57
that’s the so Murray’s concern is if you
50:00
believe that and I’m you know this it’s
50:03
not exactly what he said but this is
50:04
this is what I believe he thinks but I
50:06
could be putting some words into his
50:07
mouth here but there’s certainly what
50:08
many other people on his side of the
50:10
debate thing if you believe that you
50:12
will can consistently find racial bias
50:16
and anti-semitism and misogyny where it
50:18
doesn’t exist right so like if you if
50:20
you go looking if you go to a hospital
50:22
and this is a real problem you that
50:24
they’re like like you know the academic
50:26
departments in the medical schools at
50:28
the best medical schools are under
50:29
massive pressure to find like real
50:33
diversity in representation at the
50:35
highest level you need to find a head of
50:38
Cardiology who’s black right and if you
50:42
and you end the fact that you haven’t
50:44
done that is a sign that there’s a
50:46
problem with you and your organization
50:48
and your process of hiring
50:50
now if it’s just the case for whatever
50:53
reason that there are not many
50:55
candidates, like, less than 13%, for
50:58
that field, or to take, you know, the
51:00
James Damore memo at Google, right, if it
51:03
just is the case that women forget about
51:05
this is this is beyond aptitude this
51:07
just goes to interest if it’s the case
51:09
that women for whatever reason genetic
51:12
and but or environmental are less
51:16
interested in being software engineers
51:17
on average than men are then you then
51:21
having you know twenty percent women
51:22
coding software at Google is
51:25
probably not Google's problem, it's
51:27
just the fact that this is what the
51:28
population's
51:30
interests are now we should no doubt
51:33
racism still exists no doubt misogyny
51:36
and sexism still exist there there are I
51:39
mean that and there’s proof of this to
51:40
be found as well, but to assume an
51:44
absolute uniformity of interest
51:48
and aptitude in every population you
51:50
could look at is just scientifically
51:54
irrational, that would be a miracle if
51:56
that were so. So at this stage allow me to
51:58
remind everybody that was Sam
51:59
summarizing what he thinks Charles Murray
52:01
was saying, as opposed to Sam's own view. What? No, no,
52:04
that final point, it's just a true point,
52:07
genes, almost everything we care
52:10
about is massively influenced by genes,
52:13
not a hundred percent. Part of what I've seen
52:15
happen to you though is that people have
52:16
taken your summaries of other people
52:19
Charles Murray’s position you it’s your
52:22
summary of his position in relationship
52:25
to this this fight against it the thing
52:28
that I would add, and the thing where
52:30
there's some daylight between
52:32
me and him on my podcast, is this is so
52:38
toxic to be trafficking in population
52:44
differences with respect to IQ that and
52:48
and it’s not it’s not absolutely clear
52:50
what social policies turn on really
52:54
nailing down these differences I mean so
52:56
you could go I mean to take it even more
52:57
toxic as an example
52:58
it’s like you could decide you know the
53:02
Roma in Europe the gypsies like this is
53:04
like a very isolated beleaguered you
53:07
know community who knows how inbred it
53:10
is I mean I don’t know it’s just this is
53:12
a this is an outlier community like
53:15
anyone who’s gonna want to do you know
53:17
massive IQ testing on the Roma what’s
53:20
the what’s the point of doing that right
53:22
like you know it’s like your it seems
53:24
like a just a kind of political time
53:28
bomb to devote resources in that way
53:32
because we know that the policy you want
53:36
whatever any whatever this the mean IQ
53:39
is of any group the policy you want is
53:42
to give everyone whatever opportunities
53:45
they can avail themselves of so we want
53:47
we want people to have the best schools
53:49
they can use and then we’ll find people
53:51
who need to be in more remedial schools
53:54
for whatever reason or you know people
53:55
like you know they’ll be one population
53:57
that has ten times the amount of
53:59
dyslexia then another population say and
54:02
they’ll be undoubtedly genetic reasons
54:04
for that you know there may be
54:05
environmental reasons for that as well
54:07
but there’s we need to be able to cater
54:10
to all of those needs with just there’s
54:14
this fundamental commitment to goodwill
54:16
and equality without being panicked that
54:19
we’ll find stuff that just blows
54:21
everything up but on the left there
54:24
there’s the sense that the only way to
54:26
move forward toward equality is to lie
54:29
about what is scientifically pause
54:31
applause a bowl and demonize anyone who
54:34
won’t lie with you mmm that’s the
54:37
ideological point yeah this is a new
54:41
thing though right I mean relatively
54:43
speaking, this hard-nosed stance from
54:46
the left of the equality of outcome and
54:48
and the only reason why there wouldn’t
54:51
be 50% women or 50% black or 50% any you
54:54
just pick any marginalized group the
54:56
only reason why wouldn’t be even across
54:57
the board with all other races is
54:59
because of discrimination this is a
55:00
fairly new stance I mean there were
55:02
there were moments that were fairly well
55:04
publicized that I don’t forget when
55:06
Larry Summers got fired from Harvard so
55:08
Larry Summers was the president of
55:09
Harvard and he’s a famous economist
55:12
and he gave a speech for which he was
55:15
fired. There might be a little more color
55:17
as to why he was fired I mean it was
55:19
more that he was fired because, once the
55:21
wheels started to come off, he
55:24
had alienated enough people that he
55:25
didn’t have friends to kind of prop him
55:26
up but but the thing that pulled the
55:28
wheels off was that he gave a speech and
55:30
he said we know there are our
55:34
differences in in the the bell curves
55:37
that describe you know mathematical
55:39
aptitude between men and women and this
55:42
explains why there are many more
55:44
top-flight male mathematicians and
55:47
engineers than women and it’s not that
55:49
they even it’s not that the the means of
55:53
the the of the bell curves are different
55:56
so the means could be the same but there
55:59
could be more variance so that the tails
56:00
are thicker in the case of the male bell
56:02
curve, so at the absolute ends, both of
56:05
the low end and the high end you have
56:07
many more people so you know if you’re
56:10
gonna ask you know what’s the in the
56:12
same size population how many people do
56:15
you have at the 99.999th percentile of
56:19
aptitude in math say it could be that
56:23
you have and there’s a fair amount of
56:24
data to show this many more men at the
56:27
tails than women
56:29
right and and that’s true for
56:31
grandmasters in chess right it’s just
56:33
the it’s just this is not a and it may
56:36
be true for something like you know
56:38
playing pool you know I mean they’re
56:39
they’re just differences and that may
56:41
not be entirely environmental, almost
56:43
certainly are not entirely environmental.
56:45
That is one, right, it's a big issue in
56:49
the world of pool men and women play
56:51
separately and there’s no reason
56:52
physically why they should yeah it’s not
56:54
a strength game right but women are
56:56
allowed to play in men’s tournaments but
56:58
they never win right gene be Lucas was a
57:00
woman who’s she was like one of the only
57:03
women ever compete and beat men she’s
57:06
like an extreme outlier and this was
57:07
like I want to say was in the late
57:10
seventies in the 80s and and other than
57:12
that there’s been a few women that have
57:14
done well in tournaments but when they
57:15
come to major league professional pool
57:17
tournaments they’re almost always won by
57:20
men and I’m when I say almost I mean
57:22
like 99.9 percent. So, the Commonwealth
57:25
Games have been happening, it was
57:27
just over the last couple of weeks and
57:29
there was a male to female transgendered
57:32
athlete in the weightlifting category
57:35
that’s a whole nother boy participated
57:37
in the women’s competition yes and the
57:41
Commonwealth Games at the time of her
57:43
joining hadn’t yet put down a rule asked
57:46
the testosterone levels in the females
57:48
competing and so this male to female
57:51
transgendered person qualified in the
57:53
female games and was as you’d expect
57:57
winning in all of the games and was the
58:00
front-runner and destined to win the
58:02
competition as a male to female
58:04
transgendered person and the only reason
58:06
and it would have led to a huge crisis
58:08
in the Commonwealth Games because there
58:10
was some resistance to this notion and
58:13
of course the questions that arise: is
58:15
this fair? Men are born naturally with
58:17
higher levels of testosterone, for
58:18
example. The only reason it didn't lead
58:20
to crunch time, and the huge
58:22
scandal of her winning, is that she
58:25
injured herself in the competition and
58:27
by sheer accident yeah I saw that I can
58:30
expand on that a little bit because I’ve
58:31
actually gone through this extensively
58:33
because there was a woman, who used
58:36
to be a man, competing in mixed
58:37
martial arts against women and just
58:39
beating the shit out of them and I and I
58:41
was saying that this is this is a
58:42
mistake and that you’re you’re looking
58:45
at whether someone should be legally
58:47
able to identify as a woman portray
58:50
themselves as a woman absolutely do you
58:51
have the freedom to become a woman in
58:54
quotes in our society yes but you can’t
58:56
deny biological nature and there’s
58:58
physiological advantages to the male
59:00
frame there’s it’s specifically when it
59:03
comes to combat sports that’s my
59:05
wheelhouse I’m an expert I understand
59:07
there’s a giant difference between the
59:09
amount of power that a man and a woman
59:11
can generate and if you’re telling me
59:13
that a guy living thirty years of his
59:14
life as a man that’s that’s essentially
59:17
like a woman being on steroids for 30
59:19
years
59:20
then getting off and then having regular
59:23
women being forced to compete with her
59:25
and trying to pretend this a level
59:27
playing field
59:28
it is not there’s a difference in the
59:29
shape of the hips the size of the
59:31
shoulder the density of the bones the
59:33
size the fists wet that’s a giant factor
59:36
and your ability to generate power is
59:39
size of your fists it’s also an ethical
59:41
problem it’s not just a competition here
59:42
is he have girls getting beaten up by
59:45
someone who used to be a man yes but
59:47
people came down on me harder than
59:50
anything that I’ve ever stood up for in
59:52
my life never in my life – I think
59:53
there’s gonna be a situation when I said
59:55
hey I don’t think the guy should be able
59:56
to get his penis removed and beat the
59:57
shit out of women and then people like
59:59
you’re out of line but that’s what
60:04
happened this is a conversation that I
60:05
had with a woman online this one what
60:08
during this whole thing she said she
60:11
this person who had turned into a woman
60:13
has always been a woman and I said but
60:16
she was a man for 30 years she goes no
60:18
she’s always been a woman I go even when
60:20
she had sex with a woman and fathered a
60:23
kid and she says yes even then I go well
60:26
we’re done yeah because you’re just
60:27
talking nonsense that’s a neurology
60:30
cover exact the facts as they are that
60:32
she had a male physique this person
60:35
always arguing with me wants to claim
60:37
this moral high ground of being the most
60:39
progressive and they’re always looking
60:41
to step on top of anybody who's less
60:41
progressive than them and claim and
60:45
proclaimed superiority and this is the
60:47
ideological sport this is the idea sport
60:50
that that you see with what people are
60:52
playing just ping-pong with ideas
60:54
they’re not listening you you need to
60:56
listen to experts in in that when you
60:59
especially talk about martial arts
61:01
there’s a did the the difference is so
61:03
profound and the results are so critical
61:06
because you’re talking about a sport
61:08
where the objective goal the goal is
61:11
clear it’s very clear beat the fuck out
61:14
of the other person in front of you yeah
61:15
so anything that would give you an
61:17
advantage in beating the fuck out of
61:18
that person should be really looked at
61:20
very carefully and not thrown through
61:23
the lens of this progressive
61:25
ideological filter that we’re going
61:26
through right now because that’s that’s
61:28
what it is I mean that’s how people are
61:29
looking at it it’s with weightlifting as
61:31
well when transgendered athletes going
61:34
to weightlifting competitions the male
61:37
to female transgender athletes are
61:39
overwhelmingly dominant I mean is this
61:42
is this a coincidence or it’s no it’s
61:44
someone who had fucking testosterone
61:46
pumping through their system and a
61:48
y-chromosome their whole life and now
61:50
all of a sudden we’re supposed to say no
61:52
she’s a woman
61:52
she’s dainty she’s got size 14 feet
61:56
she’s got gorilla hands like the fuck’s
61:58
he doing sir so I think as you said
62:01
earlier it’s she is a woman but for the
62:03
purposes of competition yeah against
62:05
other women you know legally she’s a
62:07
woman at that stage right if she goes
62:08
through that identity transition but I
62:10
think we have to recognize and I think
62:12
even many traditional feminists are
62:14
making this point, much to the anger
62:17
of the trans community, they're saying
62:19
hold on, what you're doing in this
62:20
way is actually we fought so hard and so
62:22
long for these female spaces where we
62:26
have a space of our own and now people
62:28
that used to be men are coming into
62:29
those spaces and actually quite
62:30
literally beating the crap out of us yes
62:33
yes you know whether it’s in boxing
62:35
whether it’s in weightlifting in martial
62:37
arts, they are, by definition, they're
62:41
dominating all of this, of course they are,
62:42
for the reasons you said. The experts that
62:44
they're calling upon are almost all
62:46
transition doctors surgeons or people
62:50
that have transitioned themselves when
62:52
they speak to actual board-certified
62:54
endocrinologists, some will only do it
62:56
off the record, but one of them, I forget her
63:01
name, she was in one of the big mixed
63:03
martial arts publications, Ramona
63:05
Krutzik I believe is her name, she's saying
63:08
no, not only does it, actually, in doing
63:11
this transition from male to female
63:14
you're putting
63:17
estrogen into the system, so the bone
63:19
density change that would ordinarily
63:20
take place if you removed someone's
63:22
testicles and just stopped the
63:24
production of testosterone doesn't happen, estrogen
63:26
preserves bone density, so you're
63:28
actually retaining the male bone density
63:31
there’s so many problems with this and
63:33
that and that one of the other things
63:35
they say well oh the Olympics the
63:37
Olympics allow it the Olympics are very
63:39
ideologically based there’s not a whole
63:41
lot of science to this to this
63:42
transition thing of allowing male to
63:45
female athletes to compete in the
63:47
Olympics and there’s a stream amount of
63:50
corruption in the Olympics as it is with
63:52
the IOC being in bed with WADA, the World
63:55
Anti-Doping Agency, and the way they
63:56
handle this Russian scandal I mean this
63:58
Russian scandal that was highlighted in
64:00
that fantastic documentary Icarus yeah I
64:03
was like they’re fucking crazy
64:04
the
64:05
Olympics are not to be trusted that is a
64:07
gigantic multi-billion dollar business
64:09
where the athletes get paid zero money
64:11
it is inherently corrupt from the top
64:13
down no doubt about it so to call upon
64:16
them to say who should be competing
64:19
as a woman fuck off they’re not the
64:22
experts this is this is not something
64:23
that’s been examined and this is coming
64:25
from someone who one of my jobs is
64:27
examining and commentating on fights
64:30
that is a big part of what I do
64:32
I understand fights and I know what it
64:34
looks like when a man’s beating the shit
64:35
out of a woman and that’s what it looked
64:37
like when this person was fighting women
64:38
it was there was a massive physical
64:40
advantage, massive, not a scintilla of an
64:42
advantage. What was, you mentioned
64:43
something about the reaction that you got
64:44
to that, what was the trouble? You were gonna
64:46
tell... People were so mad at me, I mean it
64:47
was just so many not only that they took
64:49
my words out of context, they quoted
64:52
all these different gender transition
64:55
doctors, and I'm saying that there's no
64:57
science behind this, the science
64:59
behind it being totally fair and totally
65:01
equal, it's just not, and people know it,
65:04
everyone knows it. Could they
65:06
put Cris Cyborg against this
65:07
guy and give him a run for his money?
65:09
Wrong weight class? That's the other way,
65:12
that’s the other thing and we’re dealing
65:13
with a similar situation like that in
65:15
Texas I don’t know about the girl who
65:17
was which was born a girl she’s
65:19
transitioning to a boy in high school
65:21
taking testosterone but in Texas they
65:24
only allow her to compete as a girl so
65:26
she’s dominated the Texas State
65:27
wrestling championship two years in a
65:29
row and it’s horrific because she’s on
65:31
steroids she’s on testosterone and it
65:35
doesn’t matter because they’re testing
65:37
chromosome yeah she’s a woman she was
65:39
born a woman right she’s born a girl
65:41
so because the fact that she’s
65:42
transitioning to be a boy they don’t
65:44
give a shit you’re a woman you’re not
65:45
gonna wrestle against men you’re a girl
65:47
you’re not gonna wrestle against boys so
65:49
they’ve allowed her under extreme
65:51
protest, it's terrible, she wants to
65:53
compete or he I should say wants to
65:54
compete as a boy they won’t let him they
65:58
say no you were born a girl you have to
65:59
compete as a girl so when he competes
66:02
everybody boos it’s awesome it’s fucking
66:04
awful I mean it’s it’s it’s really that
66:07
question for you that way around if it’s
66:09
female to male transition somebody that
66:13
used to be a woman that transitions to a
66:16
man and wants to compete with the men
66:17
they don’t have it
66:18
zone you’re allowed to read of this
66:21
advance if they win in that context they
66:23
actually done really good yes right look
66:25
women can beat men yeah I mean it
66:27
happens all the time in jujitsu there’s
66:29
especially in jujitsu in particularly
66:30
because it’s such a technique based art
66:32
but it is possible there’s there’s also
66:35
a woman named Germaine de Randamie who's a
66:37
world-class mixed martial artist who’s
66:39
multiple time world Muay Thai champion
66:40
who fought a man and knocked him out
66:42
it’s a crazy video she was a real man ko
66:45
time with a straight right it’s it is
66:47
possible for them to win if their skill
66:49
level is so far superior that it
66:52
overcomes the inherent strength
66:53
advantages, but a woman-to-male transition
66:57
would be at a severe disadvantage
66:58
against the natural man so would you be
67:00
so in that Texas case they clearly have
67:02
it wrong they should allow they should
67:04
allow him to compete with yes and would
67:07
you be whereas I can I think all three
67:09
of us probably instinctively would
67:10
resist the notion that a female that a
67:13
male to female athlete competes with
67:16
other females because they’d have enough
67:17
quad resist that yes but would you be
67:19
for a female to male athlete competing
67:22
with men yes because I don’t think
67:24
there’s no there’s no better but here’s
67:26
the problem, and again the consent is
67:28
sort of running in the other direction, she is
67:30
continually putting herself, or he is
67:32
putting himself, right in harm's way knowingly
67:35
and I’m not opposed to a woman fighting
67:38
a man if she so chooses
67:39
like I’m not opposed to bull riding yeah
67:41
if you wanted I’m not you know lobbying
67:44
to get bull riding outlawed but if you
67:45
want to be so fucking stupid that you
67:47
climb on top of a 2,000 pound angry
67:49
animal go for it yeah you should be able
67:52
to do whatever you want I think you
67:53
should be able to jump out of perfectly
67:54
good airplanes, if you want to
67:56
parachute you should be able to risk
67:57
your life parachuting the difference
68:00
lies in just massive advantages and that
68:03
there’s a massive advantage in
68:04
transitioning from male to female female
68:07
to male here’s the other problem female
68:09
to male you have to take testosterone
68:10
you can’t legally take testosterone and
68:13
compete it’s been a giant issue in mixed
68:15
martial arts because for the longest
68:17
time there was a loophole and the
68:18
loophole was testosterone therapy and
68:20
they were allowing testosterone
68:22
replacement therapy for male athletes
68:24
that were either older or it’s it was a
68:27
it was a symptom of having pituitary
68:31
gland damage,
68:32
which comes from head trauma,
68:34
which means really essentially your
68:36
career should be over yeah your your
68:37
body’s not producing hormones correctly
68:40
and that’s a very common issue with
68:41
people that have been in war people that
68:44
have been blown up by IEDs people that
68:46
have been hit a lot even soccer players
68:48
a lot of times there’s show diminished
68:51
levels of testosterone and growth
68:52
hormone because of pituitary gland damage
68:54
so you wouldn’t even allow that so a
68:56
female to male would be in a whole
68:59
nother problem in combat sports because
69:01
it’s not legal for you to take
69:02
testosterone and compete to bring this
69:06
full circle back to me sitting at the
69:08
pool, about to destroy my
69:09
vacation on Twitter. How long did you
69:11
spend working on this article? Well,
69:13
another thing, so again, this was...
69:14
Your wife must hate that, how mad
69:17
was she? Well, it was kind of the
69:20
perfect storm but there were there were
69:21
a few things that that relieve the
69:23
pressure one is there was another family
69:25
from our school so they’re like well
69:27
mark my daughter had a friend that said
69:30
that we that my wife could socialize
69:31
with and having another couple there
69:34
forced me to sort of put on my social
69:36
face at dinner, and, I mean, it's not
69:48
quite right to describe it that way,
69:50
putting on your social face, it actually
69:52
changes your psychology I mean like if
69:53
you have if you if you have to drop your
69:55
problem in order to be a normal sane
69:57
person with people you don’t know all
69:58
that well you’re actually a happier more
70:00
normal person if it had just been me and
70:02
my wife at dinner while I’m dealing with
70:04
this blow up it just you know it’s just
70:05
never would’ve the cloud wouldn’t
70:06
wouldn’t have left so anyway I I was
70:12
trying so I was trying not to engage and
70:14
so I didn’t want to have to write
70:15
anything new to deal with this the this
70:17
what I viewed as just an egregious
70:20
attack on on my intellectual and moral
70:23
integrity and so when I saw this article
70:27
from Klein I realize I had this email
70:30
exchange with him at the end of which I
70:32
said listen if you if you continue to
70:35
slander me... this was from like a year
70:37
previously, because it's since been released,
70:39
I released it, so I said at the
70:43
end of this exchange if you continue to
70:45
slander me
70:45
and if you misrepresent the reasons why
70:47
we didn’t do a podcast because we we had
70:49
had talked publicly about maybe sorting
70:52
this out on a podcast a year ago but I
70:54
found the exchange with him by email to be
70:57
in such bad faith I found him so evasive
70:59
and dishonest and again just plain
71:02
ideological ping-pong as you said and
71:04
not actually engaging my points that I
71:08
said listen if you if you lie about this
71:10
and you keep slandering me I’m just
71:11
gonna publish this email because because
71:13
I think the world should see how you
71:15
operate as a journalist and as an editor
71:17
like he he had declined to publish a far
71:20
more mainstream opinion defending me and
71:22
Murray in Vox, I mean, it was just, it
71:25
was truly you know slanderous and
71:27
misleading everything he’s published on
71:29
this topic, and he has a huge platform,
71:31
Vox, which I used to enjoy, I really
71:34
liked. Oh yeah, no, I mean, I've
71:36
read Vox with pleasure as well, but it is
71:39
it it you know once you see how the
71:41
sausage gets made on many of these
71:43
things once you’re the news item you can
71:45
see that there’s very little
71:46
journalistic scruple in the in the
71:48
background there, so I was livid, man, I
71:53
didn’t want to have to spend my time on
71:55
vacation writing a retort to this thing
71:58
but I felt like I had to respond and
72:00
again this is an illusion there’s like a
72:01
sheer confection of looking at Twitter
72:05
if I hadn’t been looking at Twitter I
72:06
wouldn’t have felt I had to respond and
72:09
so I responded in the laziest possible
72:12
way which I just published the email
72:15
exchange because it’s already written I
72:16
don’t have to write anything you know I
72:17
just hit publish essentially, and
72:19
of course the rest of the world didn’t
72:20
know you’re actually meant to be on
72:21
vacation right now and so there’s no
72:24
context for them as to why you were so...
72:28
I massively underestimated the amount
72:30
of work even my own fans would have to
72:33
do to understand why I was so angry in
72:36
that email exchange so I came off like
72:37
the angry bastard in the email exchange
72:39
and he came off as this you know just
72:43
open-minded ready to dialog guy whereas
72:46
if you follow the plot and you saw what
72:49
he had published about me and and Murray
72:51
previously, this thing that is
72:53
now on the Hatewatch page of the SPLC,
72:56
he was being totally disingenuous and
72:59
evasive,
72:59
and just these responses you remember
73:01
they didn’t match to his article did
73:02
they not not at all and it was this
73:04
thing it was so yeah so I just kept
73:06
getting more tuned up and and so I
73:09
published this thing not realizing not I
73:13
mean I you know it was definitely
73:14
mistake to publish the email exchange
73:16
just just pragmatically not I don’t
73:18
think it was unethical because I told
73:21
him I was going to do it in advance if
73:23
he kept he kept it up it was just it was
73:27
totally counterproductive, because
73:29
he seemed far more reasonable in the emails
73:31
than in the original article, what,
73:33
it seems like he did do a lot of work to...
73:35
yeah, yeah, the thing is he wasn't,
73:37
it was an
73:40
appearance of reason, but it was
73:42
not. And then, so we finally did
73:44
this podcast a year hence you know this
73:48
is now my last podcast is now you know
73:50
two weeks ago and you know it was
73:53
basically as bad as I was expecting
73:56
and I basic I feel that I met the person
73:59
who I thought I was dealing with in the
74:01
email exchange and he was fundamentally
74:03
unresponsive to any of my points and you
74:07
know as you say Joe just trying to score
74:09
political points to his toward his
74:12
audience and the thing is he has a
74:14
what’s that mean there’s many there many
74:16
asymmetries here but one crucial one is
74:19
that he has an audience that doesn’t
74:22
care about whether or not he’s
74:25
responsive to the thing that his his
74:28
opponent or interlocutor just said right
74:30
it’s they’re not tracking it by that
74:33
metric they’re tracking it by are you
74:35
making the political points, are you winning,
74:37
are you massaging that,
74:40
you know outrage part of our brains like
74:41
our ego
74:42
do you have your hands on our amygdala
74:44
you know and and are you pushing the
74:46
right buttons and so he’s talking about
74:49
racism and you know just the white
74:51
privilege and I’m granting him all of
74:53
that I’m said listen like let me tell
74:56
you why that’s not relevant to my
74:58
concerns and what happened here with
74:59
Murray I’m gonna I’m everything you’re
75:01
gonna say about the history of lynching
75:02
I’m gonna grant you right that’s not the
75:05
we don’t there’s no daylight between us
75:07
there and but the thing is I have an
75:09
audience that cares massively
75:12
about
75:13
following the logical conversation if
75:16
somebody makes a point in conversation
75:17
that is even close to being a good
75:21
response to me my audience is like you
75:24
know okay Sam what the fuck are you
75:25
gonna say to that yes right and if and
75:26
if I drop that ball I I lose massive
75:29
points right whereas I’m often finding
75:31
myself in conversation with people who
75:33
don’t have to care about those kinds of
75:35
audiences. That was, I had one with
75:37
this Omer Aziz. Oh, well, that was titled
75:40
The Best Podcast Ever. I mean, he knows
75:42
his audience does not care about him
75:45
honestly representing in this case the
75:47
doctrine of Islam who was that guy even
75:49
I mean, Ezra, right, fine, you could say,
75:50
okay, editor of Vox or whatever, but where did
75:52
you even find that guy for your podcast
75:58
until this day I don’t even know who
76:00
this bloke is, this guy is some crazy guy,
76:03
and anti-me, because at one
76:07
point he was going on about me being
76:08
some form of enabler of your bigotry, and
76:10
yeah, well, yeah, being your own Uncle Tom
76:12
yeah I could see this is that this is
76:18
why it’s so frustrating because I have
76:19
pretty much memorized inside out back to
76:22
front this Islamist ideological
76:23
narrative, and I could sit here right now
76:25
and play that game with you the game of
76:28
ping pong yeah without conceding
76:29
anything and this is where you know I
76:31
feel our conversation went really well
76:33
because it was stripped away from all of
76:36
that bullshit and we had a genuine
76:38
conversation it is still to this day very
76:40
easy for me to to play the tune of the
76:45
Islamist and score those points
76:47
especially because some of what I’ve
76:49
been through
76:49
yeah and score those points and just get
76:52
locked in a essentially it’s ego but
76:55
it’s it’s a it’s it’s it’s not an
76:56
intellectual conversation it’s a it’s
76:58
it’s a game of you know who’s who is
77:00
basically checking the right boxes in
77:03
their own little confirmation bias to
77:04
their own audience that doesn’t interest
77:07
me but it’s frustrating you’re also
77:09
you’re also the best person on the other
77:11
side of that conversation now so there’s
77:13
a series of videos on YouTube I think
77:15
it’s called Merry Christmas mr. Islamist
77:17
yeah that’s right and so on YouTube you
77:19
can watch him pitted against people
77:22
who are playing this game you know
77:23
Islamists and and jihadis of various
77:25
sorts you
77:26
and Maajid is meeting them on
77:29
these interview shows mostly in the
77:31
UK where they’re pretending to be more
77:35
benign than they are and Maajid is
77:36
you know finding the question
77:38
that sort of pulls back the mask on the
77:41
theocrats hilarious it’s it’s very fun
77:43
well that one video that you
77:45
published on your blog I've sent to dozens
77:48
of my friends the one video where
77:50
there’s this guy and he’s addressing
77:51
this enormous group of people and he’s
77:53
talking about is this radical Islam or
77:55
is this Islam that was I think a
77:57
conference in Norway yeah that was just
77:59
I mean he's not a straight up Islamist
78:02
jihadist addressing a crowd of seemingly
78:05
mainstream Muslims in Norway and but he
78:07
just by show of hands you know is it you
78:10
know are we extremists if we think
78:11
apostates should be punished it's pretty
78:14
stunning it’s an amazing document in
78:16
yeah in respect to the way they want to
78:18
treat homosexuals apostates I mean the
78:21
whole thing is is this Islam or is this
78:23
radical Islam talking of ideological
78:26
blinkers and statistical data on the subject
78:29
of homosexuality so in the United
78:31
Kingdom a poll was done last year asking
78:34
so there have been two polls gauging
78:36
public Muslim attitudes towards gays the
78:40
first asked how many Muslims in the UK
78:42
find homosexuality morally acceptable
78:44
and zero percent this is by the way by a
78:48
professional polling company it’s not
78:50
just some student that’s devised a poll
78:52
on Twitter a professional polling
78:54
company found that zero percent of
78:55
British Muslims responded to a poll
78:58
saying that they found homosexuality
78:59
morally acceptable and then a year later
79:02
which now last year another poll was was
79:05
conducted and that was an ICM poll
79:08
asking how many
79:11
British Muslims believed that
79:13
homosexuality should be criminalized or
79:15
remain legal and I think it was roughly
79:18
52% if my memory serves correctly
79:21
said of British Muslims said that they
79:23
would wish for homosexuality to be
79:25
criminalized and of course what does
79:27
criminalization of homosexuality mean
79:29
under Sharia and traditional Islamic
79:32
jurisprudence we know that it’s
79:34
punished by death so these are these
79:38
this is scientific data from gauging you
79:40
know attitudes British Muslim attitudes
79:42
towards homosexuality but the
79:44
ideological blinkers will will kick in
79:47
and refuse to see that truth and these
79:49
aren’t Islamists unfortunately my
79:50
dialogue with Sam we talked about this
79:51
that there are the Islamists who want to
79:53
who actively want to take over a country
79:55
and enforce their version of Islam then
79:57
there’s underneath that there’s a softer
79:59
landing of very very conservative
80:01
stroke fundamentalist attitudes that
80:04
unfortunately have become widespread and
80:06
here is an example of it that is that is
80:08
being gauged by scientific polling
80:10
methodology that tells us there’s a
80:12
problem and unfortunately if one were to
80:14
speak in this way especially in in
80:18
Europe one is received by my own
80:21
political tribe and that’s liberals
80:23
center-left and further one is met with
80:27
denial and called a bigot simply for
80:30
relaying these facts a quarter of
80:32
British Muslims when asked about the
80:35
massacre at the Charlie Hebdo offices in
80:37
Paris a quarter said that those attacks
80:41
are justifiable they sympathized with
80:42
the attackers as opposed to the victims
80:45
who were the staff at the Charlie Hebdo
80:46
offices so this is what led you to be
80:50
put on the Southern Poverty Law Center's list speaking
80:53
in these terms and unfortunately it’s
80:55
reporting polling data and what it does
80:57
for me is to say this is why it’s so
80:59
important to address these issues to
81:00
have these conversations to try and
81:03
empower those Muslim voices that are
81:04
seeking to challenge this sort of these
81:06
sorts of attitudes and and carve out a
81:09
space and if you know if if one can do
81:12
that with Catholicism in Europe and go
81:14
through a Reformation and end up with an
81:16
Enlightenment and end up with secularism
81:17
in the West what I often say is American
81:21
liberals are very happy challenging
81:23
their own Bible Belt and yet we have a
81:25
Quran Belt within our communities and if
81:28
I’m attempting to replicate the
81:30
equivalent of challenging the Bible Belt
81:32
within Muslim communities it means
81:33
addressing these issues and yet they
81:35
grant to themselves the right to
81:37
challenge the Bible Belt within America
81:40
and yet if we were to challenge what I
81:41
call the Quran belt in Europe we
81:43
suddenly called bigots
81:45
you know and Islamophobes is this is
81:48
this static has this been moving has it
81:51
been adjusting and changing is there any
81:53
sort of a recognition that there’s an
81:55
issue with this so you know the
81:57
emergence of Isis really did bring it to
81:59
the fore and it really did quieten some
82:02
of the voices it also did increase the
82:04
hysteria from the far left because they
82:06
began panicking thinking actually we’re
82:08
gonna lose this debate and that’s where
82:09
I noticed their labeling became even
82:12
stronger but the emergence of Isis did
82:14
wake up a lot of people to to the
82:16
challenges we’re facing here because so
82:18
many European born and raised Muslims
82:20
went over to join Isis and of course
82:22
think about it in this sense the most
82:23
infamous and notorious execution cell
82:26
that I think were erroneously called the
82:29
the jihadi Beatles in the press because
82:31
actually it really is an insult
82:34
to the Beatles but it also diminishes
82:35
the true horror you have these guys you
82:37
know they called him jihadi John but
82:39
basically that
82:40
entire cell the media face of the
82:44
Isis execution cell were all British
82:46
Muslims and that should tell you
82:47
something that we’ve got the worst
82:49
terrorist group educated I mean the
82:50
thing is he's a university graduate like every
82:52
variable that the the far left wants to
82:55
marshal to explain this phenomenon like
82:58
lack of educational opportunity lack of
82:59
Economic Opportunity lack of social
83:01
integration mental illness a you you can
83:04
all you can find people who had massive
83:07
opportunity I mean you were I mean you
83:09
weren’t a jihadist but you were an
83:10
Islamist but let me you’re a person who
83:12
that’s right and basically play any game
83:14
he want to mere you like it is he’s he’s
83:16
he’s somebody who back to the Superman
83:18
he can run he can run for political
83:19
office
83:20
he hasn’t been elected yet but he you
83:22
know he should be I mean this is the
83:25
quarterback of the football team and in
83:28
this context he is a candidate for
83:31
recruitment well I think think of it
83:32
this way we’ve got the worst terrorist
83:34
group in our lifetime it one can
83:36
reasonably say is Isis right the worst
83:38
terrorist group at least in living
83:39
memory is Isis and the worst cell
83:42
in Isis the execution cell came from a
83:45
fully developed for want of a better
83:49
term first world country and that was
83:50
Britain and mohammed emwazi the leader
83:53
of that execution cell graduated from
83:55
the University of Westminster was given
83:57
as a young child was given political
83:59
asylum by Britain because his family
84:00
were Kuwaiti and they fled the invasion
84:03
of Kuwait by Saddam Hussein the country
84:05
that the West liberated and he turned
84:08
against that country so he had every
84:10
reason to like Britain Britain gave him a
84:12
home gave him an actual physical
84:14
bricks-and-mortar house gave his family
84:16
social housing
84:18
they educated him he graduated from
84:20
University and they liberated his
84:21
father’s country from an aggressor and
84:23
this man turned against this country
84:26
that helped him and his family and his
84:27
nation was he captured or is he dead
84:31
well one of them has been captured but
84:32
he’s currently being held in Turkey
84:34
would it would be fascinating to listen
84:36
to his rationale although the
84:39
other one I forgot his name but he was
84:41
just interviewed you know you don’t get
84:43
a lot out of him
84:44
he was interviewed by a female Arab
84:45
journalist a dismissive
84:51
character he refused to talk about much
84:54
he said you know these are accurate
84:55
accusations and allegations you’re
84:57
making and I will wait for trial in the
85:01
end he kind of cut the interview short
85:02
he seemed a little put out that she was
85:04
a woman oh yeah he did so as I’m looking
85:06
at you now imagine she's the interviewer
85:08
and she's asking me
85:10
questions and I’m looking in this
85:11
direction
85:12
he literally never laid eyes on her
85:18
[Laughter]
85:21
it's so intense it's such as you
85:24
say a radioactive subject
85:27
it’s just fascinating to watch white
85:29
liberal progressives just scamper away
85:32
from this well but the flip the flip
85:34
side of the ISIS thing has been the
85:36
refugee crisis which has made which has
85:39
really empowered both extremes frankly
85:42
that the far left and the far right so
85:44
you have the far right you obviously
85:47
with the wind in their sails worrying
85:50
about this influx of people from the
85:53
Middle East and and you know and beyond
85:55
North Africa and just the change of
85:58
culture in their societies and a lot of
86:01
these concerns are plausible but because
86:03
only the far right and a few other
86:05
decent people like Douglas Murray will
86:08
talk about the plausible concerns
86:11
the space has just been vacated so you
86:13
just have the far right you have
86:15
far-right populist politics being
86:17
enabled and then you have this
86:19
delusional open borders left that won’t
86:22
we’ve got to talk about the huge problem
86:25
I told Sam about this but it bears
86:27
repeating I was having a conversation
86:29
with someone who is an executive at YouTube
86:30
and I asked them why someone got a
86:33
Community Guidelines strike on their
86:34
account because they posted up a video
86:36
on their playlist that they enjoyed of
86:38
Sam Harris and Douglas Murray engaged in
86:42
a conversation I go why would that get
86:45
you a Community Guidelines strike and
86:46
this woman said because it’s hate speech
86:48
I got a problem with the last you see
86:50
sorry apparently Douglas Murray caused
86:52
me problems somebody who worked there yes she was
86:55
a big executive at YouTube
86:57
she said it’s hate speech and I told her
86:59
I go did you listen to it I go you
87:01
didn’t listen to it I go this is
87:02
stunning that you would just say it’s
87:03
hate speech then you just be so
87:06
dismissive of it so quickly and she
87:08
talked to me as if I was her employee
87:10
like I was not allowed to question her
87:12
and she was just gonna say what she said
87:13
and I was gonna shut up and it was a
87:15
fascinating conversation no no no why on
87:19
vacation no but it was I did a podcast
87:21
with Douglas and apparently it got
87:23
flagged someone else put it up on their
87:25
account and it got flagged as hate speech
87:27
and with strikes you can get your account
87:30
removed so I’ve got a phrase for this
87:32
and I’ve been I’ve been rallying for it
87:33
on social media for a couple of months
87:35
now and I call it a digital blind spot
87:38
there’s a cultural bias on social media
87:41
where because of and it’s intellectually
87:44
lazy because because social media is
87:46
essentially a Californian invention
87:48
right and we’re in the home state of
87:50
where most of this came from
87:51
it’s got a very Californian based
87:54
worldview which cares a lot about white
87:56
supremacy and doesn’t care about many
87:59
other forms of bigotry that exist out
88:00
there in the rest of the world which by
88:02
the way is the majority of the world so
88:04
on Twitter right now of course there’s
88:06
Milo Yiannopoulos has been banned
88:08
Tommy Robinson has been banned as in
88:11
taken off now Twitter's a private company
88:12
he's the former leader of the
88:14
British English Defence League which was
88:17
at one time Europe’s largest anti-muslim
88:19
street protest group
88:20
I helped him leave that organization he
88:22
still holds many views I completely
88:24
disagree with but nevertheless he doesn't
88:26
support nor advocate for terrorism
88:28
why was he removed well so Twitter is a
88:31
private company it can choose to remove
88:33
whoever it wants for whatever reason and
88:34
we will judge it for its inconsistencies
88:35
but he was ostensibly removed for hate
88:37
speech as was Milo Yiannopoulos
88:39
now the point being that still till this
88:43
day and before people misquote me and
88:45
completely say that I’m now defending
88:47
hate speech and and it’s and their right
88:50
to speak with hateful views on Twitter
88:52
this is my actual point that till this
88:54
day did you know that Hezbollah which is
88:58
a known and recognized terrorist
89:00
organization
89:01
so forget hate speech for a moment a
89:03
terrorist organization that believes in
89:05
actually killing civilians and Hamas a
89:08
known and recognized terrorist
89:10
organization that believes in bombing
89:12
babies on buses as a form of resistance
89:14
they still have accounts on Twitter and
89:18
my point is is that this is the this is
89:20
the blind spot you know that and I’ve
89:22
flagged Twitter about this on many an
89:24
occasion this is the cultural blind spot
89:26
this is the digital blind spot that the
89:28
dude sitting in California in wherever
89:31
who is monitoring this stuff and it’s
89:33
probably more than one person they don’t
89:35
give a shit that there’s some Brown
89:37
person in the Gaza Strip that believes
89:40
it’s okay to kill Jewish babies they
89:42
don’t give a shit because it’s a brown
89:44
person saying it in the name of Islam
89:46
what they care about is a non-violent
89:49
guy who says stupid things because he's
89:52
white called Tommy Robinson in England
89:54
or Milo Yiannopoulos saying stuff that
89:56
obviously touches their
89:58
sensitivities and it’s so intellectually
90:00
lazy to flag that immediately and to bar
90:02
it from social media because you’re
90:04
comfortable with it you recognize white
90:06
supremacy it doesn’t take any effort to
90:08
recognize it you don’t have to invest in
90:09
studying this stuff to know what white
90:11
supremacy is it takes a bit of effort to
90:14
study brown people’s ideas that you’re
90:16
unfamiliar with and recognize here’s a
90:19
terrorist organization that’s freely
90:20
operating on social media I know
90:22
specifically on Twitter
90:23
I’ve actually pulled up their handles I
90:25
think one of the concerns that Twitter
90:27
has and I think this is a valid concern
90:29
is that when you have people there
90:30
saying hateful things and you have
90:32
people that are saying whether it’s
90:34
white supremacy or whatever even if it’s
90:35
stupid yeah
90:37
problem is there’s a rallying cry of
90:39
trolls that follow behind them and it
90:42
builds up momentum and it gets pretty
90:44
stunning and that was what was happening
90:45
with Milo and by silencing Milo off
90:49
Twitter they have essentially removed
90:51
him from the public discourse you don’t
90:53
hear about him that's right because of
90:55
this because of these things but imagine
90:58
what that does in Arabic with the
90:59
terrorist groups yes but there’s there
91:00
everything you’ve just said by the way I
91:02
agree with and multiply that for groups
91:05
that have infrastructure in multiple
91:06
countries with actual organizational
91:09
hierarchies and planned means of
91:11
distributing their ideas across entire
91:14
populations physically fighting in Wars
91:16
right now such as Hezbollah in Syria
91:18
killing Sunni Muslim rebels you know and
91:20
so imagine that and the and the way
91:22
you’re able to rally a mob in Pakistan
91:24
on blasphemy as an example all it takes is
91:27
for some person on social media to
91:28
accuse another person of blasphemy and
91:30
they’re probably gonna get killed the
91:31
very next day and it happens all
91:32
the time but but because these
91:34
californian based social media companies
91:36
are unaware of of the of the cultural
91:39
implications of those sorts of
91:40
organizations and groups and listed
91:42
terrorist groups mind you they are
91:44
there's completely no barring of any
91:46
of their activity there’s also the same
91:48
thing that you have with YouTube and
91:50
with a lot of these other social media
91:52
organizations and companies is they
91:54
don’t have to respond or give you any
91:57
reasons they can say it violates our
92:00
Terms but what are those terms those
92:02
terms aren’t even listed it would be
92:03
vague like no hate speech okay well
92:05
what’s hate speech like what do you say
92:07
like what is what are you what is your
92:09
clear policy what are your guidelines
92:12
how does someone avoid violating your
92:14
guidelines they don’t say yeah and how
92:16
is the president of the United States
92:17
not violating those yeah well
92:20
demonetization is another way that they do
92:22
it they’ll remove the ability to put
92:23
advertising on a conversation that they
92:26
don’t like and it doesn’t have to be
92:28
like my conversation with Douglas Murray
92:29
was demonetized without any explanation
92:32
none zero then we have Douglas his he’s
92:37
yeah but if you've listened to
92:40
the actual context of our conversation
92:42
there’s nothing even remotely remotely
92:44
hateful about it yeah yeah I mean these
92:47
are private companies they’ve got the
92:48
right to to choose whatever policy the
92:50
only
92:50
thing I would expect from a private
92:52
company is to show a consistent policy towards
92:54
these things you know if you don’t like
92:56
hate speech then also ban brown people
92:58
who are also advocating more than just
92:59
the hate speech but actually preaching
93:01
violent terrorism right yeah it’s a
93:03
strange time for this man because it’s
93:06
it’s also a time where it’s you can
93:09
communicate so instantaneously it’s
93:11
fantastic in that regard you can get
93:13
ideas out so quickly but these hubs of
93:17
information like where the information
93:19
gets distributed are they’re controlled
93:22
by people that I don’t think ever knew
93:24
that they were going to have this sort
93:26
of responsibility I don’t think I think
93:27
you’re seeing that with Zuckerberg and
93:29
these trials or the the the speeches
93:32
that he’s given in front of Congress
93:33
like when you see him on television
93:35
talking about it you get the sense that
93:37
this is a guy that never prepared for
93:39
this had no idea this was going to
93:40
happen and then all of a sudden from
93:42
this simple social media platform that
93:45
was supposed to be friends sharing
93:47
photos and just talking about girls yeah
93:50
no sense of yeah put women there’s a lot
93:53
of that you know but I mean – and what
93:55
was Twitter I mean Twitter was
93:56
essentially just you know I mean you
93:57
remember the old days of Twitter it
93:59
would be you would use your name it like
94:03
is doing this like Sam under Sam Harris
94:06
like Sam Harris is at the movies you
94:08
would say that almost as if you were in the
94:10
third person that was the original form
94:12
in which people would use Twitter
94:14
it was weird it was a weird way of
94:16
talking and then people started just
94:18
writing what they thought yeah and it
94:21
just became and then became ideology and
94:24
then it became sharing links sharing
94:27
links and interesting articles is a big
94:28
part of it but to me that’s the only
94:31
good part of it now like I got like I’ve
94:34
just discovered that and that was
94:37
most of my attachment to it I genuinely
94:40
use it as a curated news feed
94:43
because I follow interesting people they
94:45
say they tweet interesting stuff and I
94:47
and I consumed it that way but noticing
94:50
what’s coming back at me in the at
94:52
mentions so I put something out you know
94:54
with my podcast and then I look to see
94:56
how it’s being received on Twitter and I
94:58
don’t tend to do that in other forums I
95:01
don’t really look at facebook comments
95:03
much
95:03
I don’t look at YouTube on YouTube it’s
95:06
just a cesspool right I mean so so even
95:08
there for you the comments are
95:09
horrible nastiness started on the
95:17
youtube comment threads and then
95:18
spread everywhere else very strange but
95:20
so I but one thing I found that you you
95:22
can change that your settings in Twitter
95:24
where you you screen out people who
95:27
don’t have you know just have Twitter
95:30
egg photos they don’t have a real photo
95:31
you can screen out people who haven’t
95:33
had their email confirmed and I think I
95:36
just did those two things and like 90%
95:39
of the hate went away it was amazing
95:41
like it just just doing that thank you
95:45
for that you should do that except I think
95:47
it’s better to not actually even look at
95:49
what’s coming back at you
95:51
well you’ve taken it off your phone no I
95:52
think so too I think looking looking at
95:55
it – virus my wife Rachel will be very
95:57
happy with that I think she’d probably
95:59
wish that I deleted it do you tweet on it
96:02
too do you think I don't react sometimes
96:05
I like to think I don’t react in this
96:07
way but I mean I can’t I can’t say that
96:09
cuz actually probably I have sometimes
96:10
but but you know I get all that same
96:12
kind of I get it’s interesting because I
96:14
took I took a stance on the Syria
96:16
strikes and that’s your stance well I
96:18
just think that um especially now in
96:21
hindsight where there were no casualties
96:23
involved at all there were only three
96:25
injuries I think we had to take a stance
96:27
that succeeded where Obama failed in in
96:31
making sure that redline was maintained
96:32
that the use of chemical weapons cannot
96:34
be tolerated even if it was symbolic
96:36
even if it was highly symbolic I think
96:38
sometimes symbolism is important so I
96:40
took that stance and it didn't get a lot
96:43
of love on Twitter oh yeah of course
96:44
because it’s actually that’s against the
96:45
grain public opinion at the moment
96:47
was against the strikes and I fully
96:48
acknowledge that when I took the stance
96:49
right but I argued a case and I set the
96:52
case out both on my Sky News show I
96:54
co-host a show called The Pledge and
96:56
also on my radio show on LBC I
96:58
repeatedly argued for why I think it's
97:00
important that we don’t allow for
97:02
chemical weapons and their use to become
97:03
normalized in our world and so it was
97:06
interesting because I posted the sky
97:07
news clip of me sort of talking to
97:09
camera about my reasons for this and and
97:12
I have this screen grab of the reaction
97:15
it’s just a puddle of blood so it is the
97:19
two extremes completely they actually
97:21
started fighting with each other about
97:23
who's right about me so I've said look
97:25
here's a clip of why we must
97:26
intervene and here's what came up
97:27
after that blah blah the first one is a guy
97:29
with an actual swastika Nazi symbol on
97:31
his profile and it says you know that
97:34
Nordic Scot as his handle Thomas James
97:36
he says Majid wants Britain to intervene
97:38
in Syria because Putin and Assad are
97:40
kicking his Isis buddies arses end of
97:42
story right so there’s a guy is
97:43
basically saying what my real reason for
97:45
calling for that is because I’m
97:46
supporting Isis against the Assad regime
97:48
the guy immediately after responds to
97:51
him and it’s called at last oh right and
97:53
he says what are you on about you Nazi
97:55
dumbass Maajid is funded by you know
97:57
he's a far-right Uncle Tom that captures it but
98:05
they’re arguing with each other over
98:07
whether I'm in his camp or his
98:09
camp basically you either have the worst
98:12
publicist in the world or the best one
98:14
so I should I think I should take myself
98:15
out of that equation let them find each
98:17
other it would be even better really
98:18
that’s the move just set something like
98:20
that up set the far right and the far
98:22
left against each other and you could
98:23
just like sneak away while they’re
98:25
fighting yeah that’s how nuts it is I
98:28
mean the kind of horseshoe you know the
98:31
extremes are I mean they’re equally
98:36
irrational and the fact that you could
98:38
be at the epicenter of
98:42
both of their problems yeah you’re
98:43
a covert jihadist and you're an
98:46
anti-muslim bigot it seems like there’s
98:48
more conspiracy theories in in terms of
98:51
like what someone’s actual motivation
98:53
for what they’re saying now than ever
98:54
before to because it’s so easy to
98:56
express them so someone could say no you
98:58
know he’s far-right or no you’re you’re
99:00
you’re just trying to support Isis yeah
99:03
like this this is this ability to like
99:05
find some nefarious reason for your
99:07
actions but again it’s reducing one’s
99:09
opinion to the basest you know
99:12
dodgy motive as opposed to applying the
99:14
principle of charity so if Joe says
99:16
something now I can either sit here and
99:18
actually think no I don’t trust this guy
99:20
I don’t respect him and therefore I’m
99:22
gonna reduce his opinion to the worst
99:24
possible interpretation that he could
99:25
possibly mean and then use that against
99:27
him
99:28
or I could continue to ask what you mean
99:30
by that because I’m assuming you’re a
99:32
good decent human being in origin and
99:34
perhaps you mean something that I
99:35
haven’t yet quite grasped and then I’ll
99:36
seek to clarify your own opinion in your
99:38
own words and I think it’s unfortunate
99:40
that many of our conversations today and
99:43
the far left is as guilty of it as the
99:44
far right and they like to think they’re
99:46
not which is part of that righteousness
99:47
that blinds them from seeing they're actually
99:49
committing this very same injustice they
99:51
accuse the far right of committing and
99:52
that is a it’s the same bigotry in in a
99:55
mirror image I call it the bigotry of
99:57
low expectations the low expectations
99:59
they have that Muslims are somehow
100:01
unable to adhere to a common decent
100:02
liberal secular democratic values and so
100:05
it's actually plaguing our conversations
100:07
today if only we were able to strip away
100:10
our ideological baggage in entering
100:12
conversations and and allow for you know
100:14
that honest honest conversation but of
100:16
course we say that and then you try to
100:18
replicate our success on a number of
100:20
occasions and found yourself incredibly
100:22
frustrated well you know unfortunately I
100:26
found the one reasonable person to have
100:28
a fight with well it just seems like
100:30
this is a side effect of this increased
100:33
ability to communicate and that just
100:34
there’s so much noise and there’s so
100:37
much going on I mean it is the most
100:39
fantastic time for the distribution of
100:41
information there's never been a time yeah
100:43
where it’s so easy to distribute
100:44
information in human history it’s really
100:46
crazy but I don’t think we know what to
100:47
do with it and I think that when you
100:49
deal with people who have such rigid
100:51
ideologies and they find this incredibly
100:54
easy ability to express these ideologies
100:56
there’s just so much clashing it’s just
100:59
so much so much noise and nonsense and
101:01
when someone says something that they
101:05
know that they don’t have to back up
101:06
with facts because they know that
101:07
the people who are on their
101:08
position will support it you say the
101:10
right keywords you know white
101:12
privilege whatever you want to say and
101:13
then boom you’re gonna get a whole slew
101:17
of people like those two people in your
101:18
your mentions battling it out with each
101:20
other you’re just like kind of picking
101:22
fights and starting these little fires
101:23
and letting other people go to war you
101:26
know what I think we’ve done and it’s
101:27
again the advent of social media is that
101:28
we I was speaking with my friend Mark
101:31
about this and we’ve democratized truth
101:34
and when you democratize truth in that
101:36
way the earlier thing you mentioned
101:39
about sports
101:40
that sports and your expertise in their
101:42
field if I had come back at you and
101:44
spoke at you with as much authority as
101:47
you claim in your expertise with having
101:50
absolutely no history in that expertise
101:52
whatsoever and assumed that I have as
101:55
equal right to an unresearched claim to
101:59
truth in my opinion as you do and who
102:01
has a lifetime of experience in that
102:04
field therein lies a problem that I am
102:06
arrogating to myself this notion this
102:09
this this kind of belief that my opinion
102:12
though I’ve got of course I have an
102:13
equally legal right to express it but it
102:15
doesn’t mean it carries the same weight
102:16
as your opinion when it comes to combat
102:18
sports and it shouldn’t unfortunately I
102:20
think what’s happened with the and were
102:22
still you could add you’re expressing
102:24
that opinion as a person of color or as
102:30
therefore it can't be criticized by you
102:32
because it's his truth otherwise you're
102:34
racist and that's the key word
102:36
that it’s my truth you know and so the
102:38
problem with that is when you relativize
102:39
truth in that way is that now I can
102:41
speak to you on on an equal footing
102:42
about combat sports which only a mad
102:45
person who hasn’t had that history in
102:46
combat sports would think to arrogate
102:48
to themselves a right to do so but
102:49
social media I think has allowed for
102:51
that to happen I gave a TED talk in
102:54
I think it was roughly 2011 about
102:55
the the dangers of this happening and
102:57
social media dividing us all but I’d say
103:00
now that that’s if I were to pitch that
103:02
TED talk today I did it at TEDGlobal
103:04
if I were to pitch that TED talk today
103:06
it wouldn’t be accepted because it’s not
103:09
something new now it’s it’s now people
103:10
know how social media has divided
103:12
us but back then it was new and
103:15
innovative enough of an idea for
103:17
Ted global to say we want you to speak
103:18
about this on and it’s still up online
103:20
but if people watched it today they’d
103:21
think how on earth did that become a TED
103:22
talk um
103:23
because there was this heady day back in
103:27
you know five six seven years ago this
103:29
kind of hope filled moment where
103:31
everyone thought Google Facebook and
103:33
Twitter and generally social media and
103:35
also tech companies were like the good
103:37
guys that these companies weren’t
103:39
actually companies that they were on our
103:41
side against the corporate world and it
103:43
turns out I think we’ve just hit this
103:44
moment he mentioned Zuckerberg we I
103:46
think we’ve culturally come to this
103:47
moment now where you know I think
103:50
symbolized by his testimony to Congress
103:51
that those that honeymoon period is over
103:54
people now view him I think quite firmly
103:57
and squarely as a CEO of a very rich
104:00
company as opposed to a guy in my club
104:04
that I’m friends with who’s on my side
104:05
against the world you know and that’s
104:07
how Google used to have that slogan
104:09
don't be evil they still have it I mean
104:14
the problem is the the incentives are
104:16
all wrong and I’m sorry I was just at
104:18
Ted and well they give you a sense of
104:20
how far the rot has spread here so I was
104:22
I found myself at a dinner sitting next
104:24
to a neuroscientist who thought that it
104:28
was and this Ezra Klein thing followed
104:32
me around to Ted and I saw because many
104:33
people have listened to the podcast and
104:35
he thought Charles Murray should have
104:38
been physically attacked at Middlebury
104:40
this is a neuroscientist an
104:42
academic you know like an
104:45
impeccable person otherwise I think he
104:48
was after we wound up having a fight at
104:50
dinner over it I think he was somewhat
104:52
chagrined by having expressed that
104:53
opinion but I mean that’s how how
104:55
emotionally hijacked people are by this
104:57
issue and but it’s a that’s incredible
105:03
it was the other thing that’s new this
105:05
is the other thing that's new the left
105:07
advocating for violence this is very new
105:10
yeah yeah I mean I always felt like the
105:12
the left was nonviolent the the whole
105:16
idea behind being progressive like
105:18
non-violence was was a genuine aspect of
105:21
that and free speech yeah two things yes
105:24
those are two things that have been sort
105:25
of stopped that this free speech is fine
105:28
as long as you’re not saying speech that
105:30
I disagree with and non-violence sure
105:33
unless we need to use violence which is
105:35
like and the people that are saying it
105:37
like if you watch these antifa people like
105:40
Jesus Christ the most incompetent
105:41
violent people you’ve ever seen in your
105:43
life these guys practicing there’s
105:53
videos of antifa they had they got
105:56
together and decided to train and
105:58
prepare for violence and so they’re
106:00
doing these martial arts classes they
106:02
have people teach them like holy shit
106:03
like the average high school kid could
106:06
fuck you guys up like this is the most
106:08
ridiculous thing I’ve ever seen in my
106:09
life but it’s almost like they’re they
106:11
realize that there’s not that much
106:14
danger in what they’re doing and they
106:16
can kind of play with danger they can
106:18
play with violence they can put the
106:20
masks on but you know they're not
106:22
in Israel they're not in the Gaza
106:24
Strip they're a bunch of cowards
106:26
there's a guy he went to my
106:27
old university I graduated from SOAS
106:29
before I did my masters at the LSE SOAS
106:31
has been embroiled in a strike at the
106:32
moment as the Students Union has been
106:34
supporting professors who are on strike
106:36
and it’s over pension and pension rights
106:38
the government refusing to
106:40
raise their pension rights and whatever
106:41
and some of the students came out in
106:43
strike far-left students defending the
106:46
professors and they
106:47
formed a ring preventing students from
106:49
attending their classes and a female
106:53
black lecturer wanted to cross the
106:56
strike lines to go in to teach her
106:57
students a white male public school
107:02
educated very very middle class
107:05
protester far left physically attacked
107:08
her he physically attacked a female
107:10
black professor so gone is suddenly gone
107:14
is the white privilege gone is the male
107:16
attacking a female you know gone is all
107:18
of that and non-violence all in
107:20
the name of ideology he legitimized and
107:23
allowed himself to attack a black
107:25
female by the way oh and she was also
107:26
Muslim a black female this white kid
107:36
just attacked her for wanting to teach her
107:38
class this is crazy this is crazy crazy
107:41
world we’re in man this is do you are
107:43
you optimistic about the future yeah I
107:50
say that because it’s going to take a
107:52
lifetime’s work and I don’t think that
107:53
in our lifetime much is gonna change I
107:57
think you know maybe for the next
107:59
generation what is the picture how
108:04
do you conceive of your job at the
108:05
moment and what what is the status quo I
108:08
mean so say for instance Isis the
108:10
Islamic state is sort of fading from
108:12
most people’s memory now I mean there’s
108:14
you know even mine I'm spending much
108:16
less time thinking about it because it
108:18
seems to have been beaten into submission
108:19
so let me tell you a story I
108:20
can answer this question with a story so
108:22
radical which is my autobiography has a
108:23
u.s. publication right in the UK it’s
108:26
Random House Penguin it’s published by
108:27
the biggest publishing house when I came
108:29
to publish in the US I approached
108:32
publishing houses but it was after bin
108:34
Laden was killed and so when we
108:37
approached ten twenty whatever
108:38
publishing houses
108:40
they all said no they said the problem's
108:42
solved they said we think you know we
108:44
wish you’d come to us five years earlier
108:45
but problem solved now there’s not a
108:47
problem anymore and and a bit like what
108:50
you mentioned is sort of your expertise
108:51
and and I I have been consumed by this
108:54
subject all my life and there are a few
108:56
people on this planet that I would take
108:58
seriously on this subject outside
109:01
especially of Quilliam and there are
109:02
other organizations they have some
109:04
really good people but I know them all
109:05
and we regularly speak so I would say to
109:08
all these publishing houses I can assure
109:11
you 100% this problem not only has not
109:13
been solved it’s gonna come back around
109:15
in a far worse way than you can ever
109:17
have imagined this is before Isis came
109:19
along none of them believed me of course
109:22
what then happened my book
109:24
eventually got published by some very
109:26
small publishing house in the US and has
109:28
done quite well for them but the point
109:30
of the story was this Isis came around
109:32
and people were suddenly like oh my god
109:34
where did this come from of course those
109:36
of us who had been monitoring the
109:37
situation knew this was going to come
109:39
back around very very heavily now Isis
109:42
had been pushed back and and this is
109:44
where this story is sort of the point of
109:45
the story is we’ve got to resist the
109:47
temptation to believe the problem has
109:49
been solved because the the organization
109:51
known as Isis which is a bureaucracy
109:54
has been fought back but the ideology
109:56
upon which that organization was built
109:59
is still very much alive and it’s still
110:02
strong um what al Qaeda did while the
110:05
whole world was focused on Isis was
110:07
exploit that opportunity to rebuild and
110:10
regroup and they’ve been rebuilding in
110:12
Syria now they are stronger than they
110:15
have ever been even under bin Laden
110:17
because for the first time in the
110:19
history of that organization they are
110:21
firmly embedded within the Syrian
110:23
population they are genuinely kind of
110:25
viewed by the people that they were
110:27
fighting on behalf of as a grassroots
110:29
resistance organization whereas up
110:31
before that they were seen as a terror
110:33
group that was like a you know just like
110:34
a vanguard they’ve embedded themselves
110:36
in the Syrian population in the Yemeni
110:38
Civil War they’ve embedded themselves in
110:40
North Africa East Africa and in Pakistan
110:42
and they are resurgent and they are
110:45
grooming Hamza bin Laden who is bin
110:48
Laden's son and they're grooming him for
110:50
leadership and and a time will come
110:51
maybe in a couple of months maybe in a
110:53
couple of years where they announce
110:54
Hamza bin Laden as a new leader of
110:56
al-qaeda currently it’s Ayman Zawahiri
110:58
when they do that once their grooming
111:01
has been complete and assuming hamza
111:03
isn’t killed up until then all of the
111:06
fragments of what remains of isis will
111:09
probably rejoin al qaeda under hamza bin
111:11
Laden and you’ll have a stronger than
111:13
ever before al-qaeda organization and
111:15
we’ve got to we’ve got to remember that
111:17
we never expected Isis to emerge al Qaeda
111:19
will come back with a vengeance what is
111:23
the the politics between the remnants of
111:27
Isis and al Qaeda
111:29
well Hamza bin Laden’s succession to the
111:30
leadership solves that problem with
111:32
the Isis guys well originally
111:35
Isis was al Qaeda in Syria
111:37
and they broke away after bin Laden died
111:39
because they had pledged
111:41
allegiance to bin Laden and the new
111:43
leader of al Qaeda Ayman Zawahiri is by
111:45
all accounts a rather uncharismatic and
111:47
you know he’s a he’s a pediatrician he’s
111:49
not really a kind of bin Laden had the
111:51
charisma yeah he's
111:53
Egyptian an Egyptian pediatrician
111:54
from a very well-off Egyptian family by
111:56
the way
111:57
I think his grandfather was an ambassador bin
112:02
Laden clearly had the charisma the
112:04
wealth the presence the looks he had all
112:07
of it
112:07
Zawahiri he doesn't Zawahiri you
112:10
know compared to bin Laden he just
112:11
doesn't you know so the guys that
112:13
broke away from al Qaeda to form Isis
112:15
said to Zawahiri the current leader we
112:17
pledged allegiance to bin Laden we are
112:19
you nothing you’re not our Emir our
112:21
leader
112:22
if Hamza bin Laden comes back in as
112:25
the leader of al Qaeda it solves that
112:26
problem because those remnants of Isis
112:29
have a loyalty to the bin Laden name and
112:31
their bin Laden family and they remember
112:33
what they consider their glory days
112:34
fighting under under bin Laden that’s
112:39
not nice to hear no no the problem has
112:41
not gone away I can tell you that the
112:43
problem and the problem is the ideology
112:45
and it will not
112:46
be dealt with until we deal with this
112:48
ideology and it’s why it’s so dangerous
112:50
too you know there was this awful term
112:52
that I railed against it was so
112:55
frustrating to see under Obama’s
112:57
presidency the US State Department
112:58
officially adopted as their name for
113:01
challenging this problem they adopted
113:03
the term al Qaeda inspired extremism of
113:07
course it isn’t it isn’t al Qaeda that
113:10
inspired extremism it's extremism that
113:12
inspired al Qaeda and it is for the
113:15
purposes of political correctness you’ve
113:16
got this term from the State Department
113:17
officially that we’re fighting across
113:19
the world we are fighting al Qaeda
113:21
inspired extremism my former
113:24
organization Hizb ut-Tahrir a caliphate
113:26
espousing organization that believes in
113:28
their ideal caliphate that gays should
113:30
be killed adulterers should be stoned
113:32
to death
113:33
they were there before al Qaeda and this
113:36
ideology has been there before al Qaeda
113:37
al Qaeda was one of a long line of
113:39
groups that came as a result of the
113:41
Islamist ideology and we’ve got to start
113:43
focusing on the ideology itself not the
113:45
physical groups that spring up from it
113:47
because they can change their name as
113:49
you point out there’s a another layer to
113:52
the ideology that is also that is even
113:54
more well subscribed that presents
113:56
social and political problems so freely
113:58
so as you said there are conservative
114:01
Muslims who don’t support al Qaeda
114:03
they’re not jihadist they can they would
114:05
honestly say bin Laden doesn’t represent
114:07
my brand of Islam but these are still
114:10
people who will who will say that
114:12
homosexuals should be killed that’s nice
114:15
oh so it's like these apparent allies
114:18
against quote extremism can still be
114:21
people with religiously mandated
114:24
social attitudes that just cannot be
114:26
assimilated in cosmopolitan societies so
114:30
people who are and it may be worse worse
114:34
than worse than al-qaeda inspired
114:36
extremism there’s just this notion that
114:38
on the left and and this was this came
114:40
out of Obama’s mouth and it came out of
114:42
Clinton’s mouth and largely why she
114:44
wasn’t president it’s not it’s just a
114:48
generic extremism right so that like in
114:51
the same sentence that you have to worry
114:53
about the caliphate you have to talk
114:56
about abortion doctors being killed in
114:59
the u.s. once every
115:00
fifteen years you might
115:01
remember that President Obama
115:02
refused to use the word Islamist
115:04
extremism Trump has the other problem he
115:06
thinks that like Rumpelstiltskin by
115:08
repeating it enough you’ve solved the
115:09
problem you know but but actually one of
115:12
the elements in which he was correctly
115:13
critical of Obama was and I was at the
115:16
time vocally critical of Obama’s
115:17
reluctance to use the word Islamist
115:19
extremism and we’ve got no problem when
115:23
we talk about you know when we talk
115:26
about white supremacist ideology we
115:28
don’t mean that all white people are
115:30
supremacists you know what we’re doing
115:32
here is actually attributing precisely
115:35
specifically what the ideology is and
115:37
believes in white supremacy and likewise
115:40
Islamist you know it’s important so we
115:42
can identify that ideology still while
115:45
not calling it Islam right so we’re
115:48
still giving a bit of a leeway there for
115:49
everybody else all the other Muslims but
115:52
to call it Islamist extremism is to
115:53
recognize that it’s an offshoot of Islam
115:55
it’s a manifestation extreme or
115:56
otherwise of Islam and thereby we are
115:59
acknowledging that its justifications
116:01
are in Islamic Scripture as well as of
116:03
course a multiplicity of other causes
116:05
grievances and what-have-you but we
116:07
cannot ignore that it also rests on
116:09
justifications that are derived from the
116:11
Islamic Scripture I mean I can cite for
116:13
the Arabic that tells you in the Koran
116:15
itself to cut the hand of a thief or to
116:17
lash the adulterer you know these are
116:20
they all quote the hadith or the saying
116:22
of the Prophet that says kill the person
116:23
that changes their religion this is
116:25
scripture and so of course there are
116:27
other factors involved as well but one
116:29
of the factors that gives rise to this
116:31
is the unreformed scripture that these
116:34
extremists cite and so we have to
116:36
acknowledge that Islam has a role to
116:37
play I often say that you know because
116:40
again under the Obama presidency it was
116:42
frustrating that the common refrain was
116:44
to say that Islam this has nothing to do
116:46
with Islam this is as absurd as arguing
116:49
that the Spanish Inquisition had nothing
116:50
to do with Catholicism he went even
116:52
further at one point didn’t he at one
116:53
point say that not only does this have
116:55
nothing to do with Islam this has less
116:57
to do with Islam than any other world
116:59
religion it was just he bent over
117:00
backwards it's like saying the Crusades
117:02
have nothing to do with Christianity yeah oh
117:04
gentlemen unfortunately I have to wrap
117:06
this up but I really appreciate you guys
117:09
coming on it was
117:11
and your book the book is Islam and the
117:15
future of tolerance and actually
117:17
the one thing we do have to announce is
117:19
we’re going to Sydney and Auckland yeah
117:23
two of us and Douglas Murray and both
117:25
Weinstein brothers we’re gonna we’re
117:27
gonna wreck those towns oh my goodness
117:30
we’re gonna have a podcast a day long
117:32
calm I think you want to use that first
117:33
name because I think okay
117:42
no but great to get both of them
117:43
together in that room yeah those guys are
117:45
awesome
117:45
yeah I’m really grateful to meet both of
117:47
them and you as well thank you guys
117:49
thank you appreciate it
117:54
[Applause]
117:57
[Music]