The Grant Williams Podcast

In 2020, I stoked the fire of my passion for interviewing brilliant people by launching The Grant Williams Podcast. Alongside my own interviews, I added three additional streams, each with a fantastic co-host: The End Game (with Bill Fleckenstein), The Super Terrific Happy Hour (with Stephanie Pomboy), and The Narrative Game (with Dr. Ben Hunt).

The response has been overwhelming. In only five months, these interviews have been downloaded more than one million times, placing the podcast squarely in the top 0.1% of the vast podcast marketplace.

Daniel Schmachtenberger on The Portal (with host Eric Weinstein), Ep. #027 – On Avoiding Apocalypses

Daniel Schmachtenberger: [64:58] ...is this, you know, kind of funny idea of a paperclip maximizer. The paperclip is representative of any widget. So make an AI that basically can do two things: it can optimize the production of something, here a paperclip, and so it can use its intelligence to do that, so it can make more efficient supply chains and whatever; and it can use its intelligence to increase its own intelligence. So you'll get an exponential curve on intelligence, which also then means an exponential curve on its capacity to optimize whatever narrow metric it's optimizing. And so of course, after it just makes increases in efficiency, which are awesome, then it starts making so many paperclips that it needs new substrate to make paperclips out of, and it eventually turns the whole world into paperclips, because it can grow its intelligence to out-compete whoever's competing for those paperclips faster than they can. That's a very short version of it. You're going to say something?
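A minimal toy sketch of the dynamic described above, added purely as an editorial illustration (none of the numbers, names, or growth rates come from the episode): an optimizer that both improves itself and spends that capability on one narrow metric produces the two coupled exponential curves, and nothing in the loop ever asks whether more paperclips are still worth making.

```python
# Illustrative toy model (not from the episode): a narrow optimizer that
# (1) uses intelligence to make paperclips and (2) uses intelligence to
# increase its own intelligence, so both quantities grow exponentially.
# All starting values and rates are arbitrary assumptions for illustration.

world_substrate = 1e12      # hypothetical units of matter available
intelligence = 1.0          # arbitrary starting capability
paperclips = 0.0
step = 0

while world_substrate > 0:
    step += 1
    intelligence *= 1.5                      # self-improvement compounds
    produced = min(1_000 * intelligence,     # output scales with intelligence
                   world_substrate)
    paperclips += produced
    world_substrate -= produced              # production consumes substrate
    print(f"step {step:2d}: intelligence={intelligence:12.1f} "
          f"paperclips={paperclips:16.0f} substrate_left={world_substrate:16.0f}")
```

Under these made-up parameters the loop exhausts the "world" in about fifty steps; the point of the sketch is only the shape of the curves, not the particular numbers.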
Eric Weinstein: Well, just... I guess what I find very bizarre about all of this is that I live in multiple social worlds and intellectual worlds, and in some of my worlds this stuff strikes people as loopy: oh, here comes the stuff about the AGI, that the robots are gonna kill us all. And in some other portion of my world it's like, well, clearly we're on the verge of AGI and that's going to be the existential risk. And this is, in part, to go back to your original point and something that you and I share, a failure, a catastrophic failure, of communal sense-making, right? So what I've claimed is that the revolution that we're in is based around the idea that we don't have what I've called semi-reliable communal sense-making. We can't all agree now, even if it's slightly wrong or maybe even deeply wrong, as to what it is that we're seeing, where we are in human history, what our issues are. And so the first part of this decision tree that goes really wrong is that a lot of people think that we're in great shape.
Daniel Schmachtenberger: Okay, so this is... I'm actually gonna come back to the paperclip maximizer, because it explains why we don't have communal sense-making. If, instead of thinking about an artificial intelligence that can increase its capacity while optimizing something, we think of a collective intelligence, some way that humans are processing information together in a group: a market is a kind of collective intelligence, right? The whole idea of the invisible hand of the market, that the market will figure out what stuff people really want, it's expressed as demand, and then which version of the various supplies is best...

Eric Weinstein: I would say it's an intelligence. It's not a central intelligence, but the idea is that there is kind of an emergent intelligence, an emergent property of this thing, and, you know, it computes things like prices and allocations. If that's what you mean by intelligence, then...

Daniel Schmachtenberger: Oh, yeah. So it is a bottom-up coordination system that does end up having new information emerge as the result of the bottom-up coordination. Okay. And I can take a market as kind of at the center, I can take capitalism at the center, of the more general class of what I would call rivalrous dynamics as a whole, as a kind of collective intelligence, because the thing that wins at the game of rivalry gets selected for, and so there is this kind of learning of how to get better at rivalrous games, learning across the system as a whole: which things win, or which things keep more people believing the thing, which keep people from exiting out of the thing.

Eric Weinstein: That makes sense, yes.
Daniel Schmachtenberger: And so I would say that, if we just take the capitalism part, capitalism is a paperclip maximizer that is converting the natural world and human resources into capital while getting better at doing so. So it goes from barter to currency to fiat currency to fractional reserve processes to complex financial instruments to high-speed trading on those; those are the increases in its capacity to do that. But specifically, now it is an incentive for all the humans to do certain things. So if leaving the whale alive in the ocean confers no economic advantage on me, but killing it and selling it as meat is a million dollars of economic advantage, and if I don't kill it the whale still won't be alive because somebody else is gonna kill it anyway, and then they might actually even use that economic power against me, now I've got an incentive system that is encouraging all the humans to behave in certain kinds of ways. And now not only do we need to kill the whales, we need to race at getting better at doing it, and making better militaries and extracting all the resources. And so I see this as a kind of...

Eric Weinstein: So if I'm understanding where you're headed, what you're saying is that the market is kind of a precursor to an AGI.

Daniel Schmachtenberger: It is a collective intelligence that is eventually self-terminating, in the same way that a cancer is, right? The cancer cells are self-replicating and they're growing relative to the normal cells, but they end up killing the host, which kills themselves.
And so the reason I'm bringing this up in terms of collective sense-making is: those who do the will of capitalism, like those who do the will of the paperclip maximizer, or Moloch, or whatever kind of analogy we want to use here, those who do well at the game of power get more power, and then they use that legislative power, media power, capital power to modify the systems in ways that help them more, right? Those who oppose the system of power also oppose those who are doing well at it, so even though the system is inanimate, the people who are doing well at it are animate, so then they take those people out, which is how we see that Martin Luther King and Gandhi and Jesus and so on died: people who actually opposed the system of power. And so you end up having a system that is selecting for, or conferring more power to, those who are good at getting more power, which ends up meaning it is selecting for, conferring power to, sociopathy.
Eric Weinstein: Yeah. I don't find this part of the argument... well, maybe I'm just stuck somewhere. Okay, let me... I mean, I think I'm on your side, so I want to help make a different part of this case. I think a lot of this comes down to magical thinking because of the non-use of nuclear weapons against humans since 1945. I think that, for one thing, if 9/11 had been a nuclear attack rather than a weird conventional attack, we would know where we were in human history. And by virtue of our luck, and our luck alone, we are completely confused as to how perilous the present moment is, because our luck has been amazing.

Daniel Schmachtenberger: Surprising, yeah.

Eric Weinstein: If you believe that somehow it can't be luck because it's this good, then you believe that there's some unknown principle keeping us safe, and you don't know what the name of that principle is. Maybe it's human ingenuity, maybe it's some sort of secret collective that keeps the world sensible, maybe it's that markets have tied us all together. I don't know what your story is, but whatever your story is, it's wrong, and it's obviously wrong, right? The idea that we didn't have anything like 9/11 and then we had a sudden 9/11-style attack is itself paradigmatic: these are things with which you have no data familiarity. I mean, look, there was no suicide bombing in the modern world before the 1980s.

Daniel Schmachtenberger: And I think this is the point: it's generally more advantageous within a market to believe that markets are good, the world is healthy, and things are awesome.
I'll usually do better in academia if I say academia is good, right, which is a point that you make; if I really criticize it heavily, I'm gonna get less tenure.

Eric Weinstein: This is Peter's point more than my point.

Daniel Schmachtenberger: Okay, yeah. I will usually do better in markets if I say they're awesome, and do better in a corporation if I say it's awesome. And so there is kind of an incentive for optimism about the dominant system if I want to do well in the dominant system, and if I have critiques of the dominant system I'm usually going to do less well in it, which means less power will get conferred to those ideas. And so there's kind of a memetic selection, right? The memes that do well end up being the memes that propagate, but "do well" within the current system.

Eric Weinstein: Well, look, this is why I've called for a return to above-ground nuclear testing, because my belief is that we... you laugh, but I'm not kidding. I mean, if we don't get our amygdalas really engaged with where we are, this magical thinking, which, by the way, I suffer from; it's not something I'm claiming everyone else has. I have the idea that nothing too bad can come. You know, I always ask this weird question, which is: how many foreign nuclear devices are currently on US soil? People always think the nukes would have to come through an ICBM. I'm not at all convinced that that would have to be true. People just don't think about these things, because we've been in such a rare period of time that these things haven't... Like, everybody who's talked in these terms sounds, to me... there's a part of me that says, okay, well, that's the kind of conversation you have on a dorm floor during a bull session; grown-ups realize that something is keeping the world together. Which is funny, right, because it's basically saying grown-ups have bought into magical thinking.
Daniel Schmachtenberger: Exactly, yeah.

Eric Weinstein: And so, by the way, a lot of the people that I think of as being the smartest, most interesting people have not bought into this magical thinking. What has happened is that those people have been pushed out of the institutions that form this sort of weird conversation that I refer to as the gated institutional narrative, and the depopulation of dissenters, really serious dissenters, from inside the institutional complex is one of the defining features of our age to me, but something that you can't get any commentary on, because the commentary you're really looking for is that conversation. So what I just try to do is to show people that, no matter what you do, the gated institutional narrative cannot look at certain very basic facts.

Daniel Schmachtenberger: Right. So you get a very dangerous kind of groupthink there, and even if someone disagrees with it, they have a lot more incentive to not publicly disagree with it within that group; they think only of those people...
Eric Weinstein: I mean, this is why we're doing The Portal podcast, which is, let's be honest, pirate radio, and I don't know why it hasn't been shut down. At the moment, the best thing that the gated institutional conversation has going for it is that all of these interesting people are simply humans, and you can destroy any human reputationally. And so the cheapest thing to do is not to kill anybody, right, but just as somebody starts to accumulate mindshare, the gated institutional narrative goes into hyperdrive and it just starts pumping out fear, uncertainty and doubt; FUD is the major tool for destroying an individual's ability to communicate reality.

Daniel Schmachtenberger: Yeah.
Something I think about is the people who went through what happened in Syria recently, or say the collapse in Libya, or wherever, where you had an actual, pretty developed nation that also didn't expect it was going to go through war and collapse, and then it did. I bet that if we talked to those people, they would have a very different intuition for the state of the fragility here, because they actually have first-person experience that something that seemed really stable, and like it wasn't going to collapse, actually did. And most of us haven't actually been through anything like a collapse in our life, and we don't have a good intuition for things that are only in history books. So this is a place where our intuition by itself... our intuition is informed by our experience, but our experience is very short, right? And if we study past civilizations, one of the things we know, and we can read Tainter on The Collapse of Complex Societies, or other good insights on how civilizational collapse works, is that none of the previous civilizations are still here. That's one of the important things to get: they go through a life cycle, and they mostly collapse for self-induced reasons, and even if someone else overtook them, oftentimes the group that overtook them was smaller than rivals they had fended off previously, because they had already started going through institutional collapse or civilizational collapse. And if we look across all of them, there are some things we can generalize about what leads to civilizational collapse. But I think the difference now, versus any of those other times, is that due to globalization... yeah, in some ways the US and China are different civilizations, but both of them would currently fail without each other. And we don't make our own computational substrate, we don't make our own lots of things, right? They don't do their own fundamental...
Eric Weinstein: Well, but the architects of many of these systems believed that a kind of economic mutually assured destruction was the best way of producing...

Daniel Schmachtenberger: That's been true.

Eric Weinstein: Well, so I was gonna bring up the case of Europe. One of the arguments for the European experiment is that Europe is actually, arguably, the world's most dangerous region: people are very competent, and there are long-standing rivalries and hatreds, and you had some desire to create something that seems impossible, which is a United States of Europe, and nobody was gonna sign up for that. So how do you do that? Well, you back them into a financial union without political union. You give them the ability to issue their own debt but not an ability to print their own currencies, and then you wait for the collapse to come, and then, your hope in this storyline anyway, you create a federation, which becomes a political federation: the United States of Europe is created because of a sovereign debt crisis. And we sort of went through that, which I believe was sort of a sought-after outcome, which is: maybe, hopefully, people will issue debt, and they won't be able to print their own currencies in order to inflate their way out of it, ergo something positive will have to happen. That seems to me to be also a recipe for disaster, in that the architects of these plans seemingly died and everybody's on autopilot, not understanding... You know, I've seen this in terms of certain US policies, where people create a policy for reasons that nobody's really understanding, and a short time later nobody even knows why the policy was created, the real reason, to begin with. Do you see that, in this kind of a world?
Daniel Schmachtenberger: I think that's actually one of the meaningful dynamics in institutional decay and in civilizational decay: a new civilization is formed coming out of a war, or after a migration, or through a famine, or after some really difficult period, and to really be able to build something new took real capacities, what you would call contact with the unforgiving, right? Real empirical capacities.

Eric Weinstein: You just loaded my lingo, sir, a bit.

Daniel Schmachtenberger: Yeah, and I think that's really good lingo, because I can't lie to physics and have it reward me for it, right? Either I can grow corn or I can't grow corn; either I can win at a war or I can't. There's a real situation. And so oftentimes, when we go from non-war time, where the generals are politicians, to war time, where the politician-generals who maybe suck at war start losing battles, and we cycle through looking for ones who are good at it, then we get some who are actually good at war: those difficult situations select for real empirical capacity. But when you don't have those difficult situations, then you're actually selecting for who can do politics best, which means convincing everyone of something whether it's true or not.

Eric Weinstein: Well, this is what I call sharp minds versus sharp elbows.

Daniel Schmachtenberger: Yeah.
And so you have the people who are at the beginning of figuring out how to do some new civilization, and those people had some capacity to be in direct contact with reality and figure stuff out. And then oftentimes what they pass on is the stuff they figured out, but not the psychology in them and the capacities to figure stuff out. So the generator function of the civilizational model is lost, and now we start getting copying errors, and people are hopefully trying to at least copy it earnestly. So now we've got a constitution or a set of laws or a set of market practices or whatever it is, but we don't really understand how we generated that effective thing. And that also means that, as the environment changes, we won't be able to adapt it adequately, and that we're not going to know how to deal with failures of it. So then some people, recognizing that, start realizing that they can do better by defecting on the system and kind of preying on it than by participating with the system, and this is what we think of as corruption, right, where they can start maximizing their own bonus structure or whatever, and so long as it's adequately hidden they can get away with it. And now that collapses the civilization even further. So it goes from loss of generator function, to copying errors, to incentive for internal defection and disinformation. And I think that every civilization has faced this: a loss of intergenerational knowledge transfer, because it's not just the knowledge, it's the generative function of how it was generated.
Eric Weinstein: It's also the case that real knowledge, I think, has become too dangerous to transmit, and real knowledge doesn't know what the social norms are. And, you know, certainly the biological world is so disturbing; there's no corner of the biological world that you can look at and not come away thinking, wow, that's incredibly disturbing. And what we're seeing right now is a situation in which we can't cope with any discussion of biology. Every single attempt to have a real biological discussion, given all of the social issues that it would bring up, immediately ends in madness. I've just seen no ability to talk real biology in public, and so this is the earliest place where I can see: here's a subject of science that actually can't be discussed. I don't have anything in particular in mind; there's just, you know, Bob Trivers's work on parent-child conflict. If we have a beautiful story about how mothers would do anything for their children, and somebody comes along and says, no, it's actually a struggle where mothers want to hold on to their resources because they're gonna have many children, and the child attempts to gain as much resource as possible without regard for the mother, that's so against the Hallmark-card version of motherhood for Mother's Day that we can't have a discussion about parent-child conflict in biology. And that one isn't about gender, it's not about race, it's not about, you know, power dynamics. It just immediately runs into one of our cherished nonsensical points of...
Daniel Schmachtenberger: Or it's that the market is self-correcting, the market is always self-correcting and knows best; that the leading thinkers are all sitting in institutional chairs; that every previous civilization was the Hobbesian "brutish, nasty, short," dreadful lives, and that everything is awesome just in the last little bit because of this system, so don't criticize the system.

Eric Weinstein: So this is the weird thing that I'm finding: you can't start interesting conversations, not only about the pessimism of the impending collapse if we keep this up, but about the optimism of, well, what might we do differently? We can't get energized to actually use this period of time to do something novel and interesting and hopefully...
Daniel Schmachtenberger: So think about this: the definition of "infidel" for kind of a jihadist ideology is anybody that's not supporting the jihadist ideology, the definition of which, to the Crusaders, was kind of a similar thing, right? I have a friend who went and looked at a bunch of the intelligence agency documents in Yugoslavia and some of the Baltic nations that had been declassified after the USSR collapse, specifically regarding how the intelligence agencies influenced the definition of psychiatry and their equivalent of the DSM. And so there was something like their definition of a Diagnostic and Statistical Manual for psychology, which tells you when somebody mentally has a personality disorder, neuroses; you remember the previous definition of female mania during the Victorian period, right, which basically translated to "she had a sex drive," and so that was, like, a mental illness. But their definition of something that translated to schizophrenia: the first symptom was "had negative feelings about the state," and the second symptoms might take a while to show up. And so what I think happens is that the dominant system ends up eating psychology and saying that the psychology that supports the dominant system is healthy psychology, and anything that is dissenting to it is not healthy. It ends up eating spirituality and virtue and ethics and academia and whatever, to basically say the behaviors that support the system are good, so the thinking that supports those behaviors is good, and anything that's dissenting is bad. And it's so easy to see it in the Crusades, or in jihadism, or even in the Victorian time period; it's just very hard for us to see it about ourselves now. But I think that's actually one of these fundamental things in terms of what you're saying, like, why don't we have group sense-making: it's because you have a self-perpetuating system that includes the self-perpetuation of the memes that support the system.
Eric Weinstein: I understand. Well, look, I ask it not because I have no ideas why we don't have communal sense-making. What I'm confused by is why we are not more successful, people in our group, and I mean this in a relatively large and inclusive sense, because you and I come from different corners of this large collection of people. I think people are relatively well-spoken, some of them have fancy degrees, some of them have made money, some of them have become relatively well known for their thinking, and yet that institutional conversation... I always liken it a bit to the difference between wrestling and professional wrestling, where in the institutional conversation you need to know what's going to happen ahead of schedule, so that you can know whether you're going to have that part of the conversation or not, or whether it's going to be the private conversation that we can't talk about versus the public conversation. In this concept I've called the split-level argument, and other people have called motte-and-bailey style tactics, you have some version of the argument that you can make in public, and then you have some other argument that is really governing why you're saying and doing what you're doing. All of these things lead to this very unhealthy situation whereby there is no communal sense-making; there is a gated institutional narrative; it seems to be decaying progressively, year by year; nobody's suddenly coming up to me and saying, wow, I think CNN and Fox are doing a great job. Where's the hope? Where does this get fun, and when do we get a chance to find a portal to a better world, as you see it?
Daniel Schmachtenberger: Yeah. Well, I want to start by saying I think this is important, and I think that you doing this, as a portal to a better world, where you are supporting earnest thinking that is outside of institutional context, maybe heterodox but at least earnest and seeking to be well grounded, and the fact that people are interested in it, is really important. When we come back to the difference between personal incentive and collective incentive, right, you say why aren't we more successful: obviously, it's like, okay, so what is the incentive for someone to agree with us? For the most part, expressing these things would make them do less well at politics and their job, and maybe even their social club, and maybe even at being part of the in-group that they're part of, whether it's the left or the right or whatever it is, because then they would be saying things for which there's almost no in-group they would be aligned with, or a very, very small one. And so you still end up having more selective pressure for the individuals to continue to be part of institutions, even of institutional thought.
Eric Weinstein: It also doesn't make sense. Here's the part that doesn't make sense, and very kind of you to say what you just said. Let's imagine that you have perfect SAT scores, you kept your nose clean your whole life, you've gone to Harvard and Yale, you've got a position where you're commenting as a professor with a column in a major publication. If that person, for example, calls me alt-right... you know, or, I don't know, I mean, I have a Jewish last name, I voted for Bernie. There's some point where that person's self-esteem... I would imagine they would be so embarrassed to put their life's accomplishments at risk by just being obviously, stupidly wrong like that. There seems to be no bottom at the moment.

Daniel Schmachtenberger: Okay, so this is important, about "obviously stupidly wrong."

Eric Weinstein: I understand "obviously stupidly wrong" when your ability to demonstrate your power is to go out in the public square and say the dumbest, most ridiculous, most obviously incorrect thing you can think of, and nobody says a word.
Daniel Schmachtenberger: Yeah. Well, one of the things I find interesting is, if we ask a question like, even, what's actually causing coral die-off? How much of it is temperature versus pH versus nitrogen messing up the phosphorus cycle versus trophic cascades, right? How long do we have before the coral die-off? What are the consequences of that? Really important questions, right? Or what really happened in North Korea, like, why was there such a change just recently, and what are the actual tactical nuclear capabilities that they have? Or how much leakage actually occurred at Fukushima? Or any of these things: nobody fucking knows, and you'll hear different narratives, and you'll hear kind of equally compelling, disagreeing narratives on those. And almost no one has the time or the will or the epistemic capacity to really figure that out. So one point is: the sense-making is actually hard. You have a situation in which a lot of these things are complex enough, and there's so much disinformation, that when people try to actually figure it out, they just get information overwhelm, and then it's very hard for them to continue. So when you're saying "obviously stupid," well, there's a lot of places where people can hold a train of thought that seems cogent enough, even if it's in direct opposition with another cogent train of thought, and just the plausible deniability that it might be one of the true ones, since nobody can really sense-make, seems to be enough. And so this is one of the really tricky things in a world where, if I have the incentive to disinform, at various different levels, and then I have exponential information tech, so I can do exponential disinformation... Now, this is why I say that the system is inanimate. I can give this example; everybody who's seen Tristan Harris's stuff will know this, but if we think about disinformation via the...
Eric Weinstein: Tristan Harris is a mutual colleague. He heads a movement called Time Well Spent, and he's trying to show you that your attention has been effectively weaponized against you, where the big tech platforms are figuring out how to keep your eyeballs on their system, to your detriment, right? Center for Humane Technology; you can see his stuff.

Daniel Schmachtenberger: But I think a lot of people know that news stations, as for-profit companies, have to make money, right? And they make money by monetizing attention: basically, they sell advertising, and the advertisers pay more the more people are watching, for more total minutes. So the incentive of the news station is to make stuff that is both inflaming and scary and entertaining and whatever will engage people to spend a lot of time watching it, and to not say things that would not be to the advantage of the advertisers that can afford to pay for them, right? So they have an incentive to not share really complex, nuanced things that will have most people click off, but to share things...
Eric Weinstein: I see... I don't... I really don't... Let me give you an argument. One of the things that I say, that I think people find interesting, is that I believe the National Academy of Sciences and the National Science Foundation effectively conspired against American scientists and engineers on behalf of scientific and engineering employers. That's a fascinating story. I shout it in the public square. Now, I've been asked four times to the National Academy of Sciences to discuss this, so they are certainly taking this quite seriously; I've talked to the actual people who are involved with this; it is amazingly interesting. You could sell clicks, you think; you could just get advertisers to pay for the clicks on the story. Nobody's gonna run the story. Nobody has run the story in, I don't know, more than 20 years. It's sitting there on servers. I don't believe that this is all being driven by profit. I believe that there is some force that we don't understand that keeps the gated institutional narrative gated.

Daniel Schmachtenberger: Yes, I think profit is one part of it. That's why I say we have to think of profit as one aspect of power, or rivalrous dynamics more largely, because I think government or academia or religious or cultural groups or profit can all influence the nature of narrative and information.

Eric Weinstein: I think there's an economy of shame and terror.

Daniel Schmachtenberger: Say more about that.
97:47
say more about that I believe that the
97:51
real reason that this works the way it
97:53
does is we have not even gotten to a
97:55
very basic point where it is considered
97:59
acceptable to say I want immigration
98:04
restricted now I point this out because
98:08
I think is very funny most people who
98:10
want immigration restricted enjoy food
98:15
from other cultures they they have
98:17
friends who come from other places they
98:20
enjoy travel there’s nothing xenophobic
98:22
about them in general there is zina
98:24
philic yeah and the idea that you can be
98:27
both as inna philic fascinated and
98:29
interested in the world’s cultures and
98:31
want immigration to your country
98:33
restricted and that this is the generic
98:36
position that the average person holds
98:38
this position is a story that appears
98:42
nowhere so nobody has an idea that zina
98:46
philic restriction lists might be a
98:49
plurality or a majority in the country
98:52
because there is a rule that says anyone
98:55
who calls for a restriction of
98:56
immigration must be tarred as a
98:58
xenophobe right and I think it’s time to
99:02
double dog dare the people who are
99:06
keeping this level of discipline they
99:08
say why can’t why is it it impossible to
99:11
be a zina philic restriction Asst
99:13
what I think is is that the economy of
99:16
shame is such that whoever acts first to
99:21
make this point is in such danger for
99:24
their livelihood their reputation that
99:27
they are going to be tarred and
99:28
feathered and why not one of the things
And why not? One of the things that I'm trying to show people is that you can make these points. Now, I can't do this on CNN, but I can do this on pirate radio; this is basically audio samizdat, to take the Russian underground mimeograph movement as a template. We can say things here, but it's only a matter of time before this starts to become problematic to the institutional structure, and it responds by debiting my account: "oh, well, that guy, you know, he seems disgruntled," or, "you know, he seems gloomy and out of touch," and then the fear, uncertainty and doubt campaign starts. And that's what is actually keeping everybody in line. It's not that there isn't money to be made; there's tons of money to be made. What's happened is that it's been too easy to pick off the initial adopters.

Daniel Schmachtenberger: I agree, and I'm curious what your explanation is of how that phenomenon emerged.
100:35
your explanation of how that phenomena
100:38
emerged oh that’s a really so let’s
100:44
really get into it we did have a
100:48
dissension suppression unit inside of
100:52
the FBI which was called COINTELPRO and
100:58
it tried to induce Martin Luther King
101:01
jr. to suicide through a letter from
101:04
Sullivan who was I think number one or
101:06
number two maybe under Hoover this thing
101:09
lived inside of the FBI it probably
101:13
tried to tell John Lennon that he was
101:16
traitorous it tried to humiliate Jean
101:20
Seberg who is a Black Panther supporter
101:22
by planting false information inside of
101:26
mainstream media
101:27
Newsweek in the Los Angeles Times it
101:29
tried to get la cosa nostra to kill Dick
101:33
Gregory the famous comedian and black
101:36
civil rights leader so we did have a
101:39
dirty tricks unit inside of the United
101:41
States that needs to be known broadly
101:45
which was pretty thoroughly investigated
101:48
in the mid-1970s and once we saw that we
101:51
were engaged in his dirty tricks against
101:54
our own people we were kind of shocked
101:57
and flipped out and the economy wasn’t
101:59
in great shape and then Ronald Reagan
102:01
came riding in and I think he pardoned
102:04
mark felt who had been the head of
102:07
COINTELPRO after Hoover but he was also
102:10
deep throat and so you had this very
102:12
strange situation that we got this
102:14
reboot during the Reagan years where we
102:17
went back to some sort of more
102:19
traditional more patriotic imagined
102:22
version of our country and my belief is
102:26
that in part when Bill Clinton decided
102:30
that he couldn’t take yet another loss
102:33
to the Republican Party and was gonna
102:35
start experimenting with republicanism
102:37
inside of the Democratic Party by that
102:40
point we had two parties that more or
102:42
less were two flavors of the same thing
102:45
I refer to that collective as the
102:47
looting party in the looting party the
102:50
neoliberal is the neoconservatives
102:52
sort of intergenerational warfare within
102:58
the country in the US and my take on it
And my take on it is that the common ideology was that profit had to be found abroad, and so you had to loosen the bonds to your fellow citizens, and that's where all of this "the market always knows best," "we need to offshore and downsize and securitize," and what I've called the new gimmick economy came from, so that right now we're waking up from the new gimmick economy having never lived in anything really authentic, unless we're quite old. So my belief is that during that period of time there was very swift retribution for anyone who dissented. Famously, a prominent trade theorist who was talking about the benefits of trade restrictions for infant industries, let's say, apparently got a call from one of the people high up in the field, saying, "oh, you seem to be a very bright young man; it would be a shame if anything happened to your career." And so this kind of idea suppression is the hallmark... well, it is what I think these two generations, the baby boomers and the Silent Generation, may become best known for in the future: that this was a period in which new corrective ideas had to be suppressed because of the fragility of the system. We saw the fragility break out in 2008, we saw how vulnerable we were in 2001, and we see that the whole sense-making apparatus is breaking down from the Trump election. So these have been the three moments when the gated institutional narrative has broken, because it just got overwhelmed by events. But other than that, the key was making sure that people like you, or like me, or like Peter, are not mainstream. The cost of listening to us has to be driven to astronomical levels, so we have to look wild-eyed; we have to... you know, they can't call me uneducated if I have a Harvard Ph.D., which is one of the funny parts of the system, but the idea is that you have to say, well, maybe he used to be smart, but he's gone fringe. So the social cost and similar...

Daniel Schmachtenberger: It's amazing how effective such small amounts of that can be.
Eric Weinstein: Well, it's also just funny. I mean, there's so many hours of audio of us, and I was just astounded, for example, by the number of people who would try to portray, let's say, my brother as right-wing. From my perspective, can you imagine making that decision? You take a guy as far left as Brett, and you're gonna spend your credibility pretending that he's allied with the Nazis? It doesn't even make sense to me, because it's simply, to me, a way to incinerate your credibility. And yet the way the system works is you incinerate people's viability. It's economic warfare: if your reputation is damaged, you can't be trusted. And that's how this enforcement works. So you ask me the question, how does it work to keep this in line? It's that it's trivially easy to destroy individuals, and my question has always been: is there a program, which I have tentatively called "no living heroes"? If you've heard this riff before: Charles Lindbergh, who was not a great human being, almost kept the U.S. out of World War Two. He said, why is this America's problem? And if you think about it, he had self-minted credibility, in that he got into a plane and he flew it over an ocean solo and became a hero, and that level of visibility allowed him to compete with the state. Okay? I think that there was a program after Lindbergh that said individuals should not be able to amass sufficient mindshare to affect the course of government policy. And this is a question in my mind: is there a program that got started that said, we're gonna wait and see if anything starts to bubble up that seems to have integrity, seems to have mindshare, seems to be opposed to our policies, and if and when we find such a thing, it has to be redirected, co-opted, destroyed reputationally, or made ineffectual? And the phrase that I really appreciated that was used about Jean Seberg, who was, you know, one of Hollywood's great leading ladies at the time, was "we have to cheapen her image." This is the federal government talking about cheapening the image of a Hollywood star because she was interested in radical black politics.
Daniel Schmachtenberger: That kind of reminds me of when you were saying that when we look at biology, it's disturbing. When we look at history, too, and we realize that the people who did the Crusades were genetically identical to us, and we think about the kind of civilized way we want to think of ourselves, that we wouldn't do something like have a government try and discredit someone, but then we look at just how we have behaved as people throughout most of history, and it's been pretty draconian through most of it. And I think we're at a time where having it more hidden has been useful, but that doesn't mean it hasn't still been happening.

Eric Weinstein: Well, what's very interesting to me is we go through these two phases. The first phase is, "you think people are still doing that? You have an overactive imagination." Then, when it's discovered, I say, "what, you think that governments don't do this? They've always done this." And I've always watched as people get their cognitive dissonance to zero using two totally different mental strategies. Do you find...

Daniel Schmachtenberger: Of course, yeah.

Eric Weinstein: All right, Daniel, assuming that we are in some sense breaking out of this narrative that's been imposed institutionally, and you're starting to be able to hear new voices, is there an opportunity in some way to start hacking our way into a less rivalrous... Well, let me try it again; sorry, getting a little bit of gas.

Daniel Schmachtenberger: I also had a little bit.
Eric Weinstein: All right, Daniel. So if we agree that there is something a little bit bizarre about the extent to which there's been discipline in this gated institutional narrative, and it's been hard to get a different kind of message out to people, that they need to start exploring new systems of organization, maybe beyond market democracy, who knows: what are the most hopeful systems that we currently have that can be used to build even better systems, and how do we get that message out? Where do you see the hope in trying to confront the real problems we face, to find an exit into our next stage of human development?
Daniel Schmachtenberger: So we've been talking about where there is incentive for disinformation, or information suppression, or narrative suppression; the last chunk of things you were sharing regarding shame is kind of a narrative-warfare tool. A way I think of it: say there was a group that seemed like it didn't have power of one kind; then it tries to find power of some other kind. So you get reconfiguring in-groups competing, with whatever tools they can, against out-groups. But imagine if we could create a situation where there was no incentive for disinformation. I'll talk in a moment about how I think we could do that, and not just no incentive for disinformation, but also no incentive for information withholding. Something pretty unique about humans is how good we are at being able to add intention to signal: lying, but all the subtle versions too. Most of the signal that is coming to me is just bouncing off of stuff and reflecting, and doesn't have that much disinformation in it, and obviously animals have kinds of camouflage and strategies like that, but every time we're communicating, we are usually communicating towards some intention that we have. And so I want you to think certain things; were you to think those things, I think it will advantage me. But then, to the extent that you take what I'm saying as adequately informing you, accurately informing you about reality, it may not be. There's a discrepancy between why I'm communicating to you and what would be of maximum benefit to you. And even if we're not doing spin and Russell-conjugation disinformation, even if it's just IP and trade secrets and information withholding, this lowers our coordination capacity to do interesting things tremendously, and then there's so much coordination cost that goes into the competition. So we say, well, let's imagine... and I think we can say, up to a tribal scale, people could do, I'm not saying they always did, I don't want to be romantic, people could do a better job of accurate information sharing, because there was less incentive to disinform each other inside of a tribe, because it would probably get found out, and we actually depended on each other pretty significantly. But the Dunbar limit seems to be a pretty hard limit on that kind of information sharing.
Eric Weinstein: Check. Do you mean this supposed Dunbar number, that is the limit of our ancestral mind, or group, to track the number of interactions we have? So maybe I can keep track of 200 or 300 people, not much more.

Daniel Schmachtenberger: Yeah, whether it's a hundred and fifty, or 50, or 200, or whatever it is. And, you know, we've attributed this to different things: why tribes never got beyond a certain scale within a certain kind of organization, and if they would start to, they would cleave, and if they were going to get larger they had to have a different kind of organization. One thing that we commonly think about is that it's kind of a limit of care and tracking, right? Up to that number, up to 150 people or whatever, I can actually know everybody pretty well, they can all know me, and if I were to hurt anybody, I'm hurting the people that I've known for my whole life. So something like universal interest of that group, or almost like a communist idea, makes sense if there are no anonymous people and there are no faraway spaces where I can externalize harm; I basically can't externalize harm in the social commons when I know everybody. I also probably can't lie and have that be advantageous. I think there's another thing, which is that there's a communication protocol where anyone who has information about something within that setting can inform a choice the tribe would be making, where that information would be relevant, because they can actually communicate with everybody fairly easily, and if there's a really big choice to make, everybody can sit around a tribal circle and actually be able to say something about it. As you get larger, you just can't do that. And I think there's a strong cleaving basis for not wanting to be part of a group that would make decisions that I'll be subjected to but don't get any say in, unless it's really important to do that: say there's a situation where tribal warfare is starting to occur more often, and so having a larger group is really important, or something like that, in which case the bonding energy exceeds the cleaving energy.
115:23
energy exceeds the cleaving energy but
115:26
let’s say that we could actually have a
115:28
situation where we had incentive to
115:30
share
115:31
– not this inform and to share accurate
115:33
information with each other and it could
115:35
scale beyond a dunbar size I so now we
115:41
have something where we don’t have
115:43
fractal disinformation inside of a
115:46
company we don’t have people competing
115:48
for cancer cures that aren’t sharing
115:49
information with each other I think that
115:51
system would out-compete all the systems
115:55
that we’ve had in terms of innovation
115:56
and in terms of resource utilization
115:59
resource per capita utilization so much
116:02
that if we could do such a thing had
116:04
become the new attractive Basin to which
116:06
civilizations would want to flow and I
116:08
think the limit of Dunbar dynamics were
116:11
communication protocols and I think we
116:14
do have technological capacity and I’m
116:16
be I mean both social technologies and
116:19
physical technologies to develop systems
116:22
and and so like this is kind of at the
116:24
heart of it to develop systems where
116:28
there was more incentive to share honest
116:33
information and obviously this is a
116:34
example of anti rivalries where I had my
116:41
well being in your well-being and
116:43
wellbeing of the Commons more tightly
116:45
coupled to each other yeah that’s the
116:51
first part of it okay so try to figure
116:56
out how to get very large-scale human
117:01
collectives to behave like small scale
117:04
human collectives well it’s yeah if I
117:07
think about two groups of people that
117:09
sounds to me like TripAdvisor where I to
117:14
some country I’ve never been to and I’m
117:16
never going back again and there’s some
117:20
sort of reputational cost that a hotel
117:22
would have had if it had gamed their
117:25
guests so it becomes a bad idea to game
117:28
your guests because you have a
117:31
fractional relationship with the world
117:33
in some sense where somebody has left a
117:35
review it says but you know be careful
117:37
they try to upsell you on the Wi-Fi and
117:41
it’s a scam and here’s how to look out
117:43
for it and suddenly you have got a
117:45
problem if you’re a dishonest actor
117:47
because there is this sort of
117:49
reputational game that is
117:54
technologically enabled yeah so I think
117:57
this is why people like blockchain is
117:59
the idea of an uncorruptible ledger is
118:01
that this information and information
118:04
withholding or would be really benefit
118:06
beneficial to the public and any kind of
118:08
bad acting does less well with good
118:10
accounting systems I have to be able to
118:12
kind of corrupt the accounting in some
118:14
way to be able to have it be
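The "good accounting systems" point is the same intuition behind hash-chained ledgers: if every entry commits to the hash of the entry before it, quietly rewriting history becomes detectable. A minimal sketch using only the Python standard library; the record fields and values are made up purely for illustration:

```python
import hashlib
import json

def entry_hash(payload: dict) -> str:
    # Hash a canonical JSON encoding so any change to any field alters the digest.
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    # Each entry commits to the previous entry's hash, forming a chain.
    prev = ledger[-1]["hash"] if ledger else "genesis"
    payload = {"record": record, "prev_hash": prev}
    ledger.append({"record": record, "prev_hash": prev, "hash": entry_hash(payload)})

def verify(ledger: list) -> bool:
    # Recompute every hash and check the back-links; any edit breaks the chain.
    prev = "genesis"
    for entry in ledger:
        expected = entry_hash({"record": entry["record"], "prev_hash": prev})
        if entry["prev_hash"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

ledger: list = []
append(ledger, {"from": "A", "to": "B", "amount": 10})   # hypothetical records
append(ledger, {"from": "B", "to": "C", "amount": 4})
print(verify(ledger))                    # True
ledger[0]["record"]["amount"] = 10_000   # tamper with history
print(verify(ledger))                    # False: the chain no longer checks out
```

This only makes tampering evident; as the next exchange notes, it does not by itself remove the incentive to game whatever the ledger measures.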
118:15
advantageous and so can we make can we
118:20
make systems that make the accounting
118:21
much better as part of it but it’s not
118:23
the whole basis because then of course
118:26
you still have incentive to figure out
118:28
how to game the game whatever it is as
118:29
long as we still have separate interests
118:31
and the separate interest which is that
118:34
any in group can advantage itself at the
118:37
expense of an out group or any
118:38
individual canta JIT self at the expense
118:41
of other individuals which is grounded
118:42
all the way down to like a private
118:44
balance sheet I do think is an
118:46
inexorable basis of rivalry and I do
118:49
think that rivalry in a world of
118:52
exponential tech does self-terminate and
118:54
given that I don’t think we can stop
118:57
progress of tech I do think we have to
119:00
create fundamentally anti rivalry
119:02
systems and I don’t think you can do
119:03
that with capitalism or that or private
119:07
property ownership is the primary basis
119:09
to how we get access to things I don’t
119:11
think you can do it with communism or
119:12
socialism or any of the other systems
119:13
we’ve had but I don’t think that if we
119:16
look at how the coordination system of
119:17
cells or organs inside of a body works I
119:19
don’t think it’s capitalist or communist
119:22
I think there’s a much more complex way
119:24
of sharing information and provisioning
119:27
resources within the system you know
119:30
this is how the famous anarchist Peter
119:34
Prince Peter Kropotkin got in trouble I
119:36
think he was like kind of an amateur net
119:39
naturalist and he would observe things
119:42
like ant colonies and say look how well
119:45
the ants cooperate and of course he
119:47
didn’t know that it was a
119:49
haplodiploid system where sisters are more
119:52
closely related to each other than to
119:54
the offspring and you had a you know a
119:56
breeding queen and then effectively
120:00
mimicking some kind of body division
120:04
into soma and germ where your somatic
120:06
cells have no possibility of leaving a
120:11
permanent trace of themselves but for
120:13
their ability to aid your germline cells
120:16
that can become a fertilized you know
120:19
egg and embryo I don’t think there is an
120:23
adequate biomimicry example okay and I
120:26
think there’s an important reason why is
120:27
I think that technology creation is
120:30
something that we don’t see happen in
120:32
nature anywhere else and of course
120:34
animals will use a tool but they don’t
120:37
evolve better tools or to develop better
120:40
tools the way that we develop better
120:42
tools and the distinction of technology
120:46
creation or tool making as a process by
120:48
which new stuff comes to exist as
120:50
opposed to evolution as a process by
120:52
which new stuff comes to exist is at the
120:54
heart of a lot of the things that I
120:56
think about here because I think it
120:58
fundamentally changes our thinking on
121:02
like social Darwinism and why markets
121:04
are kind of a viable or an inexorable
121:07
idea is if we think about evolution as a
121:10
process
121:10
by which new things come about defined
121:14
by mutation survival selection and then
121:17
mate selection within an environmental
121:20
niche and of course there’s recursion
121:22
on niche creation in in evolution we see
121:29
rivalry everywhere as you are mentioning
121:31
in like biology there’s a lot of really
121:34
painful things to look at and I think
121:37
we’ve especially since Darwin modeled
121:40
ourselves as apex predators for a long
121:43
time but I think that we
121:47
actually even reified the theory of
121:49
markets with evolutionary biology to say
121:51
that demand is like a niche and that the
121:54
various versions of a product or a
121:56
service are like mutations and the
121:59
company that survives because it’s able
122:01
to supply the demand well those ideas
122:04
and those technologies make it through
122:07
and then if there’s a couple that are
122:09
mutually good where merging would be
122:10
good so you get a merger and acquisition
122:12
that’s kind of like mate dynamics right
122:14
like recombination dynamics and this
122:18
is why competition is good and drives
122:20
innovation and same as happens in nature
122:23
I think that’s kind of the way that a
122:25
lot of people think about markets and
122:27
relationship to evolution and I think
122:29
the reason we can’t think about it that
122:30
way and also the reason why we don’t see
122:34
whether it’s ants or whether it’s cells
122:36
in the body or anything why we don’t see
122:39
examples of the kinds of coordination in
122:41
nature that will apply to humans is I
122:44
think that the development of technology
122:47
both language and social coordination
122:50
technologies and physical technologies
122:52
but our capacity for abstraction and
122:54
then things that increase our power via
122:57
abstraction as opposed to their power
122:59
increases via some instantiated thing
123:01
like a gene is a fundamentally different
123:05
process because in nature you will see
123:09
rivalry you’ll see obviously one if the
123:12
lion catches the gazelle the gazelle
123:14
dies if the gazelle gets away the lion
123:15
might die right and yet all lions and
123:18
all gazelles are symbiotic with each
123:20
other meaning if there were no lions the
123:21
gazelles might eat themselves to
123:22
extinction if there were no gazelles
123:24
lions might die
123:24
so there’s this process by which micro
123:27
rivalry leads to macro symbiosis and
123:30
both of them evolving supports each
123:33
other to evolve as the Lions get a
123:34
little bit faster they eat more of the
123:36
slower gazelles the faster ones genes
123:37
recombine and you get faster gazelles
123:39
yeah but I mean you know mathematically
123:42
I think the Lotka-Volterra equations is
123:46
this predator
123:47
so very simple predator prey dynamics
123:49
with like let’s say two species I
123:52
understand how that can be stable right
123:55
I don’t understand that in the presence
123:59
of exponential tech I mean they’re not
124:00
okay so the first thing that I’ve
124:02
got trying to be concrete here is that
124:06
maybe something like the technology of
124:09
reputation might allow us to leverage up
124:13
small group dynamics towards large group
124:16
dynamics the idea that I don’t have to
124:19
know you to know something about your
124:21
reputation I see some hope there but
124:26
then it’s open to reputational warfare I
124:29
think reputation systems will be gamed I
124:31
agree right with it look I’m very think
124:35
about Game B yeah not because I don’t
124:39
understand our need for it it’s that I
124:42
can’t imagine the system that gets us
124:47
out of our nature and our nature you
124:52
know rivalry abounds within nature
124:56
cooperation is found everywhere I
125:00
don’t see a way of getting everything
125:04
towards universal disclosure and
125:07
cooperation but I hear you are one
125:10
of the people who was the farthest along
125:12
thinking about how we might pull this
125:15
off yeah and I know I told you earlier I
125:19
have to apologize for a strange night
125:21
that had me not sleep so I’m operating at
125:23
low capacity so I think I’m less clear
125:25
than ideal but no I want to say a little
125:30
bit more because just saying make large
125:32
groups work like small groups is like
125:33
that doesn’t help at all I want to
125:35
actually say a little bit more about
125:36
how we would do that sure and
125:38
specifically why the tool-making thing
125:40
is such a big deal and why the
125:42
biomimicry examples don’t work because
125:44
it specifically then plays into what
125:45
does have to work the mutation pressures
125:51
that are happening in nature are
125:53
relatively evenly distributed across the
125:55
system we think about mutation survival
125:58
selection and then breeding selection
126:00
and so you don’t get a situation where
126:04
one species gets a thousand x advantage
126:08
in a single quick jump independent of
126:11
all the other ones right the mutation is
126:12
only going to be so big and the mutation
126:16
forces that are happening on the lions
126:18
are also happening on the gazelles right
126:20
so they’re all experiencing gamma rays
126:22
or oxidative stress or copying errors or
126:24
whatever similarly so that’s one thing
126:26
and then the other thing is that there’s
126:28
co selective pressures as as the lion
126:31
gets a little bit faster then the
126:33
gazelles end up getting faster because
126:34
the slower ones get eaten and the faster
126:36
genes recombine and so because of the
126:39
even distribution of
126:42
mutation and because of the co-selective
126:45
pressures there’s a certain kind of
126:47
symmetry of power that happens right the
126:50
gazelles get away as often or more often
126:52
than the Lions get them and so you only
126:55
get the situation where micro rivalry
126:57
leads to macro symbiosis when you have
126:59
and also the situation of metastability
127:01
of an ecosystem when you have something
127:04
like a symmetry of power within the
127:06
system a symmetry of power yeah if the
127:10
Lions got a thousand times more
127:11
predatory in one generation they would
127:14
end up eating all the gazelles and then
127:16
going through their own collapse
127:19
as they increase their
127:23
predatory capacity the environment
127:24
increases its
127:28
capacity to respond to the predatory
127:30
capacity symmetrically similarly I mean
127:36
this works up to a point I mean part of
127:40
the problem is that gazelles are not
127:42
the only thing that
127:44
lions dine on right well and furthermore
127:49
you know lions are not the only even
127:53
even if lions are atop some predator
127:55
hierarchy one lion and 20 hyenas is
128:00
not a recipe for
128:03
lion happiness so you have you have very
128:07
complex dynamics with with many species
128:12
interacting and that’s what I mean you
128:14
have meta stability of the whole
128:16
ecosystem not stability because some
128:17
species will die off and other species
128:19
will emerge but you have okay an
128:21
increase in orderly complexity but there
128:23
is a parallelism between lion and lion
128:26
between lion and hyena between lion &
128:29
gazelle right and if there wasn’t you
128:32
would have you wouldn’t end up having
128:34
metastability you’d have
128:35
a runaway dynamic that was unchecked by
128:37
the dynamics of the environment so
128:40
basically the the forces of the
128:42
evolutionary forces that are happening
128:43
are happening across
128:45
the whole system and co-affecting each
128:47
other but with tool making tool making
128:50
didn’t occur for us with a mutation tool
128:54
making was us consciously understanding
128:56
that this sharp rock that maybe a chimp
128:59
would experientially use a sharp rock
129:01
and then use another sharp Rock and
129:02
realize this rock was experientially
129:04
sharper but it wouldn’t understand the
129:06
abstract principle of sharpness to make
129:08
sharper Flint things our capacity for
129:11
abstraction leading to tool making like
129:13
that made us increase our predatory
129:16
capacity radically faster than the
129:18
environment could become resilient to
129:20
our increased predatory capacity and
129:21
that was the beginning of a curve that
129:25
has you know started to verticalize
129:26
exponentially recently but because of
129:29
that tool making we could put on clothes
129:32
and go to the Arctic and become the apex
129:35
predator there in a way that the lion or
129:36
the cheetah couldn’t leave its
129:37
environment we could we could go become
129:40
the apex predator in every environment
129:42
and over hunt the environments and then
129:44
when we would over hunt an environment
129:46
rather than have our population come to
129:47
steady-state we could go move to and
129:50
start over hunting in another environment
129:51
and then figure out agriculture
129:53
that’s super different than every other
129:55
animal and so you don’t have a situation
129:58
anywhere in nature where like a single
130:01
lion could do that much damage
130:02
to its environment but you do have a
130:05
situation where a single person like a
130:06
Putin or a Trump or whatever could do
130:08
massive damage because of technology to
130:11
the total biosphere you don’t have a
130:13
situation where a single cancer cell can
130:16
propagate cancer genes instantly to the
130:18
whole system it’s gonna affect the cells
130:20
around it which have a chance to then
130:21
correct it there’s a lot of corrective
130:23
mechanism so the exponential tech
130:25
increases our leverage so much
130:29
that individuals and small groups
130:34
have the capacity to influence the rest
130:36
of the human space but also the bio
130:37
space in a way that nothing else has so
130:39
there is no example anywhere in biology
130:42
of a system that can that has the kind
130:45
of asymmetry relative to its whole
130:47
environment that we have so yeah if I
130:52
understand correctly I mean the slight
130:56
adjustment I would I would give is that
130:58
orcas get you part of the way there
131:02
because they’re a broadly distributed
131:05
apex predator they occur in southern
131:08
northern seas they have all sorts of
131:09
different strategies the thing that that
131:14
you’re coupling it to which I think is
131:15
very interesting is that nobody has seen
131:18
a 10,000 fold increase in Orca
131:22
efficiency as a predator so maybe
131:24
they couldn’t because as they start eating
131:27
too many of the fish then they can’t
131:29
keep breeding you know I understood that
131:30
point so my my point is that you said
131:34
you were trying to indicate that you
131:36
could just keep changing your
131:37
environment like your clothing becomes a
131:39
microclimate so that you’re able to
131:41
become the apex the polar bear is
131:43
no longer the apex predator of
131:45
the Arctic right and you could make the
131:47
argument that the Orca is not the apex
131:50
predator of the Seas because we’re in
131:52
the seas and I think the example there
131:54
is just to think of an ocean trawler
131:56
with a mile long drift net and the
131:58
number of fish it pulls up compared to
132:00
an orca and you realize that we can’t
132:02
model ourselves as apex predators that
132:04
are competing with others to see who’s
132:06
maximally dominant with that much power
132:07
without either story this is a very
132:09
interesting point and I think the idea
132:13
that we are without precedent
132:16
many of us accept we don’t know of any
132:20
other species that has language ability
132:22
to coordinate the way we do although
132:24
certain social species from African dogs
132:27
to orcas to what have you you know are
132:29
pretty impressive in their ability to
132:30
coordinate in one form or another so
132:32
what I hear you saying is the tool
132:35
use and the extended phenotype if you
132:38
will to use Richard Dawkins’s concept
132:41
like for example these microphones are
132:43
part of our extended phenotype because
132:46
they are tools that allow us to do
132:48
something yeah okay that changes the
132:52
picture and it also ends up introducing
132:55
both a fundamental thing about the
132:57
problem and the solution I’d recommend
132:59
tell me about the solution okay and then
133:01
tell me about the problem I want to have
133:03
well which order would be better
133:05
logically I just I would love to get to
133:07
the positive uplifting yesterday so so
133:16
we can say that what’s particularly a
133:19
primary thing that’s particularly unique
133:21
adaptively about homo sapiens
133:23
yeah is our capacity for technique right
133:26
our capacity for tools and that’s social
133:28
tools like language and democracy and
133:30
but also physical tools and they are all
133:33
abstract pattern replicators rather than
133:36
instantiated pattern replicators right
133:38
so it’s memes rather than genes so you
133:41
could say that what humans selected for
133:44
our genetics selected for memetics our
133:46
genetics selected for radical
133:48
neuroplasticity and the capacity to have
133:51
much more significant software upgrades
133:54
that could change our capacity without
133:56
needing hardware upgrades and so and I
134:01
would argue that this is partly why we
134:04
have such a long period of neoteny right
134:07
why we have such a long period of being
134:08
totally helpless on the outside is
134:10
because I’m gonna give up on trying to
134:14
get you to redefine the words that are
134:15
going to cause people to have to go to
134:17
their dictionaries I think one of the
134:19
one of the things that I’ve said about
134:21
this podcast is that we may mis
134:25
speak we may use language improperly but
134:28
we should at least play with it and get
134:30
people to look things up on their own
134:31
hey I only was doing that because I
134:34
listened to the half-hour thing that you
134:35
said did say you’re gonna let people go
134:37
look at their dictionary yep so I bring
134:39
this rule you’re playing me against me I
134:42
love it
134:43
all right so an extended period of
134:45
neoteny go on yeah so we’re embryonic on
134:48
the outside meaning we’re helpless for a
134:49
super long time compared to anything and
134:52
obviously there are some animals like
134:53
birds that are more helpless than other
134:55
ones for longer periods but nothing like
134:57
us but if we came hardwired in how
135:00
to be fit to our environment that wouldn’t
135:03
make any sense because we change our
135:04
environment so fast most creatures
135:06
emerged evolved to fit an environmental
135:09
niche but as niche creators as
135:12
significant as we are both because we
135:13
moved places then you know like this is
135:16
not an evolved environment and it’s not
135:18
that adaptive for me to throw Spears but
135:19
I do need to be good at texting so we
135:22
had to come to be able to learn language
135:24
whether I’m learning English or Mandarin
135:26
whether I’m learning spear-throwing or
135:28
texting or whatever and so what I would
135:31
say is that essential to human nature is
135:34
the depth of nurture capacity relative
135:37
to other species and so when I look at
135:40
the thing we call human nature I look at
135:42
how much I think the social sciences
135:44
don’t factor that there is ubiquitous
135:46
conditioning that we’re doing the social
135:50
science within that ubiquitous
135:52
conditioning and there are outliers that
135:54
are actually relevant that aren’t just
135:56
genetic all right so if I understand you
135:58
correctly and now we’re gonna just
135:59
totally geek out we are the most K-
136:02
selected of species that is that we put
136:04
the largest investment into our young we
136:09
delay reproductive maturity for 12 times
136:13
around the Sun seems crazy and therefore
136:16
your point is we’ve got an unparalleled
136:20
opportunity for teaching for adaptation
136:23
because we unlike the wildebeest who has
136:25
to be more or less ready good to go
136:27
almost from the moment of birth minutes
136:29
Yeah right
136:30
the idea is that we are in the luxurious
136:32
position of having a long period of
136:37
development and knowledge transfer
136:39
because we are more about the extended
136:41
phenotype and we look at this anthill
136:44
it’s pretty amazing yeah so what that
136:46
tells me is I look at some outliers on
136:49
both sides of the bell curve of various
136:52
dimensions of the human condition and
136:54
let’s say we take Buddhism for instance
136:57
we have something like three millennia
137:00
of 10,000,000 fluxing give or take
137:03
people who mostly don’t hurt bugs across
137:08
different bioregions and across
137:10
different languages and that’s really
137:13
significant when we think about the
137:15
inexorability of violence in humans and
137:17
then we look at say the Janjaweed or
137:20
some group of child soldiers where by
137:22
the time someone’s a teenager they’ve
137:24
all hacked people apart with machetes I
137:25
think that the human condition can do
137:28
both of those human nature can be
137:30
conditioned to do both of those but then
137:33
I see that we have a system where in
137:35
general as soon as a tribe figured
137:38
out as soon as a couple tribes were
137:42
competing for resources it was generally
137:44
easier to move than it was to war until
137:48
we had moved everywhere in which case it
137:50
was it started making sense to war and
137:53
then as soon as any tribe militarized as
137:55
every other tribe has to militarize or
137:57
they lose by default
137:58
and the game of power has begun in in
138:01
earnest in that way the human on human
138:03
game and I think we’ve seen that the
138:07
peaceful cultures largely got killed by
138:10
the warring cultures and the warring
138:12
cultures learn from each other how to be
138:14
more successful at it and so the thing
138:17
that we have now is something that has
138:21
emerged through iterations on power
138:24
dynamics and it’s conditioning everyone
138:26
within it and then we do all of our
138:28
social studies within that and say this
138:29
is human nature Wow
138:31
so this is a very weird place to get
138:34
brought back to because I’m I’m on the
138:37
escape branch of our decision tree and
138:39
what you’re talking about
138:41
is possible when you can do better by
138:47
investing in peaceful and kind
138:54
alternatives I don’t know what to call
138:57
exactly but nonviolent alternatives and
139:01
as soon as things become kind of steady
139:04
state zero-sum you start eyeing other
139:08
people those protein sources because
139:10
that’s the way to grow a slice and I
139:13
don’t know how you get out of this in a
139:16
finite world so maybe the idea is that
139:19
you’re you have a concept of escape
139:21
yeah that isn’t physical escape I think
139:24
Malthus was right at the time but wrong
139:29
fundamentally mmm-hmm where he said
139:32
humans are reproducing geometrically
139:34
and resources are reproducing
139:36
arithmetically so there’s
139:38
either not enough or there’s not going
139:39
to be enough at a certain point
139:42
well he hadn’t got to the point that some
139:43
cultures went into negative population
139:46
growth and lower birth rates without an
139:49
imposition it’s not just China’s you
139:51
know one-child imposition that did that
139:52
but we’ve seen birth rates low enough in
139:56
some of the Nordic countries and in
139:58
Japan and he hadn’t got to the point of
140:00
seeing the phenomena that bring that
140:01
about or the ability to recycle
140:04
effectively and which means not a linear
140:07
materials economy so I’m starting is I’m
140:09
starting to guess where you’re gonna go
140:11
so if I if I understand you correctly
140:13
the idea is that you’re going to look at
140:16
all of the places we’ve been a little
140:17
bit sloppy like recycling wasn’t a place
140:20
that we put much attention and
140:24
increasingly as we understand that stuff
140:27
is limited we we have more of a reason
140:31
to be careful about our land use
140:34
rare resources I think I get that part
140:37
of it then you have another idea here
140:40
about development is kind of unused and
140:45
we could do something far greater and
140:48
then you just had another one that’s
140:50
slipping my brain population oh that we
140:54
would start to see fertility below
140:57
replacement rates so that you would
140:59
actually go into population decline as a
141:01
means of taking pressure off of the
141:03
system yeah so I see the possibility for
141:06
a steady-state population
141:09
that is within the carrying capacity of
141:11
a closed-loop materials economy but that
141:15
is fueled by renewable energy so you
141:17
basically have a finite amount of atoms
141:20
so you circle the atoms you don’t have a
141:22
finite amount of energy because you’re
141:23
getting more energy every day albeit a
141:24
finite amount per day and so you have to
141:26
be able to cycle the atoms within the
141:28
energy bandwidths and you’re cycling it
141:30
from one bit pattern into another bit
141:32
pattern right like from one form into
141:35
another form and the forms are stored as
141:37
a bit so you have atoms energy and bits
141:39
and you don’t really have a limited
141:40
number of the bits that you can have and
141:42
so we can have an economy where it’s
141:46
getting continuously better but not by
141:48
getting bigger but by getting better we
141:51
continuously make more and more
141:52
interesting things with the same format
141:54
listen you’ve always had the possibility
141:55
of decoupling economic growth from let’s
142:02
say burning fossil fuels we just haven’t
142:06
gotten around to doing very well okay
142:09
what I’m starting to hear is that you
142:14
believe potentially that maybe we should
142:16
embrace declining populations as a means
142:20
of either and I don’t put words in your
142:24
mouth but I’m just trying to guess ahead
142:26
one possibility is is that we need to
142:30
amplify the people who can live
142:34
peaceably and that maybe the idea is
142:36
that people who can’t live peaceably
142:38
need to be incentivized to maybe have
142:43
fabulous somatic lives but without
142:45
reproducing I don’t know so that we can
142:47
drive certain traits towards zero maybe
142:51
the idea is we just need to take ambient
142:53
pressure off the system and so we need
142:55
to go into a world where eight billion
142:57
becomes six billion becomes one billion
142:59
and we start dropping down again I think
143:04
we see that obviously birth rate is
143:08
higher where there’s poverty and we
143:09
might lose some kids right and so as we
143:13
just get out of abject poverty birth
143:14
rates go down and then as total economic
143:19
quality of life and the choice
143:23
abilities for women and education and
143:26
other things go up we start getting to
143:29
much lower birth rates and no I’m not
143:32
concerned that the birth rate will just
143:34
collapse forever we’ll come to some
143:36
steady-state birth rates but those are
143:41
happening as a function of increased
143:43
good things increased quality of life so
143:46
in other words if you make the
143:47
opportunity cost for childbearing
143:50
enormous by making sure that let’s say
143:53
females have outrageously great career
143:56
prospects and it starts to become much
144:01
more fulfilling that she doesn’t wanna
144:03
spend her whole life pregnant well
144:05
look I mean there’s different issues
144:08
with women not realizing that most of
144:11
their children will survive which is
144:14
happening in the demographic transition
144:16
so people miscalculated for a period of
144:19
time leading to fears about runaway
144:21
population booms so that’s that’s one
144:24
effect and then there’s another one
144:26
about if you give people education if
144:29
you give if you educate women the
144:31
opportunity costs of staying home and
144:33
raising children starts to impress
144:36
itself and so people will have fewer
144:37
children yeah but I think where you’re
144:41
where you’re headed is super interesting
144:43
and part maybe it’s one of the reasons
144:45
that people might find it rather
144:46
disturbing making life awesome for
144:50
females might mean having far fewer
144:53
children yep all right so that’s a good
144:57
thing in Schmachtenberger’s world well yeah
144:59
this is so both I mean the Malthusian
145:02
trap right the Malthusian situation is
145:05
both the geometric
145:08
reproduction of humans and the
145:10
arithmetic reproduction of resources and
145:12
I think neither of those are true
145:14
inexorably true I think we can keep
145:17
cycling the resources and so basically
145:20
we can have a steady-state human
145:21
population within a renewable materials
145:25
economy carrying capacity but
145:29
we keep innovating on
145:31
bits so we keep making more and more
145:33
positive and interesting things so we
145:35
keep getting an increase in quality of
145:37
life but not by increasing the
145:39
quantity of the pie and the quantity of
145:41
people consuming it but the quality of
145:43
it well in the world of atoms I can’t have
145:44
Bill Gates’ home in Washington State but
145:47
in the world of bits maybe I can live
145:49
there in my virtual reality and even
145:54
have much more fantastic places and so I
145:57
agree that bits have some ability to
146:01
create wild abundance that goes
146:03
non-rivalrous but I brought up a very
146:05
different concern which you may be
146:07
familiar with which is abundance can
146:09
kill you if you have if you look out
146:12
these windows and you see all of these
146:14
people engaged in activities without
146:16
being told to do so by a central
146:19
authority
146:19
what is it that ties that together for
146:23
the most parts markets with some amount
146:25
of state control of violence in the form
146:28
of policing okay so now you create
146:32
abundance an abundance has this weird
146:34
effect that it turns private goods and
146:36
services into public goods and services
146:38
where price and value are no longer
146:40
equal and suddenly you have people who
146:43
are producing things that are very
146:44
valuable and can’t get paid right and so
146:48
how do we handle the takeover in this
146:52
hypothetical world where we get to an
146:55
economy of abundance that doesn’t
146:57
actually cause a collapse of
147:00
civilization you can you can die from
147:01
abundance though a market can die from
147:05
abundance but I’m not proposing a market
147:07
society I okay so I like that so the
147:10
idea is that we welcome the destruction
147:13
of the markets to be replaced by and
147:17
it’s important to say obviously if I
147:20
have a situation where valuation is at
147:25
least largely proportional to scarcity
147:27
then I have a basis to continue to
147:29
manufacture artificial scarcity and if
147:31
something becomes abundant enough it
147:33
loses value then of course abundance and
147:36
markets don’t go together I’m very
147:39
excited about any credible thing that is
147:43
better than markets because markets while
147:46
laden with problems
147:49
have been pretty amazing in what they
147:50
produced yeah I’m not gonna criticize
147:52
the illusionary path here to say we can
147:56
argue straightforwardly why this path
147:58
can’t continue why the nature of it is a
148:01
self-terminating trap I agree but the big problem here
148:03
has always been that we have so little
148:04
experience with self terminating our
148:09
rivalrous desires well so this is why I
148:12
bring the Buddhists up all right and I
148:14
think the Buddhists got past one part of
148:16
the Dunbar number if we think about it
148:20
we think of a couple of Buddhist
148:21
countries for our listeners at home that
148:24
they can keep in mind while you’re
148:26
talking about Buddhists mostly don’t
148:30
have countries anymore there are
148:31
Buddhists in a lot of Southeast Asian
148:34
countries so there are Buddhists in
148:37
India there are Buddhists there’s a lot
148:39
of Buddhists in Nepal
148:41
obviously Tibet was Buddhist before
148:43
Tibetan stopped existing in that form
148:47
but and you know I could bring up Jains
148:49
or others but they’re so few of them
148:51
that it’s a little bit easier to throw
148:53
it out as an outlier but basically
148:55
cultures that were widely peaceful but
148:59
it is important to say the widely
149:00
peaceful ones did largely get either
149:03
killed by warring cultures or somehow
149:07
taken over by them or they became
149:09
warring at a certain point and this is
149:10
why your escape hypothesis which your
149:13
escape hypothesis only works if we can
149:16
make a much better civilization but it
149:17
needs to not have proximity to the thing
149:20
to external sources of rivalry so that
149:22
it can develop I want to say you know
149:25
that one of the reasons I keep pushing
149:26
you on these things is not because I’m
149:29
trying to do a gotcha style interview
149:31
the concern let me just be open about it
149:33
is that there are so few people who are
149:38
thinking who are attempting to think
149:39
rigorously about what we actually are
149:42
and what we must become if we are to
149:46
have a long term future and I
149:51
believe that you are somebody who is trying
149:53
not to flinch when it comes to a
149:56
description of how we got to this place
149:58
from the arms race that is red in tooth
150:00
and claw we’ve called nature
150:02
and yet your point is maybe we can hack
150:05
ourselves into a situation with the
150:08
future where with exponential tech as
150:10
you call it we don’t have a future and
150:12
here is the basis for rigorous idealism
150:16
and hope and so that’s what I’m trying
150:18
to tease out no great yeah I don’t think
150:22
that we are inexorably rivalrous can we
150:28
take this weirdly into the the realm in
150:31
which it is hardest to imagine that we
150:34
are not rivalrous we sense sex as the
150:37
precursor to reproduction the floor is
150:41
yours sir okay
150:43
this is going to make the conversation
150:44
weird no no I’m look I think that where
150:49
you’re heading let me rephrase this
150:56
every branch of the decision tree has
150:58
gotten hyper weird and anybody who’s not
151:01
looking at the fact that there is no non
151:03
weird branch of the decision tree is
151:06
missing the story of who we are and what
151:08
time it is in human history so I think
151:11
to not explore the weird to not dream
151:15
about what might be is the least
151:17
responsible least adult thing we can do
151:22
if we don’t dream and we don’t explore
151:24
the weird we’re doomed
151:25
yeah all right with that the floor is
151:28
yours okay
151:34
I wanted to go somewhere with Buddhism
151:36
and why we’re not inexorably rivalrous and
151:38
that then if they were to actually get
151:40
to the other side of the Dunbar number
151:42
which is not just getting care beyond
151:44
the Dunbar number which they could do
151:45
through abstract empathy but also the
151:48
ability to calculate and coordinate
151:50
which they couldn’t because they didn’t
151:51
have the tech to do it and I’m basically
151:54
gonna say we can get something like abstract
152:01
oh well okay I’ll do the sex thing and the
152:02
Buddhism thing together cuz they actually go
152:02
together I think we get something like a
152:06
certain level of empathy up to the
152:08
Dunbar number just through mirror neuron
152:11
type effects through the fact that I
152:13
know these people they know me we’ve
152:14
lived together if they’re hurting I am
152:16
gonna see it because they aren’t
152:17
somewhere far away okay and similarly
152:21
I’m less likely to pollute in an area
152:23
I’m in then through an industrial supply
152:25
chain that pollutes somewhere that I’m
152:26
not so just a proximity where the cause
152:29
and effect has a feedback loop as we
152:31
start to get to much larger scales where
152:33
I have a cause and there’s an effect
152:35
but I don’t get a feedback loop on it
152:36
the broken open feedback loop is a
152:38
problem so I think the Buddhists were
152:42
able to train abstract empathy not just
152:45
empathy for the people who I see hurting
152:47
but empathy for all sentient beings
152:49
throughout time and space right feeling
152:52
their connectedness with them that the
152:54
nature of the vows of the Bodhisattva
152:56
and they’re not the only one right this
152:58
is different religions have tried to do
153:02
this but it’s an example of a group
153:03
succeeding at it where they were able to
153:05
have a sense of positive coupling of my
153:09
well-being and the well-being of another
153:11
rather than inverse coupling where they get
153:13
ahead and it’s decreasing my ability to
153:15
get ahead but the other side of
153:18
the Dunbar number was not just who we
153:20
care about but also our ability to
153:22
coordinate and I don’t think they were
153:25
able to figure out coordination
153:26
mechanisms that are adequately effective
153:29
at scale okay I think if we do both of
153:31
those things we can make a fundamentally
153:34
different kind of civilization and
153:38
rivalry mostly comes down to today
153:42
private balance sheets which is I can
153:44
get ahead economically and that money
153:46
equals optionality
153:47
for most of the things that I want
153:49
alright and I can get ahead economically
153:51
independent of you getting ahead and
153:53
even at the expense of you getting ahead
153:55
or the expense of the Commons right and
153:57
so my near-term incentive can oftentimes
154:01
be a long-term disadvantage to others of
154:04
the whole so now this basis of where my
154:07
well-being and the well-being of others
154:08
were the Commons the Delta between those
154:11
is the basis for rivalry but then
154:14
dealing with that rivalry keeps
154:15
increasing coordination costs keeps you
154:18
know creating disinformation systems
154:20
where we can’t coordinate effectively so
154:24
how we deal with the balance sheet part
154:26
there’s a few things right now for me to
154:30
have access to stuff I have to mostly
154:34
with a few exceptions possess the stuff
154:36
right so possession and access are
154:38
coupled and if I possess something I
154:41
don’t have to be using it I’m just
154:43
reserving the optionality to use it the
154:45
drill that sits in my garage that I
154:47
might not have used in a couple years
154:48
but at least it’s convenient that when I
154:49
wanted it’s there right but me
154:53
possessing something means that I have
154:54
access to it and means you don’t have
154:56
access to it and so with a finite amount
154:59
of stuff the more stuff you possess the
155:01
less stuff I have access to a rivalrous
155:03
basis but we all know library type
155:07
examples or shopping carts where if I
155:09
have enough shopping carts of the
155:10
grocery store for peak demand time I
155:13
don’t have to bring my own shopping cart
155:14
which would be a pain in the ass and
155:15
would require 10,000 shopping carts per
155:17
grocery store rather than 300 everybody
155:20
bringing them so what matters is you
155:23
having access to the shopping cart
155:25
doesn’t decrease my access
155:27
and we start to see a potential for this if we think
155:30
about something like an Uber and then we
155:33
think about a self-driving Uber that then
155:35
has a blockchain that disintermediates
155:37
it from being a central company to being a
155:40
commonwealth resource where
155:42
you having access to it doesn’t decrease
155:44
my access so we’re not rivalrous
155:46
anymore but then we take the next step
155:48
and say if you having access to
155:50
transportation then also allows you to
155:52
go to the maker studio that you have
155:53
access to to the science studio to the
155:57
educational places to the art studios
155:59
where you then have more
156:01
access to be creative but the things
156:03
that you create you aren’t creating for
156:04
you to get more money and get ahead
156:06
because you already have access to all
156:08
the things that you want and you don’t
156:09
differentiate yourself by getting stuff
156:11
you differentiate yourself by the things
156:13
that you offer because you already have
156:15
access to stuff so there’s a
156:17
fundamentally different motive structure
156:19
then you having access to more resources
156:22
creates a richer Commons that I have
156:25
access to so now we go from rivalrous
156:29
not just to non-rivalrous which is
156:30
uncoupled but anti-rivalrous meaning
156:33
you getting ahead necessarily equals me
156:35
getting ahead and so on when we look at
156:39
getting out of the Malthusian type
156:41
dynamics part of it is that we can
156:43
actually get out of the population
156:45
dynamics part of it is that we can
156:47
actually get a closed-loop materials
156:48
economy with renewable energy that can
156:50
continue to upcycle and part of it is
156:53
that we can utilize our resources much
156:56
more effectively and much less
156:58
rivalrously where we start decoupling access
157:01
from possession that’ll start easily in
157:03
some areas be harder in other areas but
157:05
we start with in the areas that it
157:06
happens and so we start getting more and
157:09
more of a situation where I want you to
157:11
have access to more things because as
157:13
you’re more creative than I get access
157:15
to more things that are the results of
157:16
your creativity so we’re so this is an
157:21
example of removing some of the basis of
157:24
rivalry associated with balance sheets
157:26
okay I can go to sex underneath that now
157:29
if you want me to usually go where is
157:32
most natural to take the conversation
157:34
okay I will just try to follow
157:46
and the problem is if you go to sex
157:49
directly from where you are you are
157:51
describing the value let’s say of
157:54
prostitution which is that people do not
157:58
have to make a commitment to a sexual
158:02
partner many people can have the same
158:07
sexual partner you start to get into all
158:11
of these very funny areas where status
158:15
for example is a very weird commodity do
158:22
I want you to have more status because
158:24
somehow that will give me more status do
158:26
I stop caring about status if there is
158:29
exactly one parcel of land which has
158:32
unequivocally the best view is that
158:36
something that I want you to have rather
158:39
than me having it yeah so let’s talk
158:47
about status for a moment if I’m
158:53
comparing you and me in terms of who has
158:55
more dollars or who’s taller or who can
158:59
run faster or some I can compare us on
159:02
the same metric right now and if status
159:06
is number of followers on Twitter then
159:09
whatever Kim Kardashian’s the most
159:11
interesting human being that’s ever
159:12
lived and so I I think we know that
159:15
reductionist metrics on status are also
159:18
gamified and inappropriate but if we say
159:21
like M.C. Escher or Dalí like what was
159:27
more brilliant art I think it’s a
159:30
meaningless question because they both
159:33
offered something completely novel to
159:35
the world and something meaningful and
159:37
beautiful that neither of the other ones
159:39
offered or could offer and I can’t
159:41
compare them because I can’t metricize
159:43
them and the reduction of that that’s
159:47
the thing is I can’t reduce totally
159:49
unique things to a fungible metric so
159:51
one of the problems I think is actually
159:53
fungibility and metric reduction
159:58
and so if you have status associated
160:02
with unique things that you offer to the
160:03
world awesome I’m not competing with you
160:06
writ large for more status I’m going to
160:11
people are gonna have a relationship to
160:14
me for the things that I offer and those
160:16
are really the people that I want to
160:18
have a relationship with me and if
160:20
you’re offering things to the world that
160:22
people have a relationship to you for
160:23
and I see that the world is getting
160:26
better as a result of what you’re
160:27
offering and I have access to more a
160:29
better world as a result of it I’m
160:31
totally stoked on that this is where it
160:35
starts to feel not real to me and no
160:38
yeah okay but let’s but let’s go through
160:41
the show here’s why it sounds not real
160:43
alright I think so do we have a slowing
160:50
in technological progress yes and you
160:55
know less so in some areas than in other
160:58
areas but do we still have exponentially
161:03
growing technology in terms of both
161:05
cumulative amount associated with number
161:07
of people in globalization and in terms
161:09
of just technologies that are still
161:11
continue to grow yes of course we do so
161:13
is it 50 years or hundred years we don’t
161:15
know but I really like I have to think
161:19
of this in a kind of a mythopoetic frame
161:21
that’s how it occurs to me is it as we
161:24
as technology is empowering our choices
161:27
and we are getting something like the
161:29
power of gods you have to have something
161:31
like the love and the wisdom of gods to
161:34
wield that or you self-destruct and so
161:37
when I think about I think about the
161:40
rapture story or the Mayan calendar or
161:43
any of those stories in a metaphoric
161:45
sense as just like let’s say you and I
161:48
were in the Bronze Age and we had just
161:49
seen a larger war than had ever happened
161:51
because there were some new better
161:52
weapons that shoot a further distance
161:54
and there were deserts where there
161:56
didn’t used to be deserts because we had
161:58
got new better axes and saws and had
162:00
been able to cut down more trees and we
162:02
just thought about and we said we’re
162:04
still developing better weapons and were
162:05
developing better economic extraction
162:07
tools we’re using our power in ways that
162:10
are
162:11
constructive in a narrow sense and
162:13
destructive in a larger sense but
162:15
everybody is doing that this doesn’t get
162:17
to happen forever so this phase defined
162:20
by increasing power on all sides used in
162:24
destructive ways constructive narrowly
162:26
but destructive broadly that phase comes
162:29
to an end and there’s something like a
162:31
hard fork where if we keep doing
162:32
anything similar to that it’ll come to
162:34
an end cumulatively whether existential
162:36
or catastrophic more likely catastrophic
162:38
right not full everything end but a lot
162:40
and to be able to have that much power
162:43
and not use it in ways that destroy the
162:47
system requires being actually good
162:50
stewards of power so then the whole
162:52
question for me becomes how do we make a
162:55
social system like what is the the
162:57
Bodhisattva engineering how do we make a
162:59
social system that is conditioning not
163:03
just individual humans but also
163:04
collectives to do good choice making
163:07
Omni positive kind of choice making well
163:10
I have to have a sense making system
163:12
that can factor things like
163:14
externalities ahead of time better and
163:16
that doesn’t have things like multipolar
163:18
traps where if anybody is doing the
163:20
fucked up thing that everybody has to do
163:22
it and so I can start to think about
163:25
what architectures such a system would
163:27
have to have to be able to do sense
163:30
making as to what externalities would be
163:32
and be able to internalize them and
163:34
where then I can actually confer right
163:38
choice making and that we’re developing
163:39
humans so again think about the the
163:43
education associated with some religions
163:45
bringing about less violence the
163:46
education associated with some cultures
163:48
bringing about higher average cognitive
163:51
capacity and being able to bring those
163:53
together as much as I know this sounds
163:56
like hippie and silly I don’t actually
163:59
see anything other than a radical
164:02
increase in our good stewardship of
164:07
power it makes it I love the idea that
164:11
you think that there might be something
164:13
here but let me come back it with my and
164:16
again I’m not trying to be negative I
164:18
had the experience at some point that your
164:20
answer requires a warp drive
164:23
so we both recognize the
164:27
unendurability of this thing and then are
164:28
saying okay so what is the fundamental
164:30
thing and something look I’m not making
164:33
fun of you because what you’re saying is
164:34
insane
164:35
what I’m saying is insane and the people
164:38
who are saying the most common
164:40
supposedly adult things are the craziest
164:43
of us all so I at least accept the idea
164:47
that we have to be here and I want you
164:49
on that branch and I want other people
164:52
on other branches because we need to fan
164:54
out and start exploring at least start
164:55
to care but I guess what I what what
165:00
this makes me think of it was a
165:01
particular moment in my life where one
165:06
of my closest friends brought his father
165:09
to dinner and his father was a guy who
165:12
was legendary in the film industry and
165:18
one of the things he taught his son was
165:21
never let the other guy get the first
165:23
punch in and I thought wow first strike
165:28
you teaching your child to strike first
165:34
nobody had ever suggested anything
165:36
remotely like that in all of my
165:38
upbringing I never heard anything like
165:40
this and I instantly recognized it for
165:43
what it was somebody was going to
165:45
parasitize whatever I had been taught
165:49
and say well great Eric’s been taught
165:52
self-restraint Eric’s been taught to
165:54
turn the other cheek
165:56
to make sure that you de-escalate a
165:58
conflict and goodie-goodie
166:01
more for me your multipolar trap right
166:04
okay there’s a way out of it tell me I’m
166:08
dying to hear it so do we retrofit the
166:14
system no impossible foundational axioms
166:16
are all the wrong axioms can we can we
166:22
make a situation in which we can raise
166:24
children quite differently yes go to see
166:27
kids who grew up in an Amazonian tribe
166:30
or you know some very different
166:31
conditioning environment you’ll see very
166:33
different types of human behavior
166:36
can we change already set adults much
166:39
harder not impossible but harder so can
166:45
we could we find adults that are that
166:49
would be the most likely to be fast
166:51
adopters of a new system like this and
166:53
capable so both kind of at the cutting
166:55
edge of their capacity to have abstract
167:00
wide empathy and bind that to their
167:03
action and you know deeply considerate
167:06
about actual cause-and-effect dynamics
167:08
factor complexity and work with other
167:10
people well can we find the ones that
167:12
are closest there and then train them up
167:15
additionally in some systems that are
167:17
developed for how to do a different
167:20
process of collaboration that doesn’t
167:22
lead to one way of talking about it is
167:24
that when we go to command and control
167:27
hierarchy systems to get beyond the
167:29
Dunbar number we get diminishing returns
167:31
on collective intelligence as a number
167:34
as a function of the number of people
167:35
which creates an incentive to defect
167:38
against that system even internal
167:39
defection and so then we get a problem
167:41
if we could get collective intelligence
167:43
scaling linearly we get something
167:45
radically different so we get just the
167:49
number of people that are needed to be
167:50
able to do something like that trained
167:54
to do that
167:54
and we build a civilization a full-stack
167:57
ground-up civilization because obviously
167:59
I’m talking about not private balance
168:01
sheets and private property is the
168:03
dominant system I’m also going to talk
168:05
about not democracy because the nature
168:07
of voting is inherently polarizing to
168:09
populations because we make propositions
168:11
we’re both voting for it and voting
168:14
against it suck for somebody for
168:15
something because they’re based on
168:16
theory of trade-offs where we didn’t
168:18
even tried to figure out what a good
168:19
proposition for everybody would be in
168:21
the first place so better systems of
168:24
sense-making in choice making which we
168:27
could get to and so let’s say you have a
168:30
full stack civilization of people who
168:32
are capable and oriented to implement it
168:35
and you have not only much higher
168:36
quality of life for the people who are
168:38
there but innovative capacity to solve
168:40
certain problems the world can’t
168:42
currently solve well because of no
168:44
disinformation in the system and better
168:46
coordination well then that system can
168:49
export
168:50
solutions that other places in the
168:54
world that would normally have an enmity
168:56
relationship with it actually need that
168:58
they can’t solve for themselves so it
169:01
can create a dependence relationship
169:02
rather than an enmity relationship and
169:04
then they’re like well why the fuck are
169:06
you figuring out these pieces of tech
169:08
and we aren’t we’re like well we figured
169:09
out a better social system and if you
169:11
want it you’re welcome to use it we were
169:12
open sourcing the technology here’s how
169:14
here’s how it works
169:15
but given that the technology as a
169:18
social technology is a social technology
169:21
of how people share information and
169:22
share resources and coordinate
169:24
differently it can’t be weaponized
169:26
because it is kind of the solvent of
169:29
weaponization itself and so any other
169:32
group using it is just now that kind of
169:35
social architecture starting to spore or
169:38
to scale and so yeah I think you get out
169:43
of the multipolar trap by you don’t have
169:46
to win at the game of power against some
169:49
external force to avoid losing at the
169:51
game of power so far if people didn’t
169:56
focus on militarizing they lost to
169:58
whoever militarized and if they didn’t
170:03
lose to whoever militarized it’s because
170:04
they militarized which means their
170:06
culture became a culture that supports
170:07
the ideas of militarization right but if
170:10
I focus on being able to have whoever
170:13
would militarize against me be able to
170:15
offer them things that are particularly
170:17
valuable that are novel to a collective
170:19
intelligence that can do better
170:20
innovation yeah you get out of a
170:23
multipolar trap that way I want to try
170:26
aggregating all the little bits that I’m
170:30
getting from you and seeing whether I’m
170:33
coming anywhere close okay all right so
170:36
the way I’m seeing it is the following
170:38
first of all you’re gonna point out to
170:41
me that there are all sorts of
170:43
interesting things that have not been
170:44
really effectively scaled up so your
170:47
point about Buddhism and Jains and
170:49
what-have-you
170:51
it might be possible to use this
170:54
enormous and luxurious developmental
170:57
period for something radically different
171:01
and that something you haven’t said but
171:03
I’ll throw into the
171:04
mix and see whether you reject it is
171:05
that man’s capacity for self that is
171:11
somatic eradication through fanaticism
171:15
tells you how powerful the software can
171:18
be that you can teach people to die for
171:21
a cause let’s say and which is obviously
171:25
against genetic imperatives no it’s
171:28
obviously against individual genetic
171:31
imperative but the genetics doesn’t work
171:35
at the level of the individual it’s
171:37
obviously against the somatic the
171:39
assumed somatic imperatives it could
171:43
actually benefit inclusive fitness I
171:45
think there’s a very good reason to
171:46
imagine that you actually benefit your
171:49
clan if your deed is known so I know I
171:53
don’t want to get into that but
171:54
fanaticism exists and may be fungible I
171:57
think the Tamils for example probably
172:00
showed us that fanaticism can be used at
172:04
a political level as long as you get
172:06
access to children in Sri Lanka yeah
172:08
okay so access to children was a key
172:12
thing I think so yeah right so the idea
172:15
is that you in effect and I don’t mean
172:18
to put words in your mouth one of the
172:21
lessons of human history is that the
172:23
developed developmental process if not
172:27
used for the traditional Darwinian
172:30
imperative is available for other uses
172:33
and it is of arbitrary power yeah now
172:39
I’m gonna get into the ethics of it but
172:41
first I just wanna get into feasibility
172:43
so first of all there’s an enormous I’m
172:46
gonna keep going back to square zero if
172:48
I don’t get this all right first thing
172:50
is you’re pointing it we’re not on the
172:51
efficient frontier we’re screwing up
172:53
everywhere we could be doing a lot
172:55
better appreciate that next point is
172:58
there are a ton of different things that
173:00
we haven’t really looked at pushing and
173:03
we could afford to push on all of these
173:04
things principle among those things is
173:07
we should be using development for
173:09
something radically different and
173:10
studying cultures which have an
173:13
intrinsically sort of non rival risk
173:16
ethos
173:17
to them to see what have we already been
173:20
able to do and then we can engineer on
173:21
top of that atoms are different than
173:25
bits atoms have a somewhat finite
173:28
feel to them bits feel effectively
173:31
infinite so to the extent that we can
173:33
move things from atoms to bits and not
173:35
be coupled to a market system where you
173:38
have this problem where abundance
173:39
creates public goods and services which
173:42
causes markets to fail but then
173:44
something else succeeds in instead that
173:47
we can start to have abundance
173:49
particularly where we decouple and we
173:51
learn more about recycling
173:53
so that finite resources are much better
173:56
appreciated for what they are that we
173:59
can get to a point where we can start to
174:01
take pleasure in each other’s pleasure
174:03
particularly if somebody’s producing
174:06
something that is extremely positive for
174:09
that society I want to see Jackie Chan
174:10
given more money to make Jackie Chan
174:12
films so I’m not angry about that so now
174:17
we’re scaling up all of these things the
174:20
things that haven’t been noticed hacks
174:22
this that and the other thing I like it
174:26
maybe it’ll buy us some time here are
174:30
the things that really disturbed me
174:31
about it
174:32
one you’d have to grimace I mean I want
174:35
to have you not grimacing but smiling okay
174:38
one is, what is the minimal
174:43
level of violence and coercion needed to
174:46
bring about some of these changes so
174:48
this was something that I brought up in
174:49
my discussion with Peter Thiel and his
174:52
and my sort of somewhat mutual framework
174:55
really I learned something from him but
174:57
I tried to put my own thing back into it
174:59
is take take a beautiful dream ask what
175:04
the minimal level of violence and
175:05
coercion needed to accomplish it add
175:08
that in as part of the cost and ask
175:09
yourself is it still beautiful so that’s
175:11
one of the questions that I would ask
175:12
you and then I get to the issue of
175:19
certain things like lakefront property
175:23
in the atomic world anyway are valuable
175:29
and unique
175:30
and it becomes problematic to imagine a
175:34
world in which all of our previous
175:36
experience was about competing for these
175:39
things to imagine 100% adherence to this
175:47
new way of thinking well let’s go proto-
175:51
pian not utopian let’s go, you know, by
175:54
protopia I mean moving in the right
175:56
direction
175:57
alright let’s say that there are some
176:00
things that are harder to make
176:03
adequately abundant than other things
176:05
but there’s a lot of low-hanging fruit
176:07
that we can start moving and as we do it
176:09
we will get there’s good reason to think
176:11
that there is a basis to do that in more
176:14
areas so in a system where when
176:18
something is more scarce it is worth
176:20
more then if I’m on the supply side of
176:23
that I have an incentive to manufacture
176:25
artificial scarcity and to definitely
176:28
prevent abundance that would debase the
176:30
value of the thing that I have in a
176:32
world where we remove the Association of
176:35
value and scarcity then where there are
176:37
actual scarcities the goal is to
176:40
engineer the scarcity out of the system
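A minimal sketch of the supplier incentive being described here, using a made-up linear demand curve (the curve and all numbers are illustrative assumptions, not anything from the conversation): when price rises as supply falls, restricting output can bring in more than making the good abundant.

```python
# Illustrative only: a toy linear demand curve with made-up numbers,
# not figures from the conversation.
def price(quantity, a=100.0, b=1.0):
    """Toy demand curve: the more units on the market, the lower the price."""
    return max(a - b * quantity, 0.0)

def revenue(quantity):
    return price(quantity) * quantity

abundant = 90   # what the supplier could put on the market
scarce = 50     # artificially restricted output

print(revenue(abundant))  # 90 * 10 = 900
print(revenue(scarce))    # 50 * 50 = 2500, manufactured scarcity pays the supplier
```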
176:42
mm-hm and so if we’re talking about
176:44
limited amount of Oceanfront then this
176:46
is where we say well can we do
176:48
seasteading and create a lot of
176:49
Oceanfront that is really awesome where
176:51
there is actually more of it just
176:52
like more people are shopping at the
176:54
store then we need more shopping carts
176:55
and so part of the answer is how do we
176:58
actually increase the abundance but not
177:00
an exponential abundance because we’re
177:01
talking about also a steady-state
177:02
population and using a lot of shared
177:05
resources and it’s that coupled with
177:12
psychologically healthy or more mature
177:14
people that relate to these things
177:15
differently both of those are necessary
177:17
neither would be sufficient on their
177:18
own well I like that a lot
177:20
and I I do quite honestly take some hope
177:23
in that I’m finding that what people are
177:27
now rivalrous about has changed a lot I
177:30
think over the course of my life I think
177:33
Millennials are much more interested in
177:36
what what experiences have you had
177:38
recently rather than what have you
177:40
bought and purchased recently in part
177:44
because
177:44
the economy kind of turned against them but
177:46
travel got cheap right and so that
177:49
that’s been interesting to see do you
177:52
believe that we have a huge nearly
177:56
universal level up in maturity and
178:00
wisdom available to us through
178:03
development hacking yeah and so it is
178:09
both how we develop that socially which
178:14
I don’t think will happen uniformly I
178:16
think will happen in pockets that become
178:18
strange attractors that other groups
178:21
want to then implement once seen because
178:24
they’re so clearly better at both
178:26
quality of life and innovation and how
178:31
long that takes to develop widely is a
178:33
while like this is a multi generation
178:35
thing okay I think that that would not
178:40
be sufficient on its own but it’s
178:42
necessary better sense making systems
178:46
where we can actually solve problems
178:48
without causing worse problems which
178:50
we’re not historically good at is also
178:52
necessary and this is both some
178:56
evolution in our epistemics and our
178:58
actual processes of collective sense
179:00
making and collective coordination so
179:06
yes I see level ups in both of those
179:08
possible right now I’m gonna ask a very
179:10
difficult question but we have to get to
179:13
it yeah
179:15
in essence I’ve got a riff which I don’t
179:20
think I’ve said publicly which is that
179:21
the the biggest problem with discussing
179:25
sexuality is is that sex is sexy
179:28
and if you have something that’s central
179:31
to the world that is almost impossible
179:33
to talk about yeah it’s a very strange
179:38
state of affairs assume that we solve
179:41
all of these problems that don’t have to
179:44
do with status sex and reproduction
179:47
according to your most optimistic
179:50
scenario but we have trouble over here
179:52
that there’s one last little pesky
179:54
problem yeah does this situation work
179:58
yes now I will speak to it because as
180:02
you said it is central and you’re wrong
180:04
because it is central of course okay but
180:08
my speaking to it is probably going to
180:10
change the comment section of this video
180:13
but so be it you know what if they don’t
180:16
want to come along for the ride they I I
180:19
think that the most important thing is
180:21
to just try to do this mean to say this
180:25
to be horrible but let’s try to take
180:27
some of the stupid fun out of discussing
180:29
sexuality by talking about it for what
180:33
it is, as a central system that has to
180:38
be discussed because it is the engine of
180:40
human behavior so your brother and I had
180:45
this conversation when we met and
180:49
obviously with his background
180:50
evolutionary biology and primate mating
180:52
and whatever I was very interested in
180:56
his perspective and it took a little
180:58
while but for what it’s worth and let me
181:01
just jump in one second Bret were he
181:03
here yeah would break the theory of
181:06
selection into two pieces that would be
181:09
the stuff that follows natural selection
181:14
the way we expected from Darwin and then
181:16
you would break it into a second piece
181:18
which is the stuff that goes completely
181:20
counterintuitive due to sexual selection
181:25
right and that division is actually part
181:29
of the standard evolutionary
181:31
toolkit he does it a little bit better
181:33
and a little bit differently but that
181:35
division into natural and sexual
181:37
selection is part of the the territory
181:40
and it really matters for when we think
181:42
about resource scarcity because the
181:44
resources that people need to deal with
181:47
the first part the survival part are not
181:48
that much right actually but the
181:51
resources that people need to deal with
181:52
the mating part is more than the other
181:54
guy historically which is why the guy
181:57
with 150 foot yacht might feel bad when
181:59
the 200-foot yacht pulls up and
182:01
let’s say this is closed if you’re not
182:04
an evolutionary theorist I’m not but we
182:06
do our best there is a version of
182:10
evolutionary theory which states that
182:14
there needs to be crisis there needs to
182:20
be a function for showing that you are
182:22
better in order to keep individuals max
182:28
you know sort of on that razor’s edge of
182:31
performance and that mating
182:33
opportunities means that there’s always
182:35
a crisis there’s never enough abundance
182:38
because somebody with 13 homes is more
182:43
desirable than somebody with 9 homes if
182:45
you’re just trying to figure out if
182:46
there were a crisis I mean who would do
182:49
better right
182:50
so we have to overcome that because that
182:52
drives a Malthusian situation of no
182:55
amount of resource ever bringing
182:56
sufficiency about right and drives a
182:59
fundamental rivalry which is why you
183:00
said we have to address it so what I’m
183:08
you know, my take on this as I explored it my
183:12
process with myself has been asking ok
183:15
as soon as I saw that the dynamics of
183:19
this world that seemed intuitive and
183:21
natural to most of us as we kind of grew
183:24
up in and were conditioned by it were
183:26
self terminating and I said any of the
183:28
things that we think of as normal I’m
183:31
willing to question deeply and so how do
183:34
I think could I imagine a high-tech
183:36
civilization that doesn’t implode could
183:39
I imagine a kind of enlightened planet
183:41
what would life be like there all the
183:44
different things conflict emotion
183:46
resources and sexuality is obviously one
183:48
of the big questions and I think I think
183:53
the book Sex at Dawn, it
183:55
obviously gets plenty of things wrong
183:56
it’s trying to make a strong antithesis
183:58
to the standard evolutionary history of
184:01
Homo sapiens thesis but I think there
184:04
are some key parts to it when they look
184:05
at the Mosuo people or the Canela
184:07
people or people that did not have that
184:10
had a stable society that was not
184:12
primarily pair-bonded but had multi-male
184:15
multi-female dynamics it’s not to say
184:18
that’s how humans mostly were that
184:19
doesn’t matter it’s to say that it’s a
184:21
possibility if it’s within the
184:23
possibility set same
184:24
Buddhism I’m not saying that’s how
184:25
people mostly are, sort of, it doesn’t
184:29
have to have been, it just needs to establish proof
184:31
of concept and then we can try to scale
184:33
it up from there yeah it’s a positive deviant
184:35
analysis for proof of concept to then
184:37
say can we make that actual is that a
184:39
viable model for a new center and is
184:42
that a possible thing to make and the
184:47
fact that it didn’t make it through
184:48
evolution so far like evolution has a
184:50
blind quality to it right where it’ll
184:52
make an adaptation that makes sense in
184:55
the moment, say, determined by something
184:59
like warfare that is actually not that
185:01
good long term or even self terminating
185:03
long term so the argument if it would
185:06
have been a good system it would have
185:07
made it well the thing that has made it
185:10
is continuing to up-ratchet rivalrous
185:12
capacity oh and that itself is gonna
185:14
self-terminate unless through hacking
185:16
somehow we’ve hacked ourselves into new
185:17
positions where we can keep surviving yeah and
185:20
so one one version says that we can
185:22
never escape the evolutionary
185:23
imperatives the other says we will all
185:26
we have always escaped whatever our last
185:28
problem was and so we should expect
185:30
that even if there’s only the sliver of
185:32
hope we should exploit it to the fullest
185:34
yeah and so generally this situation
185:36
happens that we have a near-term
185:39
incentive to pursue some advantage but
185:41
where the disadvantage of that thing
185:42
might happen over a much longer term and
185:44
that’s like one of the fundamental
185:46
problems right the externality might
185:48
show up over hundreds or thousands of
185:49
years but the benefit occurs over this
185:51
year so I have to do it so we have to
185:54
get over that actually if we’re
185:55
affecting the world in such fundamental
185:57
ways over the long term we have to
185:59
actually be factoring that into our
186:00
decision making now that’s one of the
186:02
minimum requirements of a Game B if
186:05
it’s going to exist which also means of
186:07
a viable civilization at all so when it
186:13
comes to status because I think status
186:15
and sexuality go largely together it’s
186:17
not exactly one for one but there
186:19
there’s a strong correlation I was
186:24
listening to you on a few podcasts and
186:26
you were talking about B Prime and
186:29
talking about spinors and your kind of
186:31
geometric unity and I was just fuckin
186:34
loving it and I was loving even the
186:36
status of like you described
186:39
theoretical physics and mathematics well
186:42
which are topics that you know so much
186:43
better than I do but that I’m fascinated
186:45
by and educating the public about it and
186:48
there was no like status competition
186:50
impulse in me that was like oh boy wait
186:52
he is being seen as smart for these
186:55
things I was like wow this is fucking
186:56
awesome I hope that he gets more status
186:58
doing that because it’s obviously good
187:00
for the world jeez I have such different
187:03
intuitions about this I mean you know to
187:07
be blunt about it
187:09
I didn’t really talk about this stuff
187:12
for ages and there was a part of me that
187:16
cared about status but this was always a
187:20
part in fact I really to the extent that
187:25
I think that I have anything interesting
187:26
and new it is a very uncomfortable
187:30
feeling I mean I could show you all
187:32
sorts of cool things on you know if I
187:35
came up with a new lick on the guitar I
187:38
would enjoy showing it to you this is
187:42
something I feel very I have felt very
187:44
uncomfortable about and there are ways
187:46
in which well
187:53
it’s very apart for me from the status
187:56
game I’ve been fascinated looking at
187:58
some of the comments where people say
188:01
you know so-and-so is in it for the
188:02
grift and they just want money and this
188:05
is an ego trip and I have to say the
188:08
least fun part the reason I didn’t do a
188:10
podcast for a long time and the reason
188:12
that I I didn’t commercialize this and
188:16
I’ve left a lot of money on the table
188:18
and I’m intending to commercialize this
188:20
is that I was very uncomfortable with
188:23
all of these issues I didn’t like it
188:25
and I think people imagined that their
188:30
first few increments of status are fun
188:33
so that getting more and more status
188:35
must be awesome and I actually don’t
188:38
think that that’s true I think it’s a
188:39
little bit like wow my first my first
188:42
taste of heroin was pretty sweet I
188:44
should do this all the time it goes into
188:47
some completely different place yeah so
188:51
that is counter to the narrative that
188:54
we’re all seeking maximum status and in
188:56
competition with each other for status
188:58
well if you yeah I think that there is a
189:00
that is a low-resolution narrative right
189:03
I think that you know it’s like I
189:09
always make fun of the fact that
189:11
evolutionarily you’re crazy for sugar
189:15
and the fact that they give it away for
189:16
free at Starbucks you know there’s some
189:20
part of you that’s a three-year-old kid
189:23
that just wants to use as many packets of sugar
189:24
as you possibly can it’s not going to be
189:26
a good thing right yeah keep going I’m
189:28
sorry about that well so this is the
189:30
thing I think I think it’s actually true
189:33
that there’s a lot of status that is not
189:35
really that fun this is also my
189:37
experience but I think it’s also true
189:40
that we can feel good about rather than
189:42
bad about where someone else is doing
189:44
socially well well if we yeah I mean if
189:48
we if we have a kind of love and trust
189:50
and we have an idea like you know I’m
189:54
friends with Andrew Yang and I disagree
189:56
with a bunch of his policies but I have
189:59
a feeling that he is a guy who’s just
190:00
earnest you know I’m knowing knowing him
190:04
socially I have the sense that
190:06
it is not an ego trip for him to want to
190:10
steward the country it’s a-you know
190:11
you’re taking on a position that puts
190:14
you in a life-and-death situation with
190:15
the number of attempts on presidents
190:17
lives let’s say it’s a very solemn
190:20
responsibility and I think that in part
190:24
we want people who we feel are grounded
190:28
and I by the way I’m not always grounded
190:30
you know so I’ve drunk my own status you
190:36
know to excess at times but it’s a very
190:40
tricky thing who do I want to have
190:43
status who do I not want to, I ask that as
190:45
do I trust I have a friend who is the
190:48
nicest person in the world except when
190:49
he’s doing well and then he becomes very
190:51
difficult to deal with you know so
190:53
they’re like that there’s the person
190:56
who’s fine on one glass of alcohol and
190:59
you don’t want them to have three yeah
191:02
so I think status is a hyper normal
191:06
stimulus, we were in an evolutionary
191:09
environment we couldn’t necessarily have
191:11
more than 150 people pay attention to us
191:13
yeah and now we can have a huge number
191:15
of people pay attention to us and have
191:17
it metricized with likes or whatever I
191:19
think it is like sugar a hyper normal
191:22
stimulus that is very hard for it not to
191:25
be bad for us and we actually have to
191:27
have a very mature relationship to it
191:28
and addiction of any kind any hyper
191:33
normal stimulus that decreases normal
191:35
stimulus is going to end up being net
191:37
bad for us I think one of the metrics
191:39
for how healthy a society is is inverse
191:41
relationship to addictive dynamics
191:43
that’s the healthy environment
191:46
conditions people that are not prone to
191:49
addiction which means have actual more
191:53
authenticity of choice because addiction
191:55
compulsion writ large is less
191:57
authenticity of choice and what’s
192:00
interesting is the hyper normal stimulus
192:03
what porn is to sex, what sugar and salt
192:05
and fat concentrated in a Frappuccino or
192:08
McDonald’s is to food right devoid of
192:11
the actual nutrition or devoid of the
192:12
actual intimacy, it’s an intervening proximate
192:14
that betrays the ultimate,
192:17
originally the proximate stimulus
192:20
is tied to the ultimate and so the
192:22
brain keeps track of the proximate and
192:25
then you can disconnect some of these
192:26
like birth control disconnected
192:30
sexuality from procreation right and in
192:33
the same way if there was a healthy
192:35
status relationship of in a tribal
192:38
environment where I can’t really lie and
192:40
people really are watching me and know
192:41
me if I’m thought well of it’s because
192:43
I’m actually doing well by everybody and
192:44
I have authentic healthy relationships
192:46
as opposed to I can signal things that
192:49
aren’t true hmm and can
192:54
even get more status through negative
192:58
signaling about other people and things
193:00
like that and get a lot of hits from it
193:03
it’s that is the same kind of thing as
193:05
the fast food or the porn is and so I
193:09
think we have a hypo normal environment
193:12
of the healthy stimulus that actually
193:17
creates a baseline well-being so most
193:22
people I find that when they go camping
193:25
with their friends and they’re in nature
193:27
and they’re actually in real authentic
193:28
human relationships they’re checking
193:30
their phone for dopamine hits from email
193:33
or Facebook less and they’re also
193:35
opening the fridge just blindly
193:38
looking less often because they’re
193:40
actually having an authentic meaningful
193:43
engaging interaction but in a world
193:46
where I have a lot of isolation nuclear
193:49
family home structures etc and not
193:53
connected to nature and not necessarily
193:54
connected to meaningfulness that much
193:57
that hypo normal environment creates
193:59
increased susceptibility to hyper normal
194:01
stimuli hyper normal stimuli happened to
194:04
be good for markets because on the
194:08
supply side if I want to maximize
194:10
lifetime value of a customer addiction
194:12
is good for lifetime value of a customer
194:15
but it is very bad for society as a
194:18
whole this I really like so if I
194:22
understand you correctly
194:28
people don’t I mean this actually starts
194:33
to solve a puzzle I think I heard that
194:37
somebody asked Matt Damon whether he
194:39
enjoyed being famous and he said it was
194:43
if I have the story right and maybe
194:45
it was somebody else, forgive me if I’m wrong
194:47
he said it wasn’t even fun for 15
194:51
minutes and this is the hardest thing to
194:55
convey is that if you’ve never had any
194:57
kind of status at all right that you
195:02
know I I think I said to Tim Ferriss
195:04
that you only wanted to be famous to
195:06
3,000 hand-chosen people, you want your calls
195:09
returned you know you you want to be
195:12
taken seriously when you have something
195:14
to say you do not want to be universally
195:19
known and that was the hardest decision
195:21
in starting this podcast was I didn’t
195:25
think I had another option I mean part
195:28
of the point of it is to get out ideas
195:30
that I worry are not institutional you
195:35
know there’s no institution that’s
195:36
embracing these ideas and I couldn’t
195:39
figure out, and I wrestled for months, is
195:43
there a way to do this without becoming
195:46
part of the story right and because I
195:49
think that privacy and an individual
195:51
life is so much more important and I
195:54
don’t believe that every time you bring
195:55
something up you know it means that you
195:58
should have your life ripped open and be
196:01
dissected and discussed it’s very
196:03
unnatural I think what you’re trying to
196:04
tell me is that people think that they
196:07
want to be fabulously rich they think
196:09
they want to be famous they think they
196:12
want unlimited sexual access and in fact
196:15
it is the first few tastes of these
196:18
things that convince them that there
196:20
must be no limit to how wonderful the
196:22
world can be if only that can be mine
196:24
and in fact there is something I mean
196:27
it’s sort of you know like Rosebud at
196:29
the end of Citizen Kane yeah those are much
196:32
more like addiction than fulfillment and
196:36
addiction
196:38
will give me a spike and then a crash
196:40
and then because of the crash I’m more
196:42
craving something that will spike me
196:43
because I feel really shitty and but
196:45
then I get an erosion of baseline over
196:47
time from the effects of that and so of
196:50
course the chocolate cake is gonna make
196:52
me feel good in the moment but as I have
196:53
a mostly chocolate cake diet my life
196:55
feels shittier as I average right as I
196:58
do the integral under the curve it gets
197:00
worse whereas the salad doesn’t really
197:02
give me that spike but as I get
197:03
healthier my baseline of pleasure
197:05
throughout not just when I’m eating but
197:07
all of the time goes up because I have
197:09
the capacity to engage in more
197:11
interesting meaningful things and my
197:13
body doesn’t hurt as much and whatever
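A toy numerical reading of the "integral under the curve" point above, with made-up well-being numbers (the functions and figures are illustrative assumptions, not anything from the conversation): the spike-and-crash pattern can beat the steady option moment to moment while losing on the cumulative total.

```python
# Illustrative only: made-up well-being numbers for the spike/crash
# versus rising-baseline comparison.
days = 30

def spike_and_crash(day):
    """Big hit on 'cake days', with a baseline that slowly erodes."""
    baseline = 5.0 - 0.1 * day
    return baseline + (4.0 if day % 3 == 0 else -1.0)

def rising_baseline(day):
    """No spike, but the baseline slowly climbs."""
    return 4.0 + 0.1 * day

total_spiky = sum(spike_and_crash(d) for d in range(days))
total_steady = sum(rising_baseline(d) for d in range(days))
print(total_spiky, total_steady)  # roughly 126.5 vs 163.5: the steady curve wins the integral
```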
197:14
I think the interesting thing is that it
197:16
is actually just like a healthier
197:18
relationship to or a more effective
197:21
relationship to pleasure is anti
197:23
addictive but I think most of these
197:25
things that people think they want are
197:27
hyper normal stimuli that is the
197:29
dopaminergic part separated from the
197:31
substance I don’t know how much I
197:34
believe this but I like it a lot so if I
197:37
understand you correctly there is a
197:39
world of pleasure I don’t even want to
197:44
call it pleasure I don’t even know what
197:45
to call it maybe it’s much more one of
197:46
fulfillment that we would give up that
197:50
no let me say it differently what you’re
197:52
really saying is we are blind to the
197:55
effect that somatic pleasure and Status
197:58
pleasure is crowding out fulfilment in
198:01
our lives and that were we to actually
198:04
understand the cost of pleasure of
198:07
rivalry that there is an individual
198:10
reason to abandon somatic pleasure as
198:15
the be-all and end-all of how we
198:19
create a life I mean this is how many
198:23
how many awesome trips to Vegas did I
198:25
did I have is that the thing that’s
198:27
going to matter most to me on my
198:29
deathbed yeah I don’t think it ever has
198:32
and I don’t think it’s ever what people
198:38
would be most hopeful about let’s
198:40
give it a name because I don’t think
198:42
I’ve ever been down this particular
198:44
route let’s call it deathbed mindset for
198:47
the moment just to play with it see if
198:48
it works and if it doesn’t work we’ll
198:50
trash it
198:51
so people on their deathbed become
198:53
focused on did I do enough for my
198:57
community do my children think well of
198:59
me I think what happens is people
199:01
realize that everything they got dies
199:04
with them like all in the end it’s
199:07
lineage only and the way I touch the
199:10
world continues and it’s not just my
199:12
biological kids it’s each of my thoughts
199:15
yeah like memes along with genes and so
199:19
I think when we really start to think
199:22
about this clearly we recognize that
199:27
this direction is self terminating the
199:30
need to get stuff from the world that
199:33
when I die it ends with me that there is
199:35
actually only a kind of self
199:37
transcendence and permanence in the way
199:38
that I touch the world which does ripple
199:40
ongoingly but there’s also this thing
199:43
where yeah again it I feel almost a
199:50
little bit shy talking about it even
199:53
even more than the sex topic in some
199:55
ways because I’m proposing that there is
200:00
something like spiritual growth I think
200:02
it’s actually necessary for civilization
200:04
to make it
200:05
and so people affirming to themselves that they are
200:12
these kind of needy things
200:15
that need stuff from the world yeah that
200:17
need other people’s validation and
200:19
attention and etc and living life that
200:22
way where the more of it they get what
200:23
they’re still getting as a self the
200:25
affirmation of that sense of self as
200:27
opposed to coming from a place of
200:29
wholeness and the desire and actual love
200:35
for the beauty of life and the desire to
200:37
have their life be meaningful to life
200:39
that my life ends but life of the
200:42
capital L doesn’t end and that life
200:45
starts to be central to my awareness
200:47
more than my life is and my life becomes
200:49
meaningful in its coupling to life
200:51
this answers the sex question it also it
200:54
answers all the other questions but I
200:56
don’t think there is a there to break
200:58
through to yeah and the problem that
201:01
we’re having conceiving of it, anyway
201:02
now again I don’t think this gets
201:04
at all of the issues that I’ve raised
201:06
but I think it’s the first point at
201:08
which I start to see there’s something
201:11
really different about your perspective
201:14
so just as a slow learner if we take the
201:18
kind of Girardian idea of all desire as
201:21
mimetic and I’m oversimplifying it but
201:23
just meaning I want what other people
201:25
have and then that inexorably causes
201:29
conflict and then the conflict will
201:31
inexorably cause violence I think there
201:35
is statistical truth to all three of
201:37
those steps but not inexorable truth to
201:39
any of them mmm I don’t only want things
201:43
that other people have I you know or
201:47
that I that I learned from other people
201:50
there’s there there are things that are
201:53
just intrinsically fascinating to me or
201:57
there is wanting for other people it is
202:00
not wanting for myself anything in
202:02
particular just actually caring about
202:03
wanting for other people there are
202:05
innate creative impulses where I don’t
202:07
actually need it to be seen like I have
202:09
a friend who is a savant pianist
202:11
brilliant pianist and he almost never
202:13
will play for anybody because his
202:15
experience of playing is so beautiful
202:17
that he doesn’t want to cheapen it by
202:19
having somebody else hear it and move
202:22
into a performative place and it just is
202:25
his own communion with music itself so I
202:29
think there is desire that emerges from
202:32
our connection to life not just the
202:34
social layer and then even if you’re
202:37
doing something that I’m inspired by and
202:38
I want to do something like that too
202:40
they don’t have to create conflict I can
202:42
be okay with you having something and
202:44
want to share it or share in that type
202:46
of phenomena yeah okay now I’m starting
202:50
to you know I have a friend for example
202:53
who’s a fantastic guitarist and I
202:57
noticed that when we play together he
203:00
doesn’t play at his peak ability because
203:05
he wants the pleasure of playing
203:09
together yeah to to be that the thing
203:14
that we share if I was a better
203:15
guitarist it would be more fun
203:17
to trade things back and forth but the
203:20
danger of going out of shared experience
203:23
is far greater and so you know I yeah I
203:28
I know the things that you are saying
203:30
are true and perhaps what I’ve been
203:33
saying back to you could be retranslated
203:34
as the transcendent beyond the proximate
203:39
somatic pleasures that we have is so
203:42
rarely experienced at scale it’s not
203:46
experienced at scale well a little bit
203:49
in religions in religion it happens I
203:52
think that in in families there are
203:54
things that people don’t want to share
203:56
outside of the family because they bond
203:58
the family yeah and but it’s just it’s
204:03
hard to imagine a world in which people
204:05
stop coveting their own name in lights
204:08
you know people being impressed by by
204:12
their car their yacht their house this
204:14
than the other and I think that what
204:16
you’re talking about
204:17
not hard for me to imagine well this is
204:19
the thing I mean the you know the odd
204:21
thing that I have in being the friend
204:25
and the employee of a billionaire is
204:27
that I sometimes get to borrow his life
204:29
yeah and you know he’s made his home
204:32
available to me in Hawaii for example
204:36
and it’s absolutely astounding to be in
204:39
control of an asset like that I have
204:42
another friend who lent me his Island
204:43
year after year but I also found that I
204:47
didn’t want or need that and that both
204:51
of these gentlemen that I’m referring to
204:53
were much more focused on ideas than
204:56
they were on Faberge eggs or displaying
204:59
a Picasso or anything like that because
205:01
ultimately they found they wanted to go
205:05
their association with me was let’s talk
205:08
about things that might move the needle
205:10
in human history rather than do you have
205:13
any idea how much this bottle of wine
205:15
cost and remember I was saying earlier
205:20
that I think dominant paradigms co-opted
205:22
psychology to define healthy psychology
205:25
as supportive of the paradigm so what
205:27
I’m about to say in terms of what I
205:29
think healthy psychology is is not the
205:30
current
205:31
definition of healthy psychology it is
205:33
one that would be fit to an actually
205:36
viable civilization I think
205:38
psychologically healthy humans are
205:40
emotionally coupled to each other so 100
205:44
percent so when you’re happy I’m happy
205:46
I’m stoked for you if you’re hurting I
205:48
feel that I feel compassion and empathy
205:49
I think the worst psychology is the
205:54
inversion of that, sadism, where I feel joy at
205:58
your pain rather than joy at your joy and
206:00
woe at your pain I think there is a French
206:02
expression it is not sufficient that one
206:05
succeed in life one’s friends must also
206:08
fail yeah so that is a perfect statement
206:12
of what is most wrong with the world
206:13
right that’s that that is the heart of
206:16
the worst part of Game A but I think
206:19
jealousy is one step away from sadism
206:22
because if sadism is I feel joy at your
206:25
pain
206:25
jealousy is I feel pain at your joy or
206:28
your success or envy right and I don’t
206:31
think that is a psychologically healthy
206:33
place for people I think it is a largely
206:37
we condition this because we watch
206:39
movies where we celebrate when the bad
206:41
guy gets it right and we condition the
206:43
fuck out of that we celebrate when the bad
206:44
guy gets it and we celebrate when our team
206:46
wins and the other team loses so we can
206:48
collectively decouple our empathy from
206:50
other human beings arbitrarily so that
206:52
we can then feel good in a war
206:54
supporting you know when that type of
206:56
thing occurs and we get conditioned that
206:59
second place is the first loser and all
207:01
those types of things but this is
207:03
conditioning again and conditioning of a
207:05
highly neuro plastic species so I think
207:07
our intuitions are all bad if we haven’t
207:09
spent time really questioning these
207:11
things and then also looking at cultural
207:13
outliers because I don’t think any of
207:15
this is inexorable is it ubiquitous
207:19
yes is it inexorable no but I
207:21
think what is ubiquitous is
207:22
psychopathology
207:23
well Daniel I think what I’ve gotten
207:25
from our conversation is is that you’ve
207:28
got a lot of examples that are at the
207:31
proof-of-concept level of things that
207:33
are under exploited you’ve got an
207:35
observation that we’re far off the
207:37
efficient frontier that there’s one
207:40
giant overlooked opportunity which is
207:44
the
207:44
we are so radically k-selected that our
207:47
developmental period from age zero to
207:51
thirteen could be used for something
207:53
radically different which i think is the
207:55
the biggest hope in your whole complex
207:58
of ideas together with the idea that
208:01
there are realms beyond somatic pleasure
208:04
that most of us spend our entire lives
208:07
not knowing what it’s like to break
208:09
through the status and wealth and
208:13
security games and effectively we have
208:16
no idea what the top of Maslow’s hierarchy
208:18
when fully realized is and that it might
208:21
be possible to at least begin the game
208:24
to buy us some time to try to figure out
208:26
what we would do at scale now I still
208:29
don’t see any world in which we can
208:33
defeat all these multipolar traps but I
208:35
think what you’re really saying to me
208:37
again always correct me if I’m wrong is
208:39
that we could potentially change what
208:45
winning feels like and that when we do
208:48
that then these prisoner’s dilemmas
208:50
don’t look right any longer because I no
208:52
longer want to be the one who defected
208:56
while you cooperated so that I get off
208:58
scot-free and you wind up with a 20 year
209:01
jail term and we have to remove the
209:03
context of the prisoner’s dilemma as our
209:05
model for the world right like actually
209:07
change the nature of the context and
209:13
because that is a fundamentally
209:15
inexorably rivalrous dynamic right
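For reference, a minimal sketch of the prisoner's dilemma payoffs being invoked here, using standard textbook-style jail terms (the numbers are illustrative apart from the 20-year term mentioned above): whatever the other player does, defecting shortens my own sentence, which is exactly the rivalrous incentive structure the speakers want to change by changing what winning feels like.

```python
# Years in jail for (my_move, your_move); lower is better.
# Standard textbook-style values, using the 20-year term mentioned above.
PAYOFF = {
    ("cooperate", "cooperate"): (1, 1),    # both stay quiet
    ("cooperate", "defect"):    (20, 0),   # I stay quiet, you talk: I serve 20 years
    ("defect",    "cooperate"): (0, 20),   # I talk, you stay quiet: I walk free
    ("defect",    "defect"):    (10, 10),  # both talk
}

for your_move in ("cooperate", "defect"):
    my_years_cooperating = PAYOFF[("cooperate", your_move)][0]
    my_years_defecting = PAYOFF[("defect", your_move)][0]
    # Whatever you do, my own sentence is shorter if I defect,
    # even though mutual cooperation beats mutual defection.
    print(your_move, my_years_cooperating, my_years_defecting)
```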
209:18
I just don’t think you’re gonna get rid
209:20
of all rivalry I just I see
209:23
opportunities for decreasing it I see
209:25
opportunities for changing the culture
209:28
the the the weakest part of your
209:30
argument to me at this moment and again
209:32
I’m just learning about it is the need
209:35
for universality with respect to this
209:38
evolution and I think that’s the one
209:40
part of it that I find the hardest to
209:41
imagine we can actually get done so if I
209:45
have a system like a corporation where
209:47
my playing by the rules fully gets me
209:50
ahead less than me defecting on the
209:52
system internally and doing corporate
209:56
politics or a back-end deal or whatever
209:58
then I have the incentive to defect on
210:01
the system and it doesn’t have the
210:02
collective intelligence to notice it
210:04
right because there’s a diminishing
210:07
return on the collective intelligence of
210:09
the system as a function of more scale
210:11
if I could make a system and I I will
210:15
claim that we can and that there are
210:17
architectures that can achieve it we
210:19
could make a system where the collective
210:22
intelligence scaled with the number of
210:24
people then I would always have more
210:26
incentive to participate with it than to
210:28
defect and if I did defect because I had
210:30
a head injury the system would have the
210:32
intelligence to be able to notice that
210:34
and deal with it
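A toy sketch of the scaling argument, with assumed functional forms and made-up numbers (nothing here is from the conversation): if the system's ability to notice defection grows sublinearly with headcount, defecting eventually pays; if collective intelligence scales with the number of people, participating stays the better bet.

```python
import math

# Illustrative assumptions only: private gain from defecting grows with
# organization size, while the odds of being noticed depend on how the
# system's collective intelligence scales with headcount.
def expected_defection_value(n_people, detection):
    gain = 0.01 * n_people  # assumed payoff for a successful defection
    penalty = 100.0         # assumed cost of being caught
    p = detection(n_people)
    return (1 - p) * gain - p * penalty

sublinear = lambda n: min(1.0, 10 * math.log(n) / n)  # diminishing returns to scale
linear = lambda n: min(1.0, 0.005 * n)                # intelligence scales with people

for n in (100, 1_000, 10_000):
    print(n, expected_defection_value(n, sublinear), expected_defection_value(n, linear))
# Under the sublinear curve the expected value of defecting turns positive as n
# grows; under the linear curve detection keeps up and defecting stays a loss.
```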
210:36
now this is the place where I’m saying the Dunbar number was
210:39
both care and sense making it was a
210:44
limit on both you know our values
210:46
generation right and our sense making to
210:48
inform choice making so if we want
210:50
better systems of governance ie better
210:51
systems of choice making we need to get
210:53
both collective values generation and
210:55
collective sense making down the
210:57
conditioning gives us ways to start to
211:00
work with things like very different
211:03
value systems but I can’t have a very
211:06
different value system while still
211:08
incentivizing meaning a value equation
211:11
economically where the whale is worth a
211:13
lot dead and nothing alive right and I
211:16
and it doesn’t have adequate sense
211:19
making to even inform what good choice
211:21
making for everyone so we can
211:23
participate with the system is so
211:26
that’ll have to take more time well I
211:30
look forward to continuing our
211:32
discussions and I want to thank you very
211:35
much for coming and sharing your ideas
211:37
with us here on the portal and just
211:39
briefly I want to say I think that I
211:44
think that what you’re doing is awesome
211:47
I you know there there are people who
211:52
say we need divergent ideas and
211:55
heterodox ideas but that don’t have
211:57
grounded clear thinking and you know
212:00
critical thinking and I think for you to
212:02
bring on heterodox thinkers and
212:06
not just agree with them but have real
212:08
dialectic conversation that is earnestly
212:10
seeking to bring about
212:11
better understanding is beautiful I was
212:14
really excited about that
212:15
I I wish that I could have communicated
212:20
clearer having had better sleep last
212:22
night but hopefully it wasn’t completely
212:23
unintelligible well I traveled from San
212:26
Francisco to do this and so I think I
212:28
was probably a little off my game at
212:30
particularly at the beginning but we can
212:32
do this again and I just want to say
212:34
those are incredibly generous and kind
212:37
words I’ll take them to heart I’m trying
212:39
to get courage myself to do a little bit
212:41
more in this space and so far I got to
212:44
tell you the audience for the show has
212:45
been second to none in terms of behaving
212:49
really admirably and positively on the
212:54
internet I can’t tell you how much great
212:56
feedback we’ve gotten super constructive
212:58
and I hope you think they will look
213:01
they’ll embrace what you said in the
213:03
same spirit so thanks Daniel thank you
213:07
you’ve been watching the or listening to
213:09
the portal with Daniel Schmachtenberger
213:11
and I’ve been your host Eric
213:14
Weinstein thanks for coming through and
213:16
we’ll see you next time
213:17
[Music]