Crony Beliefs

BELIEFS AS EMPLOYEES

By way of analogy, let’s consider how beliefs in the brain are like employees at a company. This isn’t a perfect analogy, but it’ll get us 70% of the way there.[1]

Employees are hired because they have a job to do, i.e., to help the company accomplish its goals. But employees don’t come for free: they have to earn their keep by being useful. So if an employee does his job well, he’ll be kept around, whereas if he does it poorly — or causes other kinds of trouble, like friction with his coworkers — he’ll have to be let go.

.. If a belief performs poorly — by inaccurately modeling the world, say, and thereby leading us astray — then it needs to be let go.

.. If you’ve ever wanted to believe something, ask yourself where that desire comes from. Hint: it’s not the desire simply to believe what’s true.

In short: Just as money can pervert scientific research, so everyday social incentives can distort our beliefs.

.. I contend that social incentives are the root of all our biggest thinking errors.

.. A meritocracy experiences no anguish in letting go of a misbelief and adopting a better one, even its opposite. In fact, it’s a pleasure.

.. Going further, crony beliefs actually need to be protected from criticism. It’s not that they’re necessarily false, just that they’re more likely to be false — but either way, they’re unlikely to withstand serious criticism. Thus we should expect our brains to take an overall protective or defensive stance toward our crony beliefs.

.. crony beliefs will typically provide more social value the more confident we seem in them. (If Acme hires the mayor’s nephew, but seems constantly on the verge of firing him, the mayor isn’t going to be happy.)

.. But perhaps the biggest hallmark of epistemic cronyism is exhibiting strong emotions, as when we feel proud of a belief, anguish over changing our minds, or anger at being challenged or criticized.

.. The better — but much more difficult — solution is to attack epistemic cronyism at the root, i.e., in the way others judge us for our beliefs. If we could arrange for our peers to judge us solely for the accuracy of our beliefs, then we’d have no incentive to believe anything but the truth.

.. The beauty of Less Wrong, then, is that it’s not just a textbook: it’s a community. A group of people who have agreed, either tacitly or explicitly, to judge each other for the accuracy of their beliefs — or at least for behaving in ways that correlate with accuracy. And so it’s the norms of the community that incentivize us to think and communicate as rationally as we do.

.. Earlier I argued that other people are the cause of all our epistemic problems. Now I find myself arguing that they’re also our best solution.