Removing a keystone – if a person has a keystone within his or her belief system – can cause the entire system to collapse in a fairly sudden and dramatic manner. I think that, during my own change process, both things happened – a slow accruing of evidence as well as some more sudden and important revelations (my change story contains some examples of each, but here’s one of the latter). Another person who had a “keystone” change experience was David Horowitz, a far-left activist whose change to the right was originally sparked by learning that certain leftists he thought were decent were actually cold-blooded killers.
Most people’s belief systems are very, very resistant to change, and some are even impervious to it. In the latter cases – which I think are quite common – every small brick that might be removed from the edifice is almost immediately replaced with another brick, making the structure about as strong as before. Maybe even stronger, because it’s withstood many challenges. That’s the function of propaganda – to suppress the truth if it undermines the preferred narrative, but if the truth gets out, to immediately change the subject and come up with a new story to replace it. Then when that’s challenged, there’s another story and another and another for people to use to shore up anything that might be crumbling.
Who’s not susceptible? The aim is to surround yourself with people who disagree with you whose judgement you trust, if any such people actually exist, which they don’t.
From Instapundit and from the comments.
The best description I’ve seen of the psychology is Colonel John Boyd’s 1976 essay Destruction and Creation (PDF) on how an intelligent organism must necessarily think: how it builds an internal model of the world, how it refines that model, and what makes it realize that the model is fundamentally wrong in some key respect – a realization that triggers a cycle in which it tears apart its old worldview and rebuilds it to more accurately reflect reality.
If the model is flawed – and all models are flawed to some degree – entropy builds up. There are all kinds of little mismatches between the model and observed reality: perhaps the model predicts that a rural conservative will be mean, and an encounter shows them to be nice. When enough exceptions to the model build up, and they become critically important, the person can essentially have a come-to-Jesus moment where they realize that their model of reality is complete BS, and they start re-examining their beliefs, taking things apart piece by piece to find a better, more elegant, more parsimonious explanation for all the facts in their head.
But until that happens, they just attach some ad-hoc explanation to each failed prediction or “exception to the rule” before filing it away in their mental map, so that they can continue on with the old, flawed model of reality. Unless their model’s predictions about reality become critically important to them, they can continue with the flawed model for quite a long time, but every encounter, every little failed prediction and exception, contributes to the entropy that will eventually cause their model to collapse. In current terms: if the cognitive dissonance becomes too great, they’ll take the red pill.
I also liked this.
A lot of people adopt beliefs not because of logic or reason or principle, but as a matter of fashion. They want to feel ‘with it’, and to be seen as ‘with it’, and they will adopt or reject ideas based on what I have to describe as ‘social aesthetics’ because I lack a more appropriate word.
With this example from someone else.
The Rosenbergs are the same. Every year, CBS, NBC, and ABC would run stories about how they were innocent victims of the Red Scare. Then, around 1990, new evidence showed they were guilty beyond all doubt. CBS, NBC, and ABC ran stories about the new evidence. The following year they were back to the original narrative: the Rosenbergs were victims of McCarthyism and of panic on the part of the evil American public. They just ignored the new evidence. Nothing can disturb the narrative.
Publishing the Venona Intercepts made it impossible to argue that they were innocent. And so many of their defenders shifted the argument, claiming that they were not “traitors,” per se, as they had a much higher loyalty than to any mere nation.