Thursday, August 03, 2006

Firm Beliefs

I started writing on this blog for a variety of reasons. One of those was to "enlighten" others on a variety of topics, particularly in matters of economics. I think I may have had some positive effect in some non-contentious areas of economics, but I've had absolutely no effect regarding subtopics like the effect of taxation and deficits on growth and wealth creation. It seems that everyone has their preconceived notions (fortunately, a few people actually agree with my preconceived notions), and nobody is willing to abandon their beliefs regardless of the evidence.

A recent article in the Washington Post explains some of the mechanisms that ensure that people hold on to their beliefs despite the evidence.

Psychological experiments in recent years have shown that people are not evenhanded when they process information, even though they believe they are. (When people are asked whether they are biased, they say no. But when asked whether they think other people are biased, they say yes.) Partisans who watch presidential debates invariably think their guy won. When talking heads provide opinions after the debate, partisans regularly feel the people with whom they agree are making careful, reasoned arguments, whereas the people they disagree with sound like they have cloth for brains.

Unvaryingly, partisans also believe that partisans on the other side are far more ideologically extreme than they actually are, said Stanford University psychologist Mark Lepper, who has studied how people watch presidential debates. [...]

In an experiment that pols may want to note closely, researchers recently plopped 10 Republicans and 10 Democrats into scanners that measure changes in brain-blood oxygenation. Such changes are thought to be linked to increases or decreases in particular areas of brain activity.

Each of the partisans was repeatedly shown images of President Bush and 2004 Democratic challenger John F. Kerry.

When Republicans saw Kerry (or Democrats saw Bush) there was increased activation in brain areas called the dorsolateral prefrontal cortex, which is near the temple, and the anterior cingulate cortex, which is in the middle of the head. Both these regions are involved in regulating emotions. (If you are eating an ice cream cone on a hot day and your ice cream falls on the sidewalk and you get upset, these areas of your brain remind you that it is only an ice cream, that not eating the ice cream can help keep those pounds off, and similar rationalizations.) More straightforwardly, Republicans and Democrats also showed activation in two other brain areas involved in negative emotion, the insula and the temporal pole. It makes perfect sense, of course, why partisans would feel negatively about the candidate they dislike, but what explains the activation of the cognitive regulatory system?

Turns out, rather than turning down their negative feelings as they might do with the fallen ice cream, partisans turn up their negative emotional response when they see a photo of the opposing candidate, said Jonas Kaplan, a psychologist at the University of California at Los Angeles. [...]

The result reflects a larger phenomenon in which people routinely discount information that threatens their preexisting beliefs, said Emory University psychologist Drew Westen, who has conducted brain-scan experiments that show partisans swiftly spot hypocrisy and inconsistencies -- but only in the opposing candidate.

When presented with evidence showing the flaws of their candidate, the same brain regions that Kaplan studied lighted up -- only this time partisans were unconsciously turning down feelings of aversion and unpleasantness.

This study certainly aligns closely with my experiences on this blog and commenting on other blogs (not to mention my experiences in actual, real live, face-to-face conversations). Obviously, being human, I must suffer from the same problem, so my worldview is also based on partisan thinking and y'all should ignore whatever I say. Researchers are probably also prone to this effect. Of course, if the researchers who did the research referred to by the Washington Post article were also affected by this, then the results of their research are bogus, and such an effect doesn't actually exist. But, if it doesn't exist, then they weren't affected, so their research is valid, so it does exist. But if it does exist ...

And so we end up with paradox.

3 comments:

Oroborous said...

It's certainly an effect, but if people's minds couldn't be changed by reasoned argument, then we wouldn't have debates in the first place. (Or informational advertising aimed at adults).

The key, it seems to me, is that while people often have a great resistance to changing their minds upon first hearing an opposing viewpoint, they often mull it over, and either change their minds later, or become more receptive when hearing the argument again.
In the religious missionary field, it's known as "planting the seed".

So, you may have been more successful than you think.

Brit said...

It could be that you only hear from the ones who disagree because they're so keen to disagree.

You might have persuaded many people who were undecided.

I have learned a lot from blogs, including yours, about stuff that just wasn't on my radar previously.

Can't say that I'm keen on the idea of blogging to enlighten though. That's Orrin's approach and it assumes you know the answers. I think blogs work best as forums for the little 'communities' that grow up, where you soon find that nobody has all the answers to anything.

Bret said...

Brit wrote: "Can't say that I'm keen on the idea of blogging to enlighten though."

Well, I did put "enlighten" in quotes. Also, there are different versions of enlightening. Orrin preaches. My version was to bring information to people that they might not have been previously exposed to, which might help them better understand certain topics - such as the role of deficits in financing government spending. Such topics are quite complex, making it difficult to have canned answers. On the other hand, I've seen a lot of misconceptions, and my goal was to reduce those - that would be a form of enlightening.