Birtherism, cognitive dissonance, and the persistence of belief
As the Arizona legislature debates a bill to demand proof of citizenship from presidential candidates, Slate has a new piece about why the “birther” movement won’t go away (which I came across thanks to Eric Knowles). Of particular interest:
The irony of all the birth-certificate proposals—similar bills have been introduced in six states—is that they contain the seeds of the birther movement’s destruction. The moment Obama calls their bluff and hands his birth certificate to the Arizona secretary of state, it’s over.
In theory. That’s the beauty of the birther myth, or any conspiracy theory: No amount of evidence can ever completely dispel the questions. When Obama produced his Hawaii birth certificate and the state of Hawaii verified it, it was a fake. When reporters uncovered announcements of Obama’s birth in 1961 copies of the Honolulu Advertiser and the Honolulu Star-Bulletin, they had been planted. If the Arizona secretary of state verified Obama’s birth certificate, that would be due to the government mind-control chip implanted in his molar.
To put all this another way: Birtherism is here to stay. And not because more people are going crazy, but because crazy has been redefined…
The article puts a bit of a partisan spin on the underlying psychological explanation (oh those crazy conservatives!). But this is just another manifestation of a basic psychological process documented in the 1950s by Leon Festinger (which I’ve previously discussed in relation to the persistence of the discredited vaccine-autism link).
When people have a deeply held belief that they have publicly committed to, disconfirmatory evidence doesn’t necessarily weaken the belief. Instead, Festinger predicted — based on cognitive dissonance theory — that under the right circumstances, disconfirmatory evidence can make beliefs grow stronger and metastasize. Festinger first documented this phenomenon among doomsday cultists, and it plays out again and again among modern conspiracy theories from the left, right, and in between.
Where this all intersects with politics, as the Slate piece points out, is in how politicians can fan and exploit conspiracies. Festinger laid out five conditions for disconfirmatory evidence to intensify belief. The fifth is, “After the disconfirming evidence comes to light, the believer has social support from other believers.” Politicians speaking in code can sound to believers like sympathetic supporters, while still leaving themselves plausible deniability in more rational circles. I don’t know how many have read Festinger, but they almost certainly know what they’re doing.
Update: Brendan Nyhan sent me a link to a forthcoming paper of his (with Jason Reifler) titled When Corrections Fail: The Persistence of Political Misperceptions. In a series of experiments, they show that correcting misperceptions can backfire when the correction runs contrary to somebody’s political beliefs. For example, conservatives became more certain that Iraq had WMDs prior to the war when they read news articles that corrected that misperception. Likewise for liberals and the misperception that Bush banned all stem cell research.
Their paper doesn’t test all aspects of Festinger’s model, though it does draw on work in the cognitive dissonance tradition. Festinger thought that the social context of beliefs was important: you have to publicly commit to your beliefs, and after receiving disconfirmatory evidence you have to have social support from fellow believers. Nyhan and Reifler’s experiments dealt with subjects’ pre-existing beliefs, so it’s entirely possible that those elements were part of the prior experience that subjects brought into the lab. It would be interesting to run an experiment to see if you could modulate the backfire effect by amplifying or dampening those factors experimentally.