Exuberant Irrationality

Why is self-deception so easy and common, self-correction so difficult and rare? Pride is the safe answer. OK, but why does pride affect us, almost all of us, in this way? Most of us believe what we want to believe and surround ourselves with people who agree with us -- or else with people who disagree with us in a way that nourishes our vanity: "I can make friends with anyone. Not only am I right, but even people who are wrong like me." When we ask for advice, we often do so in a way that ensures we get the advice we wanted -- and we often ask for this advice only after we've already made up our minds. What we are really seeking in such cases is not so much new wisdom as an endorsement of our own old wisdom, or at least permission to do what we were going to do anyway.

This genius for rationalization is universal. You can find it in an eight-year-old. You can find it among believers and nonbelievers, among liberals and conservatives. In last Monday's Washington Post, Shankar Vedantam had a column about new studies that show how correction can actually aggravate the effect of an error by giving those who were mistaken a chance to dig in:

Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration's prewar claims that Iraq had weapons of mass destruction. One group was given a refutation -- the comprehensive 2004 Duelfer report that concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003. Thirty-four percent of conservatives told only about the Bush administration's claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of conservatives who heard both claim and refutation thought that Iraq really did have the weapons. The refutation, in other words, made the misinformation worse.

And even when refutation does not compound the effects of an error, it can't entirely undo the damage. Again Vedantam:

In experiments conducted by political scientist John Bullock at Yale University, volunteers were given various items of political misinformation from real life. One group of volunteers was shown a transcript of an ad created by NARAL Pro-Choice America that accused John G. Roberts Jr., President Bush's nominee to the Supreme Court at the time, of "supporting violent fringe groups and a convicted clinic bomber."... Bullock then showed volunteers a refutation of the ad by abortion-rights supporters. He also told the volunteers that the advocacy group had withdrawn the ad. Although 56 percent of Democrats had originally disapproved of Roberts before hearing the misinformation, 80 percent of Democrats disapproved of the Supreme Court nominee afterward. Upon hearing the refutation, Democratic disapproval of Roberts dropped only to 72 percent.

If we do finally climb down from an error -- usually because there is no longer any alternative -- we try to wait until no one is looking. And once we reach firm ground, we run as far away from the error as possible, as fast as possible, so that there will be no risk of anyone's associating us with what we used to believe. Then, from a secure distance, we look back at the error and say, "How could anyone believe something so stupid? Not me, certainly. I must have been a different person then." Even when then is an hour ago.

So why is it so hard for us to change our minds when this also means admitting that we were wrong? The problem is that being right or wrong is never impersonal. One's self-understanding is constituted as much by one's beliefs about the world as by one's memories. Losing one's beliefs is therefore as traumatic as losing one's memory. Not only disorienting but threatening -- threatening the self with disintegration. This is an example of what psychologists call cognitive dissonance. Here the dissonance is between what the available evidence suggests and what we think we can afford to acknowledge. "If that were true, then I'd look like a fool." Or, at the limit: "If that's true, then I've wasted my life." One of the most powerful and sobering passages in St. Paul's letters is this one, from the reading for last Friday's Mass:

If there is no resurrection of the dead, then neither has Christ been raised. And if Christ has not been raised, then empty too is our preaching; empty, too, your faith. Then we are also false witnesses to God, because we testified against God that he raised Christ, whom he did not raise if in fact the dead are not raised. For if the dead are not raised, neither has Christ been raised, and if Christ has not been raised, your faith is vain; you are still in your sins. Then those who have fallen asleep in Christ have perished. If for this life only we have hoped in Christ, we are the most pitiable people of all. (1 Corinthians 15:13-19)

It is this if, then that is so terrifying and necessary. The alternative is a dehumanizing pragmatism: "What does it matter if it's true or not as long as it works -- as long as it gets me through the day?" Insofar as open-mindedness is an intellectual virtue, it isn't just about tolerating new ideas; it's about being able to accept a new idea that can only make room for itself by displacing an old one. This sort of open-mindedness, in ourselves or others, demands that we be able "to give a reason for our hope." Lucidity is the skeptic's favorite virtue, but this should not prevent Christians from seeing that it is a virtue, a strength of mind that grace provides and confirms -- if we will let it.

Postscript: In a short essay on the First Things website, Fr. Edward T. Oakes, S.J., quotes this suggestive passage from an essay by T.S. Eliot published in 1930:

I believe that the skeptic, even the pyrrhonist, but particularly the humanist-skeptic, is a very useful ingredient in a world which is no better than it is. In saying this I do not think that I am committing myself to any theological heresy. The ideal world would be the ideal Church. But very little knowledge of human nature is needed to convince us that hierarchy is liable to corruption, and certainly to stupidity; that religious belief, when unquestioned and uncriticised, is liable to degeneration into superstition; that the human mind is much lazier than the human body. . . . If we cannot rely, and it seems that we can never rely, upon adequate criticism from within, it is better that there should be criticism from without.

Matthew Boudway is senior editor of Commonweal.

