Suppose you believe something with your whole heart. Suppose you are presented with evidence that your belief is wrong. What’s likely to happen?
Most people assume they’d change their mind. Not so, said Leon Festinger, who posed this question in his landmark study of what he called “cognitive dissonance.” Festinger’s findings tell us why change is so hard. Yet there is a way to change behavior, one practiced by many companies, including Pixar, and by two-thirds of the worldwide church.
Leon Festinger first published his research in 1957 in A Theory of Cognitive Dissonance. His theory explained how people are capable of holding on to a belief despite evidence to the contrary. Festinger argued that people naturally seek to maintain consistency in their beliefs. So when facts or evidence come along that disturb their settled assumptions, a state of cognitive dissonance occurs. That should be a good thing, but it often isn’t.
It turns out there’s a dark side to dissonance. Often a set of mechanisms, or triggers, seeks to restore consistency to a person’s thinking by distorting or ignoring reality. Festinger first noticed this tendency in religious groups. When deeply held beliefs come into conflict with reality, religious groups tend to develop explanations that reframe reality to fit their beliefs, rather than vice versa. Festinger cited the Millerites as one example of cognitive dissonance gone awry.
The Millerites were a millenarian religious sect that believed Jesus Christ would return to earth on October 22, 1844. That didn’t happen. But rather than abandon their faith, many Millerites constructed elaborate rationalizations to justify their belief, arguing that Christ had returned spiritually, or that the event had occurred in heaven, if not on earth. This is a case of cognitive dissonance becoming cognitive resistance to reality, as Festinger noted in an earlier book, When Prophecy Fails:
Suppose an individual believes something with his whole heart…suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong; what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting people to his view.1
Deep devotion can push cognitive dissonance in the wrong direction. Rather than experiencing dissonance as a positive push to change, Festinger observed that “true believers” go deeper into their faith in order to blunt the threat of disconfirmation. This reality extends beyond religion to all sorts of professions, including economics. Many economists devoted to their theories ignored a great deal of contrarian data leading up to the Great Recession of 2008. Philip Mirowski, the Carl Koch Professor of Economics and the History and Philosophy of Science at the University of Notre Dame, writes, “Philosophy of science revels in the ways in which it may be rational to discount contrary evidence, but the social psychology of cognitive dissonance reveals just how elastic the concept of rationality can be in social life.”2
There is a way to make cognitive dissonance a constructive experience. In Switch: How to Change Things When Change Is Hard, Chip and Dan Heath argue that change is more a matter of behavior than mindset. The Heaths say many leaders mistakenly assume change is rational and that it happens in this order: analyze-think-change. Wrong. The real sequence of change is experience-feel-change. It’s a matter of experiencing small disruptions. “For anything to change, someone has to start acting differently.”3 Cognitive dissonance is constructive in cultures where workers routinely experience little disruptions. That’s the work of contrarians, or crap detectors, in company leadership.
Pixar has instilled a crap-detecting culture at Disney. The Economist writes that Pixar has reinvigorated Disney by fostering a sense of collective responsibility and an aptitude for receiving constructive criticism. It started with a culture of constant feedback: after each film is completed, “Pixar demands that each review identify at least five things that did not go well in the film, as well as five that did.”4 Company leaders routinely experience disruption, so that when significant data debunks their assumptions, cognitive resistance doesn’t occur.
In the church, crap detectors are called prophets. They disrupt taken-for-granted assumptions, as can the Eucharist, which two-thirds of the worldwide church practices week in and week out. “Take and eat” is an experience of the deepest disruption of the created order: out of death comes life. This cognitive dissonance is constructive since communion, practiced frequently enough, deepens devotion while reminding us of our depravity. It lowers cognitive resistance when we are presented with evidence that an assumption (such as an Enlightenment approach of analyze-think-change) might be wrong. Churches whose established cultures include a prophetic voice and a regular experience of the Eucharist are more likely to push cognitive dissonance in the right direction.
Cognitive dissonance is critical since coming to faith is only the beginning of becoming undeceived. Faith communities exist to deepen belief but also to create experiences of cognitive dissonance. That way, fervor is held in tension with our fallenness. Otherwise, faith communities are likely to reframe reality to fit their beliefs when they discover that some of their strategies to change the world might be incorrect.
1 Leon Festinger et al., When Prophecy Fails (University of Minnesota Press, 1956), 3.
2 Philip Mirowski, “The Great Mortification: Economists’ Response to the Crisis of 2007-(and counting),” The Hedgehog Review 12, no. 2 (Summer 2010): 35.
3 Chip Heath and Dan Heath, Switch: How to Change Things When Change Is Hard (Broadway Books, 2010), 4.
4 “Planning for the Sequel,” The Economist, June 19, 2010, 73.