Why worldview threats undermine evidence
Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author was accused of fraud. The 9/11 truthers focus on minutiae such as the melting point of the steel that caused the World Trade Center buildings to collapse, because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the parts per million of greenhouse gases because they are passionate about freedom, especially the freedom of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president's long-form birth certificate in search of fraud because they believe that the nation's first African-American president is a socialist bent on destroying the country.
In these examples, proponents' deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slain. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.
Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), in their 2007 book Mistakes Were Made (But Not by Me), document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs in order to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at opposite corners of the base as they each stake out a position to defend.
In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identified a related factor they call the backfire effect, “in which corrections actually increase misperceptions among the group in question.” Why? “Because it threatens their worldview or self-concept.” For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article stating that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite ... and more: they reported being even more convinced there were WMD after reading the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”
If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience: (1) keep emotions out of the exchange; (2) discuss, don't attack (no ad hominem and no ad Hitlerum); (3) listen carefully and try to articulate the other position accurately; (4) show respect; (5) acknowledge that you understand why someone might hold that opinion; and (6) try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people's minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.