There’s been no shortage of fact-checkers this campaign season. But Washington Post columnist Shankar Vedantam explains that a number of new studies suggest people don't let go of political misinformation after hearing a correction. In fact, the misinformation spreads.
BOB GARFIELD: This is On the Media. I'm Bob Garfield.
BROOKE GLADSTONE: And I'm Brooke Gladstone. This political season there’s been no shortage of fact-checkers and truth squads trying to cut through the fog of campaign-generated misinformation. Just last week we had on an editor of the fact-checking website PolitiFact. Their mission is based on the logical assumption that good information will push out the bad. The trouble is it doesn't.
Washington Post columnist Shankar Vedantam was on our show a year ago, armed with studies suggesting that attempts to end rumors may actually help spread them. This week, his column cites newer studies that focus on this phenomenon, specifically in the realm of politics.
In one study, John Bullock at Yale University exposed subjects to an ad by an abortion rights group bashing then-Supreme Court nominee John Roberts for supporting fringe groups that bombed abortion clinics. The ad caused viewers to lower their opinion of Roberts.
Then the subjects were shown a refutation of the ad, explaining why it was false. Bullock also told them that the abortion rights group had withdrawn the ad. But -
SHANKAR VEDANTAM: Bullock found that when Democrats heard the misinformation and the refutation, many Democrats continued to think worse of Roberts than they had originally. So, in other words, even though the conventional view is that the correction fixes the bad information, Bullock found this was not the case.
BROOKE GLADSTONE: What they found specifically was that Democrats still retained some of the bad impression they got when they consumed the bad information, whereas on the Republican side, the approval rating went back exactly to where it was before they heard the bad information and the refutation.
SHANKAR VEDANTAM: That’s exactly right. So, in other words, when people are predisposed to buy a piece of bad information, the refutation ends up not correcting the misinformation entirely. It corrects it only part way, leaving residual feelings of negativity toward whoever the target of the misinformation was.
BROOKE GLADSTONE: Some of these new studies found that the conservative reaction to bad information, and then to the correction of that bad information, was different from the liberal reaction.
SHANKAR VEDANTAM: On the face of it, it’s actually inexplicable. The correction actually seemed to make the bad information worse.
BROOKE GLADSTONE: I don't get it. Give me an example.
SHANKAR VEDANTAM: The researchers, Brendan Nyhan and Jason Reifler, brought in a bunch of Republicans and told them about the Bush Administration’s claims that there were weapons of mass destruction in Iraq before the 2003 U.S. invasion. And then they provided the volunteers with essentially a correction of that information.
About 34 percent of conservatives believed that Iraq had either hidden or discarded the weapons of mass destruction before the U.S. invasion, but after they heard both claim and refutation, 64 percent of conservatives believed that Iraq had had the weapons of mass destruction.
In other words, the refutation caused more people to believe the Bush Administration’s claim than had believed it before.
BROOKE GLADSTONE: Are there any other examples of that?
SHANKAR VEDANTAM: There was one other example – the claim that tax cuts increase revenue. This has been a subject of some contention. And, again, the researchers, Brendan Nyhan and Jason Reifler, brought in both conservatives and liberals and told them about this claim of the Bush Administration, and then provided them with a refutation by several economists, including several current and former officials who worked for the Bush Administration.
BROOKE GLADSTONE: People who were arguing against the idea that tax cuts increase revenue.
SHANKAR VEDANTAM: That's right. And where 35 percent of conservatives believed the claim that tax cuts increase revenue before they heard the refutation, 67 percent of those provided with both the claim and the refutation believed the claim.
So, again, the refutation strengthened the power of the bad information rather than weakening it.
BROOKE GLADSTONE: How is this possible?
SHANKAR VEDANTAM: The most plausible explanation seems to be that when conservatives are strongly emotionally invested in a point of view and they hear a refutation, they might start to argue back against the refutation in their own minds. And this internal argument is so strong that it eventually persuades even more of them that the misinformation was accurate.
And it may be that this is true for liberals as well as conservatives. It may be that the researchers found this effect primarily among conservatives because the questions they asked were particularly hot-button issues for conservatives.
BROOKE GLADSTONE: We have to say at this juncture that researchers Nyhan and Reifler are both self-described Democrats. Can conservatives believe what they say?
SHANKAR VEDANTAM: [LAUGHS] As someone who writes about science, the reason I wrote about these experiments is that they were all controlled experiments. You could argue, I suppose, that it’s possible the researchers falsified their data, but short of that, it’s difficult to see how they could have arrived at this conclusion because of their own preconceived political notions.
BROOKE GLADSTONE: When we discussed this last year, you suggested that the only way the media can correct a lie without reinforcing it is not to state the original lie. I don't know, what’s a media to do?
SHANKAR VEDANTAM: The researchers don't have a very good handle on what should be done to solve this problem. When I spoke with Jason Reifler at Georgia State University, he said that he was conducting some experiments to test a very novel approach: what happens if you boost people’s self-esteem at the same time that you’re giving them a refutation?
BROOKE GLADSTONE: What do you mean? Like, nice shoes, and, by the way, that story is wrong?
SHANKAR VEDANTAM: [LAUGHS] Well, sort of. Along the lines that, you know, when I write a column and my editor wants to edit something out of it, I'm much better able to hear his criticisms when he tells me things about the column that he likes.
And what Jason Reifler and colleagues are thinking is if you can somehow convey to political partisans that changing their mind doesn't necessarily mean admitting that they are stupid or deranged, they might be more likely to change their minds.
In other words, if there were a way to turn down the emotional temperature, as it were, people might be better able to perceive the information, and the refutations of bad information, more clearly.
BROOKE GLADSTONE: As for emotion, is there any way the media can help remove emotion from a story that they're trying to set straight?
SHANKAR VEDANTAM: That’s a good question, Brooke, and, in fact, there’s a further complication, which is that in much of the media, building up emotion is precisely how you get people to read your product or watch you on television or listen to you on the radio.
And so telling media executives that the thing to do is to turn down the emotional temperature is not quite what they want to hear.
BROOKE GLADSTONE: Shankar, thank you very much.
SHANKAR VEDANTAM: Thanks so much, Brooke.
BROOKE GLADSTONE: Shankar Vedantam writes the Department of Human Behavior column for The Washington Post.