Two Men, Two Realities
To see just how easy it is to be fooled, one need only visit the controlled confines of the university laboratory. In the spring of 2006, Nyhan and his research partner Jason Reifler of Georgia State University gathered conservative and liberal students to test their resistance to factual information. They asked the group to read an article that included President George W. Bush’s claim that his tax cuts had increased revenue for the U.S. Treasury, a claim that was provably false. Then they added a factual correction: the Bush tax cuts led to a three-year decline in tax revenue, from $2 trillion in 2000 to $1.8 trillion in 2003.
The correction worked among liberals, but among conservatives it produced a curious backfire effect: conservatives were nearly twice as likely to say the Bush tax cuts increased revenue after they had been told this was not true. Such distortions are not limited to the conservative mind. The researchers presented an article featuring John Kerry’s claim from 2004 that he would “lift the ban on stem-cell research” imposed by Bush, followed by corrective information: Bush never actually banned stem-cell research; he prevented federal money from funding research on a subset of embryos. The true information had a corrective effect on conservatives and moderates but no impact on liberals. Once again, personal views had intervened. “The more we care about politics and the more it becomes central to our worldview, the more threatening it becomes to admit that we are wrong or our side is wrong,” Nyhan concludes. The studies show that facts that contradict our biases can actually reinforce them.
Providing even more factual information might seem like a good solution to this problem. But the reality is more complex. Researchers have demonstrated under similar conditions that pieces of false information, once heard, establish themselves as “belief echoes” that can persist even after the falsehood is corrected. Those with more information also tend to be more biased, not less. In 2006, Danielle Shani, then a Princeton graduate student, analyzed a large-scale election survey taken in 2000 that asked voters to evaluate the Clinton presidency while gauging their levels of political knowledge. She found that more-knowledgeable voters actually showed more bias. Democrats and Republicans, for example, differed predictably on whether the Clinton presidency had improved or damaged national security. But among highly informed Democrats and Republicans, the differences were starker. When asked whether the budget deficit had increased under President Clinton, those with more information exhibited a bias 5.5 times larger than those who knew less.
The bias extends to how people digest news. In a 2007 study published in the Quarterly Journal of Political Science, participants were asked to rate the bias contained in a single news report that was alternately identified as originating from Fox News, CNN or a fictional television station. Simply changing the brand attached to the report changed people’s views of the information. People made assumptions about the veracity of the news independent of what the news actually reported. “As a result, individuals sometimes create bias even when none exists,” concluded authors Matthew Baum of Harvard and Phil Gussin of UCLA. The effect was stronger among those who knew more about politics.
One hint as to why this is the case can be found in other research on the interaction between emotion and fact. Some of the same emotional impulses that lead voters to seek out more information, such as concern, insecurity and fear, also skew their ability to accept accurate information. A 2008 study by Nyhan and Reifler asked some research subjects to write a few sentences about a time when they had upheld a value that was important to them. The idea was to get subjects feeling good about themselves before their political biases were challenged by facts. The exercise worked: when presented with evidence that the 2007 Iraq troop surge had reduced the number of insurgent attacks there, supporters of withdrawing U.S. forces from the country were more likely to accept that evidence after completing the self-affirming exercise than without it. Self-confidence allowed people to overcome their biases.