A group of psychologists gave a bunch of people a math problem:
Lots of people get this wrong. (That’s not the surprising part.)
Here’s where things get more interesting. They changed the problem into one about gun control: leaving the numbers exactly the same, they posed the problem as one about whether cities that enacted gun control laws saw an increase or a decrease in crime. They also surveyed the participants to determine (a) their political views and (b) some sort of measure of their numeracy. In both the skin-rash and gun-control cases, they had two versions of the question, in which all that was switched was the right answer — that is, they interchanged the labels “Rash got better” and “Rash got worse,” leaving all the numbers unchanged.
The top graph is for the skin-rash version of the study. The bottom graph is the gun-control version. In time-honored fashion, red is conservative and blue is liberal.
No big surprises in the skin-rash results (except possibly the suggestion that the curves rise a bit at very low numeracy, but I don’t know how significant that is). The action is all in the lower graph, which indicates the following:
- Both liberals and conservatives are more likely to get the answer right when it accords with their political preconceptions.
- For liberals, there’s a large difference only among the highly numerate — the innumerate do equally badly whichever political valence they see.
Fact 1 is interesting but not astonishing. Fact 2 is the really fascinating one.
The proposed explanation:
Our first instinct, in all versions of the study, is to leap to the wrong conclusion. If you just compare which number is bigger in the first column, for instance, you’ll be quickly led astray. But more numerate people, when they sense an apparently wrong answer that offends their political sensibilities, are both motivated and equipped to dig deeper, think harder, and even start performing some calculations — which in this case would have led to a more accurate response.
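The trap and its fix come down to a few lines of arithmetic. Here’s a minimal sketch with made-up numbers (chosen so that the shortcut and the correct method disagree — these are not the study’s actual figures):

```python
# Illustrative 2x2 data, in the shape of the skin-rash problem.
# The counts are invented for this sketch, not taken from the study.
treatment = {"improved": 200, "worsened": 75}
control = {"improved": 100, "worsened": 20}

# The tempting shortcut: compare raw counts in the "improved" column.
naive_winner = "treatment" if treatment["improved"] > control["improved"] else "control"

# The correct comparison: the improvement *rate* within each group.
def improvement_rate(group):
    return group["improved"] / (group["improved"] + group["worsened"])

correct_winner = (
    "treatment"
    if improvement_rate(treatment) > improvement_rate(control)
    else "control"
)

print(naive_winner)    # treatment: its raw "improved" count is bigger
print(correct_winner)  # control: its improvement rate is higher (~83% vs ~73%)
```

The raw counts point one way because the treatment group is simply larger; only the within-group rates answer the question actually being asked.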
“If the wrong answer is contrary to their ideological positions, we hypothesize that that is going to create the incentive to scrutinize that information and figure out another way to understand it,” says Kahan. In other words, more numerate people perform better when identifying study results that support their views—but may have a big blind spot when it comes to identifying results that undermine those views.
What’s happening when highly numerate liberals and conservatives actually get it wrong? Either they’re intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further—or else they’re stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn’t equal 2 in this particular instance. (Kahan suspects it’s mostly the former, rather than the latter.)
The Mother Jones article suggests that “both liberals and conservatives” show effect number 2 — highly numerate people being more biased than less numerate people — but from the graph the effect seems much smaller for the conservatives. It seems to me that the liberals are the ones whose behavior needs explaining.
With that caveat, this explanation may well be right, but I wonder if there might be other effects. Maybe highly numerate liberals differ from less numerate liberals in some other way, such as more strongly held political views. I’m not necessarily espousing that specific explanation; I’m just wondering. I’d be interested to hear other ideas.