More Information Doesn’t Matter (So Liberals Don’t Care If Global Warming Skeptics Have Done More Research)

Here is a pretty amazing essay, not because it is surprising, but because it describes an experiment that tested what we all probably expect: people don’t usually change their minds on ideological issues based on evidence. In “How politics makes us stupid,” Ezra Klein writes of a test that asked subjects to interpret the results of experiments. One question concerned a skin cream experiment. Those with better math skills were better able to interpret the results correctly.

But Kahan and his coauthors also drafted a politicized version of the problem. This version used the same numbers as the skin-cream question, but instead of being about skin creams, the narrative set-up focused on a proposal to ban people from carrying concealed handguns in public. The 2×2 box now compared crime data in the cities that banned handguns against crime data in the cities that didn’t. In some cases, the numbers, properly calculated, showed that the ban had worked to cut crime. In others, the numbers showed it had failed.
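The trap in that 2×2 box is that the correct reading compares rates, not raw counts. A minimal sketch of the arithmetic, using made-up illustrative numbers (not the actual figures from Kahan’s study):

```python
# Hypothetical 2x2 results, one row per group: (improved, got_worse).
# These numbers are invented for illustration only.
treatment = (200, 80)   # e.g., cities that enacted the ban
control = (90, 30)      # e.g., cities that did not

def improvement_rate(improved, worsened):
    """Proportion that improved -- the quantity that must be compared."""
    return improved / (improved + worsened)

t_rate = improvement_rate(*treatment)  # 200/280 ~= 0.71
c_rate = improvement_rate(*control)    # 90/120  = 0.75

# The intuitive-but-wrong reading compares raw counts (200 > 90) and
# concludes the treatment worked; the correct reading compares rates.
print(f"treatment rate: {t_rate:.2f}, control rate: {c_rate:.2f}")
print("treatment worked" if t_rate > c_rate else "treatment failed")
```

With these numbers the larger raw count sits in the treatment row, yet the control group improved at a higher rate, so the ban failed; that mismatch between the gut answer and the computed one is what made the task diagnostic.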

Presented with this problem, a funny thing happened: how good subjects were at math stopped predicting how well they did on the test. Now it was ideology that drove the answers. Liberals were extremely good at solving the problem when doing so proved that gun-control legislation reduced crime. But when presented with the version of the problem that suggested gun control had failed, their math skills stopped mattering. They tended to get the problem wrong no matter how good they were at math. Conservatives exhibited the same pattern — just in reverse.

Being better at math didn’t just fail to help partisans converge on the right answer. It actually drove them further apart.

Being better at math made partisans less likely to solve the problem correctly when solving the problem correctly meant betraying their political instincts. People weren’t reasoning to get the right answer; they were reasoning to get the answer that they wanted to be right.

This shouldn’t surprise us. Ezra Klein himself demonstrates the problem when he uses this phenomenon to discount global warming skeptics who have studied the issue, while crediting people who blindly accept the account of man-made global warming.

This will make sense to anyone who’s ever read the work of a serious climate change denialist. It’s filled with facts and figures, graphs and charts, studies and citations. Much of the data is wrong or irrelevant. But it feels convincing. It’s a terrific performance of scientific inquiry. And climate-change skeptics who immerse themselves in it end up far more confident that global warming is a hoax than people who haven’t spent much time studying the issue. More information, in this context, doesn’t help skeptics discover the best evidence. Instead, it sends them searching for evidence that seems to prove them right. And in the age of the internet, such evidence is never very far away.

So there it is: all Klein’s research about how smart people confirm their political ideologies is used to confirm his own political ideology. He wants an excuse to write off the people who actually research climate change claims and to commend people who are content to believe what they are told without giving it any scrutiny.

Klein’s lengthy piece contains more of value, along with more obvious ideological posturing. My take-away is that we need to train ourselves and our children to be rigorous about logic, independent thinking, and the careful gathering of information, so that the biases of reporters do not leave us with fudged data. Only individual-by-individual cultural change can produce a world in which people have the self-discipline to interpret data accurately.

And I have no reason to believe such independent critical thinking will make Global Warming alarmism look any less irrational than it is.