The Scientific American website has a social science article that is somewhat relevant to the topics we discuss here. It's "Why People 'Fly from Facts'," by Troy Campbell and Justin Friesen. It has this subtitle: "Research shows the appeal of untestable beliefs, and how it leads to a polarized society."
It’s not specifically about The Controversy between evolution and creationism, but it seems to shed some light on a tactic with which we’re all familiar — moving the goalposts, which Wikipedia describes as:
a metaphor, derived from association football or other games, that means to change the criterion (goal) of a process or competition while still in progress, in such a way that the new goal offers one side an intentional advantage or disadvantage. … Moving the goalposts, similar to “shifting sands” and also known as raising the bar, is an informal fallacy in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded. That is, after an attempt has been made to score a goal, the goalposts are moved to exclude the attempt. The problem with changing the rules of the game is that the meaning of the end result is changed, too.
The Scientific American article never uses the phrase “moving the goalposts,” but that’s what the authors seem to be talking about. They begin with an example:
“There was a scientific study that showed vaccines cause autism.”
“Actually, the researcher in that study lost his medical license, and overwhelming research since then has shown no link between vaccines and autism.”
“Well, regardless, it’s still my personal right as a parent to make decisions for my child.”
That’s rather tame. We’ve seen much better in the creationism controversy. This is typical:
Creationist: There are no transitional fossils, so Darwin was wrong.
Sane person: Here’s a whole list of transitional fossils.
Creationist: Oh yeah? Well Hitler was a Darwinist, and you’re going to hell!
Anyway, that’s the kind of argument we’re talking about. Here’s what the authors have to say:
Our new research, recently published in the Journal of Personality and Social Psychology, examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people just dispute the validity of specific facts. But we find that people sometimes go one step further and, as in the opening example, they reframe an issue in untestable ways. This makes potentially important facts and science ultimately irrelevant to the issue.
We can’t locate their published paper, but we can get along without it. Then the authors give an example from their research:
We presented 174 American participants who supported or opposed same-sex marriage with (supposed) scientific facts that supported or disputed their position. When the facts opposed their views, our participants — on both sides of the issue — were more likely to state that same-sex marriage isn’t actually about facts, it’s more a question of moral opinion. But, when the facts were on their side, they more often stated that their opinions were fact-based and much less about morals. In other words, we observed something beyond the denial of particular facts: We observed a denial of the relevance of facts.
We’re all familiar with that phenomenon. When dealing with creationists, we usually refer to it as reality denial. They give another example and then say:
These experiments show that when people’s beliefs are threatened, they often take flight to a land where facts do not matter. In scientific terms, their beliefs become less “falsifiable” because they can no longer be tested scientifically for verification or refutation.
Yup — that’s how it goes with creationists. Then, as so often happens with social science, the topic swerves into politics:
For instance, sometimes people dispute government policies based on the argument that they don’t work. Yet, if facts suggest that the policies do work, the same person might stay resolvedly against the argument based on principle. We can see this on both sides of the political spectrum, whether it’s conservatives and Obamacare or liberals and the Iraqi surge of 2007.
Here, your Curmudgeon goes his own way. Principle really matters in political issues. A specific government policy might be effective — for example, dictatorships can be terribly effective in enforcing their policies — but that doesn’t mean opposition based on principle is fallacious. Anyway, let’s read on:
One would hope that objective facts could allow people to reach consensus more easily, but American politics are more polarized than ever. Could this polarization be a consequence of feeling free of facts?
Again, we have to point out that political polarization may not be about facts — on either side. It is about facts — or it should be — when it comes to things like teaching creationism in science class. But political conflicts are all too often about the ideology of one party or another, sometimes both. The article continues:
[W]e can experimentally assess a fundamental question: When people are made to see their important beliefs as relatively less rather than more testable, does it increase polarization and commitment to desired beliefs? Two experiments we conducted suggest so.
We’ll skip their example, but here’s what they say about it:
Together these findings show, at least in some cases, when testable facts are less a part of the discussion, people dig deeper into the beliefs they wish to have — such as viewing a politician in a certain way or believing God is constantly there to provide support. These results bear similarities to the many studies that find when facts are fuzzier people tend to exaggerate desired beliefs.
One final excerpt:
So after examining the power of untestable beliefs, what have we learned about dealing with human psychology? We’ve learned that bias is a disease and to fight it we need a healthy treatment of facts and education. We find that when facts are injected into the conversation, the symptoms of bias become less severe. But, unfortunately, we’ve also learned that facts can only do so much. To avoid coming to undesirable conclusions, people can fly from the facts and use other tools in their deep belief protecting toolbox.
Well, dear reader, what did we learn from this excursion into social science? We’re not sure we learned anything new, but perhaps we missed something. What do you think?