What if I told you that I can turn invisible? You probably wouldn’t believe me and would want me to prove it. But what if I said that I can only turn invisible when no one is looking at me? You might recognize that you have no way to disprove my claim, but I doubt you are going to be compelled to take me seriously.
Astute students of critical thinking may recognize this technique as special pleading. Special pleading is a logical fallacy in which we respond to disproofs of our claims or beliefs by saying those disproofs don’t count, because our claim or belief is special. A classic example is when a psychic agrees to have their powers tested and, upon failing the test, says the test was unfair or designed to produce a negative result. “I told you that I could locate water blindfolded using only a dowsing stick, but now that I’ve failed I realize you didn’t use water from an underground source.” This form of special pleading usually comes after the fact, meaning we present claims or beliefs as if they could be tested, but then, upon failing, we decide the test doesn’t apply.
In the above example we are attacking the test itself. There is another, more common form of special pleading in which we do not condemn the test but instead modify our claim. This is a form known as moving the goalposts. The result is the same – our claims cannot be tested – but the path is a little different. Carl Sagan illustrates this beautifully in his book The Demon-Haunted World, in which he claims to have a dragon in his garage.
“A fire-breathing dragon lives in my garage”
Suppose I seriously make such an assertion to you. Surely you’d want to check it out, see for yourself. There have been innumerable stories of dragons over the centuries, but no real evidence. What an opportunity!
“Show me,” you say. I lead you to my garage. You look inside and see a ladder, empty paint cans, an old tricycle–but no dragon.
“Where’s the dragon?” you ask.
“Oh, she’s right here,” I reply, waving vaguely. “I neglected to mention that she’s an invisible dragon.”
You propose spreading flour on the floor of the garage to capture the dragon’s footprints.
“Good idea,” I say, “but this dragon floats in the air.”
Then you’ll use an infrared sensor to detect the invisible fire.
“Good idea, but the invisible fire is also heatless.”
You’ll spray-paint the dragon and make her visible.
“Good idea, but she’s an incorporeal dragon and the paint won’t stick.”
And so on. I counter every physical test you propose with a special explanation of why it won’t work.
So, rather than saying the test itself was the problem, we are left with a convoluted claim that defies every test. We’ve moved the goalposts so far that no one can reach them. This is sometimes referred to as supernatural creep because we can start with a claim that is plausible and ever so slowly creep towards the supernatural, crossing the line of falsifiability along the way. A claim that there is a race of Bigfoot living undetected in the forest is at least something we can investigate. But when we fail to find Bigfoot using traps, dogs, and trail cameras, the claim might then be modified to explain that Bigfoot has the ability to somehow sense and avoid traps, dogs, and trail cameras. Eventually we get to a place where Bigfoot is a psychic interdimensional being capable of popping in and out of existence as he pleases. And yes, there are people making such claims. Again, upon hearing such a claim you may recognize that you have no way to disprove it, yet you still wouldn’t be compelled to take it seriously.
But what if I don’t want to wait for testing to make my claim special? What if I want to preemptively shield my claim from criticism or testing of any kind? Luckily, I have another form of special pleading that I can apply before the fact. This is known as an immunized hypothesis. I can simply construct my claims in such a way as to deflect objections of any kind. I can claim to have psychic powers that do not work in the presence of skeptics, or I can claim that ghosts are real but only show themselves to people who already believe in them. I am still adding caveats to my claims, but I am doing it before people have a chance to question them. It’s like adding wheels to the goalposts before the game even starts. This may not seem like a particularly persuasive technique to use in the real world, yet we see two common forms of this tactic on social media that seem to enjoy a considerable amount of success.
If I tell people that biotech companies are forcing farmers to use their seeds and produce poisonous crops, and then I respond to any objections by asking “who paid you to say that,” I’ve immunized my claim from any criticism. I just include the idea that people are being paid to disagree with me as part of my claim. Anyone who argues against me can be dismissed as simply being a paid shill, which confirms at least part of my hypothesis and relieves me of any duty to defend it. This may sound silly and transparent, but it’s something skeptics and science communicators hear on a daily basis, often from people who are completely sincere. It’s also a very common technique used by internet gurus and promoters of alternative medicine to shield themselves from criticism, some of whom have millions of followers. It’s known as the shill gambit.
If I tell you that naturopaths know of a particular fruit which can cure cancer but that such information is being suppressed by Big Pharma, I’ve immunized my claim from testing of any kind. Not only can I dismiss detractors as merely being part of the conspiracy, I can dismiss all scientific evidence as well. After all, science is just a method of testing claims, and if my claim cannot be tested, that makes any scientific study or review ineffective. I don’t have to produce clinical trials if my claim includes the idea that such trials are being denied. If there actually have been trials but they are negative, I can claim that these negative results were manufactured by the conspiracy. Conspiracy is the easiest and ultimate method of immunizing a hypothesis, since all evidence against the hypothesis simply becomes part of the conspiracy. This technique is again a favorite of gurus and promoters of alternative medicine, but it also lends itself to all forms of science denial. It’s no coincidence that denial of everything from vaccines to climate change to the moon landing grounds itself in conspiracy rhetoric. It’s easy, it’s simple, and it’s encompassing. But most importantly, for millions of people, it works.
Although claims of conspiracy and allegations of shilling should be as unimpressive as the claim that I can only turn invisible when no one is looking, these techniques are prominent and make up a large portion of the push-back science communicators receive when trying to debunk pseudoscientific claims. It’s also something you will commonly hear just talking among friends. For a large portion of the public, this small bit of sophistry is enough to make the conclusions seem reasonable and to shut down any inquiry.
While the fact that conspiracies do happen and that entities have been known to bankroll disinformation campaigns is enough to warrant skepticism, the mere accusation itself should not be enough to convince anyone. “Follow the money” can indeed be pertinent advice, but you have to actually do the following and show your work, not merely construct a narrative that incorporates a profit motive. A good skeptic knows to ask for evidence, and when a request for evidence is met with nothing more than an immunized hypothesis, that’s a good reason not to take the claim seriously. If my claim only holds up when no one is looking, it’s not a claim worth looking at.
I’ve dealt with quite a few of these types of arguments regarding my Myth of the Wellness Warrior series. There certainly seems to be no shortage of willingness to forgo critical thinking.