If you read my post on the meaning of skepticism and why it’s important, I hope I’ve convinced you that most of us could stand to be more skeptical. Now let’s look at ways you can improve your skills as a skeptic. We’ll talk about standards you can use to test the validity of a claim as well as ways to recognize your inherent biases.
In that previous post, I discussed some ways to recognize when a claim warrants extra scrutiny.
- Is there a reason to lie?
- Would a lie have a significant impact on anything important?
- Who benefits from a lie?
- Is the claim extraordinary?
So now we’ll review some ways to apply that scrutiny.
How do you spot “extraordinary” claims?
From the list above, the last one—the one we often call the “Sagan Standard”—is probably the most difficult to judge. It certainly can be obvious, but it isn’t always.
Consider Einstein’s general relativity as an example. There’s a pretty good chance you’re not an expert on that theory. If you haven’t studied physics past high school, it’s a given that you are not.

I’m no expert on it, either. The theory makes some seemingly bizarre claims, and it would be easy to doubt them if you haven’t studied the science. Yet those who are experts on the theory will tell you it’s one of the best-supported theories in all of science.
So is the claim that matter warps “space-time” an extraordinary one? Not to a physicist; at least, not anymore. At one time, though, the claim was extraordinary. Fortunately, it has since been supported by overwhelming evidence.
On the flip side, many religious claims are good examples of extraordinary claims that might not seem extraordinary to those who grew up with those beliefs. Stories of miracles, no matter how ordinary they seem in the context of your faith, are not ordinary in the slightest. If they were, we wouldn’t call them “miracles.”

So, to judge the “ordinariness” of a claim, you’ll need a bit of perspective. You’ll need to consider your own bias and your own expertise.
When the claim is about a subject you don’t understand well, while it’s still wise to remain skeptical, be cautious about becoming cynical. If you don’t understand the subject well enough to evaluate the claim, it’s best to simply reserve judgment until you have a better understanding. In most cases, it’s best to trust that the people who’ve spent their entire adult lives studying the subject know what they’re talking about.
At the same time, if you are biased by cultural norms such as religious beliefs, you might be tempted to accept a claim blindly when you don’t have a good reason to do so. Questioning the validity of long-held beliefs can be one of the most challenging tasks for a budding skeptic, but it is worth the effort. Regardless of your conclusion, you’ll be much more confident in what you believe.
Can the claim be tested?
In science, there is a concept known as “falsifiability.” For a hypothesis to be scientific, there must be a way, at least in principle, to disprove it if it is wrong. A claim that cannot be falsified is not scientific, no matter how “sciencey” it sounds.
To most non-scientists, “testable” might be a better word. Is there a way to test the truth of a claim? If not, it cannot be assumed to be true. That doesn’t make it false by default, but it does imply that you cannot know with any degree of certainty that it is true.
More to the point, a claim that can’t be tested is extraordinary, and until someone provides a way to test it, “I don’t know” is the only thing you can say about it with certainty. In many cases, though, “Probably not” is a reasonable conclusion. Did aliens steal my car keys? Probably not.
Should you trust “experts”?
In practicing skepticism, you might be tempted to doubt the opinions of experts. A better approach is to confirm a person’s expertise and to look at the opinions of experts as a group: consider the consensus rather than the opinion of a single expert, regardless of that individual’s credentials.
True expert? Or charlatan?
As you’ve probably seen, charlatans and scammers often give the appearance of being experts. They speak as if they know a subject well, and if they’re good at their game, they probably do know more than you do.

No matter how knowledgeable they appear, though, if most acknowledged experts in the field disagree with them, that should raise a red flag. If they have questionable credentials, or none at all, that should also raise a red flag. If they’re claiming there’s a conspiracy to suppress the “truth,” that should raise a really big red flag. And if they’re trying to sell you something while claiming such a conspiracy, alarm bells should be going off in your head.
The more you have to lose by trusting the maverick, the more attention you should pay to those red flags. Likewise, the greater the reward to the maverick, the more attention you should pay.
Can the dissenters be correct? Maybe. But if they are, their dissenting view will eventually become the consensus. If there’s an actual conspiracy, it will eventually be found out.
None of us is an expert in everything. No matter how smart you might be, there are subjects you don’t know much about. At the same time, there are people who’ve spent much of their lives learning about exactly those subjects.
So, on subjects you don’t know well, it is usually wise to accept the knowledge of experts. That doesn’t mean taking it as gospel, but it does mean assuming they are most likely correct. If you feel you have reason to doubt them, it’s possible that you’re right, but you might just have more to learn.
Beware of “Dunning-Kruger”

We’ve all experienced it. The “Dunning-Kruger” effect is named for a study by two psychologists, David Dunning and Justin Kruger, who found that people with a little bit of knowledge on a subject very often vastly overestimate their own expertise, to the point of believing they know more than the acknowledged experts. It’s a case of “not knowing what they don’t know.”
If you’ve had kids go to college, you might have recognized this phenomenon in their freshman year. They take an introductory class on a subject and they’re ready to argue about it with everyone they know. I can’t deny having done this myself; I think most of us have. Then we learn a bit more and realize there’s a whole lot more to learn.
There’s a popular meme you may have seen: “Don’t confuse your Google search with my [law/medical/nursing/whatever] degree.” In other words, don’t be a victim of the Dunning-Kruger effect. When someone knows more than you, pay attention. Don’t follow blindly, but recognize that you are not the expert.
Avoid confirmation bias

This one has become especially challenging in the age of smartphones and Google. We all have access to just about all the information there is, nearly instantly. If we have a question, we can Google it.
I’m a big fan of this technology. I never have to settle for not knowing the answer to a question.
However, as you’re probably aware, much of the information found online is questionable and quite a bit is outright wrong. When you’re trying to answer a complicated question, be especially cautious if you want a specific answer. If you’re looking for evidence to support what you want to believe, no doubt you can find it online.
“Confirmation bias” is what happens when you’re more willing to accept evidence that supports what you want to believe and more willing to reject anything that goes against it. Like the Dunning-Kruger effect, we’ve all experienced this. While working to improve the skill of skepticism, though, you owe it to yourself to be especially aware of this tendency.
If you find yourself typing phrases like “proof of ___” or “evidence for ___” into your favorite search engine, a good practice is to also search specifically for the opposing side. Recognize that the phrasing of your search might have confirmation bias built into it, meaning that you yourself are probably biased.
Social media’s impact on confirmation bias and skepticism
Search engines and social media make the problem of confirmation bias worse than it’s ever been. The algorithms that decide what we see online are largely guided by what we click. If you habitually click the links supporting one view while ignoring the opposing side—and most of us do exactly that—those algorithms “learn” your bias and present you with even more one-sided content. Over time, you see less information that might balance your view, and your bias is strengthened. Skepticism becomes more challenging when you only see one side of an argument.
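To make that feedback loop concrete, here’s a minimal, hypothetical sketch in Python. It is not any real platform’s algorithm, and the names (pick_feed, view_A, view_B) are my own invention; it only shows how a feed that boosts whatever you click quickly narrows toward one side.

```python
import random

def pick_feed(weights, n=5):
    """Sample n items for today's feed, favoring topics with higher click weights."""
    topics = list(weights)
    total = sum(weights.values())
    return random.choices(topics, weights=[weights[t] / total for t in topics], k=n)

# Start with no bias between two opposing views.
weights = {"view_A": 1.0, "view_B": 1.0}

for day in range(30):
    for item in pick_feed(weights):
        if item == "view_A":      # you only ever click the side you already agree with
            weights[item] += 1.0  # ...and the "algorithm" boosts that side a little more

print(weights)  # after a month, view_A's weight typically dwarfs view_B's
```

Run it a few times and the result is the same: the more one-sided your clicks, the more one-sided the feed becomes.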

If you’ve been around long enough, you have very likely noticed increasing divisiveness in our society. What I’ve just described is a large part of the reason for that very real change, and it makes skepticism all the more important.
As difficult as it is, a good skeptic will make the effort to learn about opposing views, and especially to understand those views, even without agreeing with them.
Occam’s Razor as a Skeptic’s Tool
One “test” to use when comparing two claims, or two explanations for an observation, is called the law of parsimony; it’s better known as Occam’s Razor. The most familiar phrasing of this principle is, “The simplest solution is most likely the right one.”
Many people misunderstand “simple” in this context, though. I prefer to say, “The solution that requires the fewest assumptions is more likely correct.”
As an example, let’s imagine your puppy is missing and you can only think of two possible explanations: The puppy ran away when you left the door open, or the puppy was kidnapped by inter-dimensional beings and transported through a portal to another dimension.
The first requires us to assume that you could have left the door open and that the puppy was able to slip by you without your seeing it.
The second requires us to assume several things:
- Other dimensions exist
- Other beings that you are unable to see or interact with exist in those other dimensions
- Those inter-dimensional beings are able to interact with beings in this dimension
- The beings have an interest in your puppy
- A portal exists between dimensions
- Your puppy is able to be transported through that portal
There are probably more assumptions that I haven’t considered, but I think it’s clear which explanation requires fewer assumptions. This is an obviously far-fetched example to illustrate a point, but I think you can see that the more assumptions we need to make, the less likely an explanation is to be correct.
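If it helps to see that bookkeeping laid out, here’s a trivial sketch in Python (my own illustration, nothing more) that simply counts the assumptions from the example above and sorts the explanations by that count.

```python
# List each explanation's assumptions; Occam's Razor favors the shortest list.
explanations = {
    "the puppy slipped out the open door": [
        "you left the door open",
        "the puppy got past you unseen",
    ],
    "inter-dimensional beings took the puppy": [
        "other dimensions exist",
        "beings you can't see or interact with live there",
        "those beings can interact with our dimension",
        "those beings want your puppy",
        "a portal exists between dimensions",
        "the puppy can be transported through it",
    ],
}

# Sort by assumption count; the first entry printed is the razor's pick.
for name, assumptions in sorted(explanations.items(), key=lambda kv: len(kv[1])):
    print(f"{len(assumptions)} assumptions: {name}")
```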
Use the razor with caution
Occam’s Razor is a useful tool, but you shouldn’t rely on it as your sole method of determining the accuracy of a claim. The simpler explanation is more likely to be correct, but that isn’t guaranteed. You can use this tool most effectively when comparing claims that have already been shown to be plausible; the inter-dimensional being example above should realistically have been eliminated from consideration on that basis alone.
Skepticism is a life-long practice
I’ve given you a few tools to begin your journey, but skepticism is something that requires practice. Like any skill, it improves the more you work at it, and you will never reach perfection. Everyone can be fooled. You practice to minimize that likelihood.
Perhaps the most important phrase any skeptic should learn to use more often is, “I don’t know.” Not knowing is not a weakness. Thinking you know when you don’t, is.
Did you like this post? Do you have criticism? Anything I missed? Let me know in the comments!
Want to be notified when I post new content? Drop your email address in the box below and I’ll let you know.