Now more than ever, everyone could use a healthy dose of skepticism to filter out the bullshit that bombards us incessantly. First, a few words about skepticism - what it is, why it's useful, and how it can improve your critical thinking.
Like many words that end in -ism, skepticism comes from Greek. The verb σκέπτομαι means to look at or consider carefully, and it inspired an ancient philosophical school of thinkers who called themselves "skeptics." In the modern sense, skepticism simply means questioning assertions and withholding judgment until the evidence has been considered.
When should you be skeptical? Well, certainly in some situations more than others. If your friend Bob calls you and says "apples are 50 cents off per pound at the grocery store this week," you probably needn't seek confirmation from other sources. If, on the other hand, Bob calls and says "there's a magical tree giving away apples in front of the grocery store," some skepticism is reasonably warranted. The amount of skepticism a claim deserves should be inversely proportional to how likely the claim is to be true, given your knowledge and experience: the less believable the claim, the more skeptical you should be.
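This rule of thumb has a formal counterpart in Bayes' theorem: the lower a claim's prior probability, the stronger the evidence needed to make it believable. Here is a minimal Python sketch of that idea (the function name, the priors, and the 0.95 belief threshold are my own illustrative choices, not anything from the text):

```python
def required_evidence_strength(prior, target=0.95):
    """Likelihood ratio needed to push a claim from `prior`
    probability up to `target` probability under Bayes' rule:
    posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    target_odds = target / (1 - target)
    return target_odds / prior_odds

# An ordinary claim ("apples are on sale") starts with a high prior,
# so weak evidence - Bob's word - is enough:
print(required_evidence_strength(0.90))    # ≈ 2.1

# An extraordinary claim ("a magical apple tree") starts near zero,
# so it demands evidence orders of magnitude stronger:
print(required_evidence_strength(0.0001))  # ≈ 190,000
```

The asymmetry between the two numbers is the whole point: skepticism isn't refusing to believe anything, it's scaling the evidence you demand to the implausibility of the claim.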
So, how can you form well-reasoned, well-informed opinions that don't suck? By following these rules:
1. Be a sieve, not a bucket. Or, as the saying goes, be open-minded, but not so open-minded that your brain falls out. Be open to new ideas, but not universally accepting of them. Keep your bullshit detector switched on at all times.
2. Be willing to be wrong. Good skeptics should be open to new information, even if it contradicts long-held beliefs. I cannot stress this enough: if you're unwilling to consider that you're wrong about something, you have no hope of discovering the truth. "When an honest man is mistaken, he either ceases to be mistaken, or he ceases to be honest."
3. Be aware of common pitfalls of reasoning, and know your logical fallacies so you can spot them. Our brains aren't actually wired perfectly for critical thinking; we have an overactive tendency to find patterns, and we are frequent victims of our own confirmation bias.
4. Don't fight above your weight class. Admit when you simply don't know enough about an issue to have a strong opinion about it. Not everyone can be an expert on everything, and if you don't know, just say so.
5. Question the sources. Not all information is created equal. Not all scientific studies are equally valid. Just because something is published or reported doesn't mean that it's accurate. Some scientific journals have better reputations than others. It can be extremely difficult to wade through scientific studies and decide what is credible and what is questionable, but it's an exercise well worth the effort.
With these things in mind, let's take a look at some typical areas of discussion in which people frequently have poorly-reasoned, irrational, or ignorant opinions.
Issues Involving Complicated Science
See point number 4 above. There are some things in this world which are simply best left to the experts to debate, so if you're not an expert, be aware of your limitations in understanding. A perfect example of this is climate change. Unless you're a climate scientist, I'm not interested in your opinion on the issue, because you have nothing of value to contribute to the discussion. If you disagree with the majority of climate scientists, and you're not a climate scientist, chances are good that you have a biased or uninformed opinion on the matter. Chances are not good that you have somehow understood something that thousands of experts have not.
Highly Polarizing Issues
Some issues tend to polarize opinions heavily, with few people remaining neutral. In situations like these, especially if the issue is a complicated one, polarized opinions are generally based on one-sided information and confirmation bias. A perfect example of such an issue is the Affordable Care Act, known colloquially as "Obamacare." Opinions on this piece of legislation tend to be polarized along political party lines, and several polls have shown that the general public (the same people who are eager to tell you how much they love or hate the law) has absolutely no idea what it says or does. (See The Onion for a sadly accurate portrayal.) This is because the act is 906 pages long, so the public is at the mercy of news outlets to paraphrase and interpret it. As with most complicated issues, the truth lies somewhere between the two extremes.
Issues About Food And Medicine
Issues that have a possible direct effect on our personal health are treacherous because they evoke an emotional response. Our hearts are for circulating blood, not for thinking, and an issue that stirs up emotions can cloud our critical thinking faculties. A perfect example of this is GMO food and the lightning-rod Monsanto Corporation. People are susceptible to one-sided or blatantly false information about GM technology or about companies like Monsanto because the stakes are so high - our well-being could be negatively affected. In fact, this has become such an oft-cited bad argument that the fallacy has taken the name argumentum ad Monsantum. The failure in logic here is guilt by association: assuming that a supposedly evil corporation must make an evil product. It does not follow that GMO food is bad simply because you think Monsanto is bad. It may still turn out to be true that GMO food is bad, but unbiased evidence must be put forward to prove it. In cases like this, the only rational course of action is to follow the science and the findings of regulatory organizations. See point number 5 above.
The anti-vaccination movement is similarly misled by emotion, especially because the issue frequently gravitates towards the health of children. On this issue, see point 5 above and the section below on opposing scientific consensus.
Conspiracy Theories
There's a reason that you never hear of a conspiracy theory actually being validated and becoming the new mainstream explanation for an event. Conspiracy theories are the product of a comprehensive failure of reasoning, a worst-case scenario of irresponsible cognition. The most salient example of this sort of nonsense at present is the 9/11 Truth movement, whose proponents claim that the terrorist attacks were orchestrated by the government, and that essentially every finding of the official investigation is wrong. To give an extremely abbreviated list of what's wrong with these sorts of arguments: people who think this way are tainted by strong biases, are unwilling to evaluate evidence fairly, assume the conclusion of their argument and insert it as a premise, and live in a highly protected echo chamber in which they communicate only with people who already agree with them. Conspiracy theories aren't actually theories at all; a theory is a set of testable propositions that can be used to explain certain phenomena. Conspiracy theories don't explain anything - they simply point out what they deem to be unanswered questions in the accepted explanation and then wildly posit alternatives based on no evidence. This is a perfect example of what happens when your bullshit filter fails, and your brain falls out of your excessively open mind.
Contrary to Consensus
While it's fun to be a contrarian, there are risks to standing on the wrong side of the majority. It depends on who that majority is, however: an appeal to majority opinion to support an argument is a logical fallacy (ad populum), while an appeal to scientific consensus can be a strong justification for one's position. It's certainly true that scientific consensus can be wrong, but the comforting thing about science is that it adapts to new information and is open to change.
The wonderful thing about truth is that, by necessity, it breeds consensus over time. If something is demonstrably true, eventually there will be no choice but to accept it. Thus popular misconceptions about the natural world have been eradicated over time with the discovery of new information. We now realize that the Earth isn't flat, and it isn't the center of our solar system, let alone the universe. It would be ridiculous to hold a contradictory position on these matters today, because the truth is undeniable.
Still, there are other such issues of overwhelming scientific consensus today about which some people obstinately hold contradictory views. The term manufactroversy has been coined to describe these beliefs, as they pretend that a controversy exists when in fact there is none. Darwinian evolution by natural selection, for example, is overwhelmingly supported by evidence and accepted by scientists as fact (as made hilariously manifest by Project Steve), as is man-made climate change, and the efficacy, safety, and necessity of vaccinations to prevent communicable disease. There is no controversy whatsoever in the scientific community about any of these matters, and one disagrees with these assertions only at one's own peril. To refer again to point number 4 above - if you really feel like you can justify your opinion when you disagree with overwhelming scientific consensus, you should run to the nearest scientific journal and publish your innovative competing theory. If truth is on your side, and you can demonstrate it empirically, then science will have no choice but to agree with you. Science loves to be proven wrong, and that's why it can be trusted above any other method of finding the truth.
It is incredibly difficult to approach all questions with an unbiased and open mind. It's even more difficult sometimes to separate the good information from the bad in order to discover the truth about anything. To adopt a skeptical approach to evaluating claims means not to make a judgment before careful analysis of the best available evidence. It's too easy to agree with our friends on complicated questions or look to our favorite news outlet to tell us how to feel about controversial issues. Thinking for yourself is hard work, but being a skeptic is the best way to be sure that your opinions don't suck.
Resources for Skeptics
- The Skeptic's Guide to the Universe - with a great weekly podcast
- The Skeptic's Dictionary - great summaries of hoaxes, fallacies, and pseudoscience
- The Skeptic's Annotated Bible - indispensable for studying ancient holy books
- TechNyou's YouTube channel - short but informative videos about science and critical thinking