Weaponizing the scientific method

We’re experiencing an unprecedented level of lying in American politics right now, mostly courtesy of our “so-called” president. What is there to do about this? Right now it seems like no amount of calling out bullshit stops Trump from baldly lying, or his supporters from accepting the lies, or congressional Republicans from spinelessly hiding in a corner pretending it’s not happening. Trump is doubling down on his lies and lashing out at the media for exposing him, which threatens our free press and our ability to fight back against his wannabe-authoritarian regime. I’ve written before about how hopeless this makes everything seem, and about how we don’t have much of a choice except to fight back. But we are still scrambling to figure out the best way to do that.

As a scientist and someone who prefers to rely on logic to make decisions, I believe that one way to do this is by teaching people to get comfortable using the scientific method all the time. It’s not really that hard once you train yourself to think that way and apply it to all kinds of situations, even those that occur outside of the lab. So what is the scientific method? In a nutshell, it’s a set of principles used to conduct evidence-based inquiry. It’s founded on the idea that you can make an observation (“X phenomenon occurs”), extrapolate a hypothesis to explain the observation (“Y is acting on Z to cause X phenomenon”), and then run tests to find out whether your hypothesis is correct (“If I remove Y, does X phenomenon still occur?”). A good scientist will form multiple possible hypotheses, including null hypotheses (“Y is not acting on Z to cause X” or “Y and Z have nothing to do with X”), and set up tests that will generate observations to address each hypothesis. At the root of the scientific method is thinking of multiple possible scenarios and applying skepticism to all of them, as opposed to just accepting the first one you think of without questioning it further. Oftentimes you come up with a hypothesis that makes logical sense and is the simplest explanation for a phenomenon (i.e., it’s parsimonious), but when you investigate it further you find that it is not at all correct.
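
For the programmers out there, here’s what that logic looks like as a quick sketch in Python (the numbers are toy data I made up for illustration, not real measurements): compare how often phenomenon X occurs with factor Y present versus removed, then ask how often a difference that large would show up by chance if the null hypothesis (“Y has nothing to do with X”) were true.

```python
import random

random.seed(42)

# Toy observations (made up for illustration): did phenomenon X occur in each trial?
with_y    = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # X occurred in 8 of 10 trials with Y present
without_y = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]  # X occurred in 3 of 10 trials with Y removed

observed_diff = sum(with_y) / len(with_y) - sum(without_y) / len(without_y)

# Null hypothesis: Y has nothing to do with X, so the group labels are interchangeable.
# Shuffle the labels many times and count how often chance alone produces a
# difference at least as large as the one we observed (a permutation test).
pooled = with_y + without_y
n_shuffles = 10_000
at_least_as_extreme = 0
for _ in range(n_shuffles):
    random.shuffle(pooled)
    diff = sum(pooled[:10]) / 10 - sum(pooled[10:]) / 10
    if diff >= observed_diff:
        at_least_as_extreme += 1

p_value = at_least_as_extreme / n_shuffles
print(f"Observed difference: {observed_diff:.2f}, p-value under the null: {p_value:.3f}")
# A small p-value is evidence against the null hypothesis; a large one means the
# data are perfectly consistent with Y and X being unrelated.
```

The details of the test matter less than the habit it enforces: state the null hypothesis explicitly and ask what the evidence would have to look like before you abandon it.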

Let’s apply this to politics today. Here’s a basic example: Trump lost the popular vote by nearly 2.9 million votes, but he keeps claiming that this is due to massive voter fraud where millions of people voted illegally. You could just take his claim at face value and not question it. After all, he is the president, so he’s privy to lots of information the general public doesn’t have, and as the leader of the U.S. he should be acting with integrity, right? You could use this logic to form the hypothesis that he is correct and there is evidence of massive voter fraud. Or you could apply a bit of skepticism and formulate an alternative hypothesis: Trump’s claim is false, and there’s no evidence of the massive voter fraud that he cites. How to test this? You can look for the evidence that he claims exists just by Googling… and you won’t actually find any. What you will find are instances of his surrogates claiming that the voter fraud is an established fact, various reputable news agencies and fact checkers debunking the claim (even right-leaning Fox News admits there is no evidence for the claim), and a complete dearth of any official reports that massive voter fraud occurred (which I would link to, but I can’t link to something that doesn’t exist). In support of the claim you will find right-wing conspiracy websites like InfoWars that don’t cite any actual evidence. So based on that inquiry, you could conclude that Trump’s claim is false. It didn’t take much skepticism or effort to address the question, just enough to ask “Can I easily find any solid evidence of this?”

One problem with this strategy: we aren’t doing a great job of teaching people how to think rationally and critically. On a large scale, we would do this by refocusing our education standards around critical thinking (part of what Common Core and the Next Generation Science Standards aim to do, albeit with mixed results in how they actually teach the scientific method and critical thought). There has been a lot of pushback to this, particularly in conservative areas. Why? One explanation could be general distrust of government and antipathy toward regulations and standards imposed by the federal government. Another could be over-reliance on religion, which fundamentally demands faith in the absence of evidence. I’m not going to wade fully into that murky debate here, except to say that not all religion is bad, and plenty of religions do teach their followers to think critically and analyze claims. But some religious groups don’t teach these principles, relying instead on encouraging their followers to simply believe what they’re told by their pastor (or whatever religious leader), or face moral doom. This dangerously reinforces the idea that it’s okay to blindly follow certain authority figures without question.

It extends past the church doors, to the school teacher slut-shaming high school girls instead of providing them with comprehensive sex education, to the public official who expresses skepticism about climate change despite an abundance of evidence that it’s happening and is caused by humans. People believe what these authority figures say. We’re seeing it now: more than half of Republicans accept Trump’s claim that he really won the popular vote, with the percentage higher among Republicans with less education. One of the main reasons Trump supporters give for voting for him is his tough stance on immigration, and they seem happy with his controversial travel ban, even though the Department of Homeland Security recently found that people from the countries targeted by the ban pose no extraordinary threat compared to people from other Muslim-majority countries (Trump rejected this report, even though it was ordered by the White House). Rather than putting in a tiny bit of effort to look for evidence that those claims are true, or asking whether their news sources are biased, they blindly trust what Trump says. This underscores the importance of giving people the opportunity to learn and practice critical thought.

Fixing the education system to provide more training in critical thought and use of the scientific method is absolutely necessary in the long term, but right now we need a strategy for dealing with people who don’t have that training. So if you have conservative friends or family and you’re brave enough to talk politics with them, ask them why they believe things to be true. If they cite something as evidence that isn’t rigorous, ask them why they trust that source. I think it’s possible to do this without talking down to them; most people are probably capable of applying logic to their thinking even if they haven’t been trained to do so. Instead of trying to argue that someone is wrong and backing it up by saying “here’s a fact to support my argument and you should believe it because it’s true” (even if that’s accurate), ask them why they think your fact isn’t true, and respectfully lead them to the evidence that backs up your argument. Perhaps it won’t work all the time, but if the seed of skepticism can be planted in at least some people, they may be more careful with their voting decisions in the future. If you rely on logic and rational skepticism to make your decisions, you have an obligation to help other people do the same. It’s worth a try.


Featured image: Barbara Lee’s town hall, February 18th, Oakland.
