When protesters marched down Pennsylvania Avenue two weeks ago, they had one major goal in mind: bringing attention to President Donald Trump’s climate policies. After the president called climate change a “hoax” on the campaign trail, and retracted Barack Obama-era climate change measures in his first few months in office, scientists and citizens alike have cried foul. So some celebrated his first 100 days as president by screaming at the White House and holding up signs bearing sayings like “Don’t be a fossil fool,” and “The seas are rising and so are we.”
Among those marching was science journalist Dave Levitan. Levitan wrote Not a Scientist: How Politicians Mistake, Misrepresent, and Utterly Mangle Science before our current president became president, but despite his absence on the page, it’s hard to read through the book without Trump’s administration close in mind.
In just five months, that administration has proposed substantial budget cuts to the Environmental Protection Agency, the National Institutes of Health, and NASA. (Its recently proposed budget for 2018 calls for a 31 percent decrease in EPA funding, an 18 percent decrease in NIH funding, and a 0.05 percent decrease in funding for NASA’s earth science projects.) Trump has chosen climate change denier Scott Pruitt to lead the EPA, an agency Pruitt has sued at least 14 times; and, ahead of his inauguration, Trump’s transition team requested (and was denied) the names of all Department of Energy employees who’d worked on climate change issues. He and his appointees have confirmed what many early critics of his political candidacy feared: not just a dismissal of, but also a hostility toward, scientific facts.
His administration may be raising the blood pressure of those worried about the spread of anti-science sentiment, but in Levitan’s book—which identifies 12 unique ways in which politicians attack and skew scientific findings—he tells us, Hey, disregard for hard evidence isn’t new! Jumping back and forth across more than a century of American history, Levitan points out common threads in the ways in which politicians deny scientific findings. The resulting book isn’t optimistic, but it is illuminating: By studying such a long history of political tricks, Levitan has created a handy guide for readers on how to separate the truth from “alternative facts.”
Pacific Standard talked with Levitan about the patterns that emerged in his research for the book, what he thinks concerned constituents should be on the lookout for, and how Americans can stay optimistic for a political future that is more respectful of science.
You finished this book before Trump was elected, and largely before anyone thought he was a valid candidate. After the first few months of his presidency, how has his administration compared to those previous in terms of misrepresenting science?
Trump himself has said very little on scientific subjects so far, but he has installed a cadre of officials who have already produced some remarkably anti-science sound bites. Pruitt’s claim that carbon dioxide is not the “primary” driver of climate change is probably chief among those. Though there have certainly been some unscientific administrations in the past—George W. Bush’s had a particularly dire record on many scientific issues, though not all—those officials and the apparent disdain for science that we can see in early budget proposals suggest this will be a particularly rough period for science.
How has technology, and the Internet specifically, changed the game?
I’d say there’s one positive and one negative here, and I’ll start with the negative: With such an incredible array of information available online, politicians can make use of some really dubious sources for scientific issues but then point to that source if called on it—this is “Blame the Blogger,” an obfuscation technique I describe in the book. On the flip side, the Internet has such a wealth of good information that it allows journalists and the general public to fact check any claims at incredible speed. Quickly debunking scientific nonsense could help slow its spread.
Climate change, abortion, and vaccination come up again and again throughout the book. What is it about these issues that makes the conversations surrounding them so often diluted by bad or false science?
Money, morality, and fear. Climate—and tobacco and acid rain before it, among others—lends itself to misinformation because there is a gigantic industry that benefits from that misinformation. Abortion is an issue where politicians seize on what many see as a moral dilemma to spread truly outlandish scientific claims, since some people are likely predisposed to believe them, and this functions as an election wedge issue. And on vaccines, it is entirely understandable why parents might be scared of things that might harm their kids, but it’s a bit harder to sniff out the “why” on that one—who really benefits when vaccination rates drop due to misplaced fears? It almost feels more like a contrarian view for the sake of contrarianism.
Are there other areas where constituents should be especially skeptical?
I would say genetically modified organisms is a good one to always look twice [at] before believing what a politician says. That one crosses party lines a bit differently than usual, and there has been a ton of wrong information thrown around.
Where do you think the anti-science sentiment comes from? Why is the spread of mangled science so successful?
This is a great question, and honestly I’m not sure I have a great answer. A simple reason is that science is complicated—it’s a lot easier to say “a fetus feels pain at 20 weeks” than it is to start a discussion on the intricacies of neuroanatomic development. Sound bites spread easily, the catchier the better, and catchy usually will [be], to some degree at least, wrong. I’m sure there are many other reasons why scientific errors manage to metastasize so easily, but that’s a start.
You’re adamant about the importance of politicians getting their science exactly right. What are some of the most egregious examples of misinformation causing real tragedy or disaster?
Well, I’m not sure I can pinpoint disasters that trace to simple errors like that, but there are plenty of examples of scientific obfuscation from politicians contributing to ugly or tragic outcomes. These have tended to be longer-term issues in the past—just think of the decades the government spent ignoring tobacco risk and how many people died of lung cancer over those years.
Today, though, it feels like we could be staring down some truly tragic issues: say, the news that the Trump administration wants to roll back programs aimed at reducing lead exposure in children, which we could pretty easily trace to statements about “costly” regulations with little benefit, and so on. Those types of claims ignore piles of scientific evidence, and could have an effect almost right away.
In outlining specific ways in which politicians misrepresent and misuse science, you point out “tells” that something about a claim is amiss. What are some of those markers?
Some of the best things to look for involve precision in one direction or another: “No warming in 17 years,” say, or “We don’t know how much humans contribute.” If you hear things like that, ask yourself, why 17, and not 16 or 18? And, who benefits if that lack of precision is highlighted? It’s also good to look out for words like “report” when the source isn’t mentioned, or just generally for definitive statements on topics you might have thought were a bit less concrete.
For people who maybe don’t have time to thoroughly and regularly investigate these claims, what can we do as constituents to combat misinformation?
I would say that maintaining a healthy degree of skepticism regarding political speech on science is a decent start. I understand not everyone is going to go seek out the research paper supporting some claim or other, but just having in your mind that things are rarely as simple as a politician claims might help combat the spread of that misinformation. Another good rule to always have in mind is simply to demand evidence: that just means that, before you believe a claim on science that matters to you, make sure whoever is making the claim has reasonable support for it.
It’s hard not to be disillusioned at this point, when it seems like both sides of a very split country are dead-set on staying within their own echo chambers. Are you optimistic about this country’s ability to fact check its leaders and educate each other?
Another good question that I don’t have a great answer for. I guess I would say that the situation does seem to be improving, with polling suggesting strong support for science, an increasing understanding of climate change’s threat to all of us, and so on. I think the bipartisan agreement that STEM education is important is promising, though whether that agreement results in actual policy support from both sides is far from a given.
But, to be honest, I don’t feel particularly optimistic about it right now. There seems to be such disdain for expertise, and a lack of understanding that there really is a hierarchy of trustworthy sources out there (see Representative Lamar Smith [R-Texas] saying that Science is not an “objective” journal, just for example), that I find it difficult to picture us all uniting behind some glorious banner of scientific rigor in the near future. Sorry to end on a downer, but when you cover this stuff, especially at this particular political moment, that’s sort of where you tend to live.
This interview has been edited for length and clarity.