A critical mass of everyday users is the best defense against political attacks on climate information.
By Denice W. Ross
Our data is in danger. Since the election, hundreds of scientists, librarians, academics, and technologists have formed a digital bucket brigade, archiving climate and environmental websites and data sets that might get scrubbed by the Trump administration. This isn’t an exaggeration: Facts that don’t support the administration’s views are already in the crosshairs. Bills were recently introduced in Congress that would prohibit the Department of Housing and Urban Development from maintaining its database on community racial disparities. Content about the connection between greenhouse gas emissions and climate change has already begun disappearing from the Environmental Protection Agency website. And the United States Department of Agriculture recently pulled down reports on animal welfare.
Removing inconvenient data is not a new political play. Scientists in Canada are still recovering from the purging of climate data under the Harper government. Closer to home, this October, Kansas Governor Sam Brownback canceled a quarterly economic report after its findings showed that the state's economy had not improved as he'd promised.
Still, there’s a palpable sense of urgency, even emergency, to the latest data defense efforts, dubbed DataRefuge. As in most crises, the first responders are largely specialists who can rapidly assemble and get the job done. They’ve organized more than a dozen data rescue events at universities and technology companies with ample bandwidth and collaboration space, stretching across the country, from coastal hubs like Los Angeles, Philadelphia, and New York to cities like Austin, Indianapolis, and Ann Arbor.
Climate data sets, and, in fact, most federal data, have always been vulnerable because their most immediate value is to a narrow band of experts. Sources of data relevant to climate change and the environment are spread across many agencies — including obvious ones like the National Oceanic and Atmospheric Administration, NASA, and the EPA, as well as less obvious ones like the statistical agencies: the Energy Information Administration, the Bureau of Labor Statistics, and the Census Bureau, which researchers fear may also be at risk. Pulling a data set out of commission in one agency can unravel data products in other agencies. For example, if NOAA’s Vegetation Health Index is pulled, the U.S. Drought Monitor will be missing a crucial input, reducing the effectiveness of programs at the USDA and the Internal Revenue Service that assist farmers affected by drought.
So how do we save our data? Simple: Use it.
The last decade of progress toward open government has shown that a critical mass of everyday users is the best defense against attacks on public data. Much of this material has become so familiar that we don’t even think of it as data: the public data sets that tell you when the next bus is coming, whether you’ll need to dose up on allergy meds, or whether it’s too hot outside for the Little League game. Those figures will be safe, and they should continue to flow even when our federal, state, and local elected leaders change, because so many people rely on them.
In contrast, the future of most climate and environmental data sets will remain much murkier, at least until a wide range of Americans use the data and see how it is already improving their lives. For instance, the Census Bureau’s American Community Survey, with neighborhood-level data on age, poverty, and disability, gives local authorities and nonprofits information on where to set up cooling stations during deadly heat waves. Bureau of Labor Statistics data on occupations helps community colleges meet market demand for a workforce trained in wind energy. And NOAA data on projected sea level rise helps small business owners rebuilding after coastal flooding make informed, proactive decisions about how high to elevate their buildings.
Local governments are uniquely positioned to harness the power of federal climate data, and to put front and center just how much we need it. In post-Katrina New Orleans, for instance, public use of data lit the path to recovery, and now the city is upping its game, implementing an urban resilience strategy with data at its core. And Philadelphia is taking a data-driven approach to reducing carbon emissions and fostering an inclusive, green economy and healthy neighborhoods. Relationships like these, grounded in data, are crucial to building and bettering our cities.
Communities rely on timely federal data to see where they’ve been, understand where they are now, and decide where to go next. Federal data sets cover the entire country, and in many cases the entire world — a scale that local communities require in order to understand how they are doing relative to other places. More than ever, federal data sets are an essential piece of infrastructure for communities. Local governments can — and must — be the bridge that connects federal data to the people. Our world may very well depend on it.
This story originally appeared in New America’s digital magazine, New America Weekly, a Pacific Standard partner site.