Taming the U.S. Government’s Secrecy Machine

The plodding effort to bring a modicum of common sense to how the U.S. declassifies its documents has resisted nearly every attempt to rev it up for the digital age.

The Nixon Presidential Library plans in June to officially release the Pentagon Papers, the 7,000-page secret history of the Vietnam War that was famously first published, in part, by The New York Times in 1971. In the four decades since, the document has been the focus of innumerable newspaper articles, dozens of books — including complete versions of the text itself — and even a couple of movies.

But until now, the Pentagon Papers have never been formally declassified by the government originally responsible for writing them.

This odd fact underscores the convoluted and plodding nature of a U.S. secrecy system that classifies many documents that need no such protection and is far too slow to release many others that have long since become benign.

Barack Obama came into office pledging to reform this beast, although the process he set in motion has been moving about as slowly as the broken system it seeks to repair. Only 19 out of 41 agencies met Obama’s deadline this past December to review their classification policies and begin implementing reform. And as Steven Aftergood, director of the Federation of American Scientists’ Project on Government Secrecy, has observed, the president was never all that specific about exactly what he was looking for in a “fundamental transformation” of the system.

The problem — as exemplified in the official release of a document that’s been easily accessible for decades — clearly requires big ideas more than bureaucratic adjustments. And as part of the government’s efforts, a Public Interest Declassification Board has been considering some of the major architectural changes that would be required to fix the system and move it into the 21st century. The board oversees the government process and reports directly to the president (it also reported to the public at a forum today in Washington), although it has little power to compel individual agencies to adopt its ideas.

Jennifer Sims, director of intelligence studies at the Georgetown University Center for Peace and Security Studies, was appointed by Obama to the board. One of its main challenges, she said, is the sheer volume of the backlog — and the requirement that every agency with any interest in an individual document has the right to review it. A document that contains classified information about weapons of mass destruction originally obtained by a CIA agent, for example, must be reviewed by both the Department of Energy and the intelligence community.

“This multiagency review of a single document exponentially increases the amount of work involved and the number of eyes that have to review the document,” Sims said. “But that’s all just dealing with the paper documents. The real challenge is going to be handling the volume of digital material that’s being generated every day now.”

Think of how many emails flow in and out of your inbox every day — and multiply that by every government employee in every federal agency from the Defense Intelligence Agency to the National Security Agency.

The information age is yielding reams of official communication not even worth saving, let alone classifying. But for all the ways technology is complicating secrecy, it also holds promise for redesigning the entire system. Part of the board’s mandate is to visualize how that might work not just today but 20 years from now.

“That’s a big task,” Sims said. “That’s a DARPA-type task, a truly transformative task, which makes the work very, very difficult.”

So what would a federal classification system look like on the other end of a “fundamental transformation”? Sims offered her own vision, one of several held by members of the board.

“It would almost be an artificial-intelligence-like capability,” she said, “so that we would have no longer this 25-year rule. Documents would be constantly searched by a central automated processing capability that does contextual analysis of every document ingested into the system.”

Such software would crawl through the government’s digital record in real time, determining which documents warrant classification, which don’t and which need to be reviewed by a human eye. It could automatically redact sensitive material and tag classified documents with a targeted release date. Older paper records could be digitized and fed into the system. A user interaction component could allow policymakers to dip into the trove to examine documents or release them, as current events require.
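To make the idea concrete, here is a rough, purely illustrative sketch of what that automated triage step might look like; it reflects nothing the board has proposed or the government has built, and every cue list, threshold and function name is a hypothetical placeholder. It simply shows the three-way routing described above: classify and schedule a release date, release outright, or flag for human review, with crude redaction along the way.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional
import re

# Hypothetical cue weights standing in for real contextual analysis.
SENSITIVE_CUES = {"weapons of mass destruction": 0.9, "source identity": 0.8}
ROUTINE_CUES = {"cafeteria menu": -0.5, "parking assignments": -0.5}

@dataclass
class Decision:
    action: str                   # "classify", "release" or "human_review"
    text: str                     # possibly redacted copy of the document
    release_date: Optional[date]  # targeted declassification date, if any

def contextual_score(text: str) -> float:
    """Stand-in for contextual analysis: sum the weights of any cues present."""
    lowered = text.lower()
    cues = {**SENSITIVE_CUES, **ROUTINE_CUES}
    return sum(weight for cue, weight in cues.items() if cue in lowered)

def redact(text: str) -> str:
    """Black out sensitive cues that appear verbatim in the text."""
    for cue in SENSITIVE_CUES:
        text = re.sub(re.escape(cue), "[REDACTED]", text, flags=re.IGNORECASE)
    return text

def triage(text: str, today: date) -> Decision:
    """Route a document: classify with a release date, release, or defer to a human."""
    score = contextual_score(text)
    if score >= 0.8:   # clearly sensitive: classify and schedule a future release
        return Decision("classify", redact(text), today + timedelta(days=365 * 10))
    if score <= 0.0:   # clearly routine: release immediately
        return Decision("release", text, today)
    return Decision("human_review", text, None)  # ambiguous: send to a person

if __name__ == "__main__":
    memo = "Assessment of weapons of mass destruction reported by a field officer."
    print(triage(memo, date(2011, 6, 13)))
```

A system along the lines Sims describes would, of course, replace the keyword scoring with genuine contextual or machine-learning analysis, run continuously over the government’s digital record, and reserve human reviewers for the genuinely ambiguous cases.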

“It is a whole system that solves the front-end problem, which is sometimes overclassification of documents or underclassification of documents or inconsistent classification of documents,” Sims said, “and at the back end, also allows us to process much more rapidly in real time the declassification.”

The technology isn’t so far-fetched. But whether such a lumbering government system could realistically transform itself in this mold — that’s the other question.

“We want to think bold,” Sims said, “but at the same time not get so bold that we don’t grapple simultaneously with our current issues.”
