Just over a year ago, Wikipedia was at a crossroads, although the organization behind it didn’t yet know it. As an institution, the online encyclopedia was doing quite well: On January 15th, 2016, the Wikimedia Foundation, the non-profit that operates Wikipedia, celebrated the site’s 15th anniversary with a conference and party at its San Francisco headquarters. With more than 38 million articles in 289 languages, the website once regarded by academics as a morass of sort-of facts had become the world’s cultural repository. And with the launch of a 10-year, $100 million “permanent safekeeping” endowment announced at the anniversary, Wikipedia stands to hold that title for the foreseeable future.
But Wikipedia’s 15th birthday coincided with an inflection point for honesty in American politics. Some 2,600 miles away from the cake-cutting in San Francisco, Republican presidential hopefuls were spending the evening telling falsehoods and half-truths from a debate stage in Charleston, South Carolina. Here was former Pennsylvania Senator Rick Santorum claiming the Obama administration had cost the United States more than two million manufacturing jobs, and Texas Senator Ted Cruz claiming the president had authorized the arrival of Syrian refugees without “meaningful” background checks. That a presidential candidate should stretch the truth (or tell an outright lie) isn’t on its face surprising; after all, the biggest gaffe a politician can make is to tell the truth. But as the campaign wore on and Donald Trump, a candidate who spun more lies this cycle than any of his political opponents, began to absorb every molecule of political oxygen, that debate in South Carolina increasingly came to mark a strange contrast to the celebration of truth under way on the opposite coast.
In the months since Election Day, technology and media companies have been waging a war on fake news and conspiracy theories in their role as the digital infrastructure for content, from fact-checking partnerships between media outlets and Google meant to filter falsehoods from algorithmically neutral search engines, to handy Chrome extensions that parse the president’s tweets so you don’t have to. An October Pew Research Center survey found 81 percent of Americans saw Trump and Hillary Clinton disagreeing on “basic facts” of American life, and as perpetually mistrusted news organizations tried to fact-check on shoestring budgets, the responsibility for combating fake news in an increasingly fragmented knowledge ecosystem has fallen largely to Facebook. But the sprawling social network’s efforts to combat bad satire and conspiratorial kudzu may be for naught: While more and more Americans get their news from Facebook, only 18 percent say they believe or trust what they’re reading, according to a BuzzFeed/Ipsos poll.
Wikipedia’s role in beating back the post-truth age doesn’t rest with blacklisting certain sources as perpetually unreliable (as the website did with the Daily Mail) or preventing congressional staffers from meddling with their boss’ bios. An elder statesman of the content ecosystem in Internet years, Wikipedia has been combating misinformation by thoughtfully and purposefully iterating on strict guidelines of verifiability, which Wikipedians (active editors in the Wikipedia community) both refine and enforce transparently, in open channels. With newsroom budgets shrinking and traditional authoritative institutions increasingly treated with skepticism, Wikipedia’s ever-evolving verification structure has transformed its public credibility, so much so that a 2014 YouGov poll found that British users trusted the site more than the local news media. The “marketplace of ideas,” the classic justification for freedom of expression rooted in the writings of John Stuart Mill, exists in Wikipedia, and it’s working quite nicely to distill a multitude of facts into a singular truth.
“A lot of students were taught to look at Wikipedia as fake news before fake news existed, but a lot of students now trust Wikipedia more than ever,” says Jen Malkowski, a professor of film and media studies at Smith College. “Wikipedia used to not be a valid form of information because it was collective, but my students have learned a lot about what that collectiveness actually means in the context of the site in terms of trust and collaboration and verifying what’s fact and what’s not.”
But the foundation’s goal isn’t to develop a turnkey structural solution to the Internet’s fake news problem by scaling up its internal fact-checking infrastructure, a global research and copy desk with rigorous logic and rules (in contrast to Twitter’s messy truth engine). Instead, it’s to engender its unique institutional brand of media literacy in every young mind in the country.
That’s where the Wiki Education Foundation, the non-profit spun off from the larger Wikimedia Foundation in 2013 that serves as a bridge between Wikipedia and academia, comes in. While graduate students have frequently served as “Wikipedians-in-residence” at various cultural institutions (like the National Archives and Records Administration) to increase the flow of historical documents into the online encyclopedia, Wiki Edu invites college professors to have their students author Wikipedia articles on related coursework in lieu of a traditional research paper. (Malkowski was a Wiki Edu participant last year.) In the fall semester alone, some 6,300 students in 276 courses across the U.S. and Canada accounted for 10 percent of the content added to Wikipedia, totaling more than 4.25 million words.
“One of the challenges of Wikipedia content in general is that it’s all written by volunteers; the well-developed areas are the ones they’re most interested in: current events, pop culture, and so on,” says LiAnna Davis, director of programs at Wiki Edu. “There’s a lagging behind in terms of quality and development of content in academic subject areas. But academic content is hugely important, and it isn’t going away, from the text of legislation to the timeline of a particular moment in history, to the chemistry of what you’re cooking in the kitchen — for those facts, Wikipedia will always be there.”
But the goal of the program isn’t just to bolster Wikipedia’s contributor pool, although attrition among the site’s volunteer editor base is a frequent concern (despite a leveling off in recent years). Rather, it’s a crash course in the site’s rigorous research and fact-checking process and, as Davis puts it, “a proven way to improve digital literacy while simultaneously adding accurate content to the world’s largest fact-based encyclopedia.” And the injection of media literacy pedagogy into the American classroom couldn’t come at a better time: According to a Stanford University survey of 7,800 students from middle school to college, even those at the country’s top universities have trouble distinguishing fake news from real news.
“Within the context of these talks about ‘alternative facts,’ there’s a lot of focus on the problem, and not a lot of reporting and understanding of the solutions,” Davis says. “Wikipedia’s been dealing with fake news for 16 years, and we’ve developed very extensive policies for determining what is a reliable source or not. When students go through the process of contributing to the site, they figure out how to write neutral, fact-based entries, and they have to disclose all of their sources. The debate we’ve seen recently has only crystallized for us that the skills we teach through the program are vitally important.”
We generally think of the Internet not as a global commons but as a series of self-segregating (although not all-encompassing) epistemic bubbles with their own senses of right and wrong, maintained by unique networks and memes. But Wikipedia doesn’t create the same sort of political polarization we’ve come to expect from vast online communities: An October working paper from the National Bureau of Economic Research found that, over time, conversations among the encyclopedia’s active users became less segregated, and their contributions to articles more impartial. Contributors become “more neutral over time, not more extreme,” the authors wrote. “Remarkably, the largest such declines are found with contributors who interact with articles that have greater biases.”
“Generally, we believe that if you give people good information, they can make good decisions, and that comes with trusting the users,” says Wikimedia Foundation communications director Juliet Barbara. “Having open, transparent spaces for public discourse is really critical: Behind every Wikipedia article, there’s a newsroom in the form of a Talk channel, where editors are discussing current events and bringing different points of view to the table.”
This brand of collaboration is likely due to editors’ unique loyalty to Wikipedia itself. The American public may be more polarized than ever before, but the NBER research shows that Wikipedians have a tendency to place the integrity of the site above any personal ideological bias. Wikipedia is both medium and message for the editors who devote their time to the site’s maintenance. They are, in some ways, fighting to keep the peace while digital vigilantes and neo-Nazi trolls wage war in the flashier avenues of truth and misinformation, the Facebooks and Reddits whose raison d’être isn’t fact but devotion to feeling. “It’s a good thing Wikipedia works in practice,” says Barbara, “because in theory it’s a total disaster.”
That it works in practice may be the best tool for professors to imbue their students with the media literacy skills they so desperately need. Davis says that Wiki Edu expects 7,500 students in around 335 courses to contribute to the site this spring, and, according to preliminary surveys, 95 percent of participating students found writing a Wikipedia article extremely valuable in building the media literacy skills necessary to understand the changing digital landscape.
“Knowledge isn’t necessarily stable, and there have always been debates over what a fact is and isn’t,” says Malkowski, the Smith professor. “But we have, as a human civilization, always developed ways to vet information and feel confident about facts and data. For my students, that involved learning a lot about how to use Wikipedia: how to look at revision history, how to navigate the various Talk pages, and to see what kind of Talk pages emerge around a certain topic.”
Most students simply preferred the research and fact-checking behind an obscure Wikipedia entry on, say, the history of radio documentaries to the conventional research paper, if only because they actually felt like they were making a contribution to the global knowledge base.
“They really do a much better job on Wikipedia articles than they do on research papers where I’m the only audience,” Tamar Carroll, professor of history and program director of Digital Humanities and Social Sciences at Rochester Institute of Technology, says of students in her past Wiki Edu curricula. “They’re very proud, and they often do way more than they have to do. Students added characters to an entry way beyond the minimum, simply because they were really interested in the subjects. There’s an incredible level of engagement and quality of the work.”
“It’s worth noting, by the way, that a lot of academic research is behind paywalls, and most people don’t have free access to it,” Malkowski adds. “My students are really aware that they have a privilege in this kind of access to information.”
This doesn’t mean Wikipedia is the perfect magnifying glass and mood ring for the entire Internet: As The Economist observed on the site’s 15th birthday, Wikipedians “are predominantly male and English-speaking, and may carry a worldview that does not represent the collective, [and] it remains an ongoing challenge to recruit participants from different backgrounds and countries.” Even if one-term academic Wikipedians maintain their involvement in the site, there’s the risk that their teachings on media literacy may get lost in the shuffle of college life — or, even worse, notoriously unfriendly Wikipedia veterans might simply spurn outsiders. And there’s always the concern that consumers, swaddled in the comforting muck of their Facebook and Twitter feeds, won’t discover the truth nestled at the top of Google.
But for a generation of Americans distrustful of traditional public institutions, drowning under a constant daily flood of information and unsure of what the future holds, a lesson grounded in a relatively proven pedagogy might serve as a better vaccination against the lies than any algorithmic tweak to Facebook’s News Feed. Yes, Wikipedia’s noble army of volunteers will continue to keep the encyclopedia’s articles as close to objective reality as possible, but that’s just one front in the war on misinformation — to win the war, it’s up to the crucibles of American education to imbue voters with the media literacy skills they need to navigate the post-truth world.
“For my students at Smith, they’re now seeing a new political stake in their ability to vet information,” Malkowski says. “It’s more important to them than ever before.”