In the midst of rising public concern about data leaks involving Facebook and foreign actors using the platform to sway the 2016 election, the social media giant announced this week it was launching a social-science research initiative.
In partnership with the non-profit Social Science Research Council, Facebook's social-science program will put out a call for university scientists to apply for grants to study the effects of social media on democracies and elections, potentially using proprietary Facebook data. The money will come from a politically diverse group of foundations, including funders known for backing conservative causes, liberal causes, and journalism.
"Our goals are to understand Facebook’s effect on upcoming elections—like Brazil, India, Mexico and the U.S. midterms—and to inform our future product and policy decisions," Facebook's official announcement reads. "By working with the academic community, we can help people better understand the broader impact of social media on democracy—as well as improve our work to protect the integrity of elections."
In a moment when Facebook is embroiled in controversy, the initiative tries to generate goodwill on many fronts at once. "They found a way to not hurt the company at the same time as they can do some incredible social good, at the same time as they might actually learn things to keep them out of trouble in the future," says Gary King, a political scientist at Harvard University who created the process by which Facebook will run the initiative, with steps intended to safeguard users' privacy and the independence of the science that comes out of it. For example, Facebook, along with the Social Science Research Council and funding foundations, will work together to appoint an expert commission that makes most of the initiative's decisions. The commission is supposed to be a buffer between the company, funders, and scientists, so that scientists don't feel pressure to please their patrons directly. "It's win-win," King says. "That's the theory, anyway. We'll see how it works."
The social-science ethicists Pacific Standard interviewed who were not involved in forming the initiative were optimistic about the prospect of studying Facebook's effect on politics. They saw merit in some of the safety and integrity measures King had come up with. Still, lots of questions remain—and both Facebook and the Social Science Research Council have so far been mum about details. For example, when asked questions such as how much money will be available for grants and how Facebook users will be notified if their profile data is used in a study, Social Science Research Council president Alondra Nelson says, "That's still to be worked out."
Outside ethicists Pacific Standard spoke with were particularly pleased that Facebook won't require scientists to share their findings before publishing. Leaving scientists free to publish without pre-approval helps ensure the research's independence from undue company influence; Facebook has never before allowed researchers to run studies using its data without pre-publication approval.
The initiative will be funded by seven non-profits: the Alfred P. Sloan Foundation, the Charles Koch Foundation, the Democracy Fund, the John S. and James L. Knight Foundation, the Laura and John Arnold Foundation, the Omidyar Network, and the William and Flora Hewlett Foundation. The Charles Koch Foundation is headed by a billionaire industrialist who, along with his brother, is one of the biggest political donors in America, supporting pro-business and libertarian causes and politicians. The Knight Foundation, meanwhile, is known for funding journalism projects. Facebook will not provide any money.*
The initiative's expert commission, with input from Facebook, will decide what questions they want scientists to answer using Facebook data. (That's to prevent Facebook from "only choosing the ones where they'll look good," King says.) The initiative will then post its call for grant applications. The Social Science Research Council will hire peer reviewers to decide which applications to fund. If this all sounds complex, it is. "The whole process was a little like negotiating the Arab-Israeli peace treaty, but with a lot more partners," King says.
But ethicists were pleased to see the provisions for peer review. "Those are really good signs that this is trying to uphold the highest standards of research," says Elizabeth Buchanan, who studies Internet research ethics at the University of Wisconsin–Stout.
Things are perhaps iffier on the data privacy side. Facebook will be responsible for anonymizing and packaging data sets to hand over to the scientists to study. Scientists who then use that data are supposed to follow their universities' institutional review boards' guidance. Institutional review boards are panels of experts and community members who are responsible for overseeing studies involving human volunteers and protecting the volunteers' rights and safety. The boards are usually associated with medical studies, but they exist for the social sciences too. Following some past failures, "these boards are [now] very particular. It's all about the safety of the participants' data," says Maria Lahman, a social scientist who co-chairs the University of Northern Colorado's research ethics board and wrote a textbook on the topic.
But even a good institutional review board will be embarking on uncharted waters with the Facebook social science initiative. The social network's billion-plus users simply represent far more people than perhaps any institutional review board has ever been responsible for.
"I've never supervised anything like this. Nobody has," Lahman says.
How will a board require a researcher to get informed consent, for example, from five million Facebook users in a single data set? "I think that's where the IRB piece breaks down," Buchanan says. "We're trying to force big data into a very small avenue of human subjects research protections."
In addition, although the rules governing institutional review boards are standardized, each board interprets those rules somewhat differently. Some hold that anonymized data, like what Facebook will likely provide scientists, doesn't require their protection at all. In that case, what's left is Facebook's own privacy policies, which have been at the center of the public's and lawmakers' concerns during Facebook Chief Executive Officer Mark Zuckerberg's testimony to Congress this week. In its announcement about the initiative, Facebook said it will work with outside experts to develop secure data sets, and will keep all the study data exclusively on its own servers.
Despite lingering caveats, social scientists seemed encouraged by the idea of having Facebook data available for them to study. Behavioral-science researchers used to work mostly with government data, but now much of people's lives takes place online, on private companies' platforms. That has left an important dimension of human interaction closed off to independent research—with grave consequences.
"It's clear we are being manipulated," Lahman says, "and so it's important that people that spend their whole career trying to work for the good of society get into the game so this can be leveled out and understood better."
*Update—April 13th, 2018: This post has been updated to reflect the fact that, while employees at the Hewlett Foundation have donated almost exclusively to Democrats in recent years, the foundation itself issues grants to areas such as the arts, education, and the environment; those funds do not go to partisan, political entities.