Facebook’s Promise to Moderate Hate Speech Falls Short in Private Groups

Dozens of hateful posts in a secret Facebook group for current and former Border Patrol agents raise questions about how well, if at all, the company is policing disturbing postings and comments made outside of public view.

Facebook says its standards apply just as much in private groups as public posts, prohibiting most slurs and threats based on national origin, sex, race, and immigration status.

But dozens of hateful posts in a secret Facebook group for current and former Border Patrol agents raise questions about how well, if at all, the company is policing disturbing postings and comments made outside of public view.

Many of the posts ProPublica obtained from the 9,500-member “I’m 10-15” group (10-15 is Border Patrol code for “alien in custody”) include violent or dehumanizing speech that appears to violate Facebook’s standards. For example, a thread of comments before a visit to a troubled Border Patrol facility in Texas by Democratic Representatives Alexandria Ocasio-Cortez of New York and Veronica Escobar of Texas included “fuck the hoes” and “No mames [fist].” Another post encouraged Border Patrol agents to respond to the lawmakers’ visit by hurling a “burrito at these bitches.” And yet another mocked a video of a migrant man trying to carry a child through a rushing river in a plastic bag. A commenter joked, “At least it’s already in a trash bag”—all probable violations of the rules.

Facebook, citing an open federal investigation into the group’s activities, declined to answer questions about whether any posts in the 10-15 group violated its terms of service or had been removed, or whether the company had begun scrutinizing the group’s postings since ProPublica’s story was published. It also refused to say whether it had previously flagged posts by group members or had received complaints.

Facebook’s only response, emailed by a spokeswoman who refused to let ProPublica use her name, was: “We want everyone using Facebook to feel safe. Our Community Standards apply across Facebook, including in secret Groups. We’re cooperating with federal authorities in their investigation.”

Since April, the company has been calling community groups “the center of Facebook.” It has put new emphasis on group activity in the newsfeed and has encouraged companies, communities, and news organizations to shift resources into private messaging. These forums can give members a protected space to discuss painful topics like domestic violence, or to share a passion for cookbooks. Groups can be either private, which means they can be found in search results, or secret, which means they are hidden unless you have an invitation.

This is part of an intentional “pivot toward privacy.” In a March blog post, Facebook Chief Executive Officer Mark Zuckerberg wrote, “Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.”

But this pivot also fosters hidden forums where people can share offensive, potentially inflammatory viewpoints. “Secret” groups such as 10-15 are completely hidden from non-members. Would-be participants need an invitation to even find the landing page, and administrators of the groups have full jurisdiction to remove a person’s access at any time.

When such groups operate out of sight, like 10-15, the public has a more limited view into how people are using, or misusing, the platform. In a secret group, only members can flag or report content that might be in violation of Facebook’s policies. The administrators of the group can set stricter policies for members’ internal conversations. They cannot, however, relax broader Facebook standards: groups may not be used to support terrorist organizations, hate groups, murderers, or criminals, to sell drugs, or to attack individuals.

Civil rights groups say they have been noticing and raising the issue of hateful posts in hidden forums for years—with limited response from Facebook.

Henry Fernandez, senior fellow at the Center for American Progress, a liberal think tank, and a member of Change the Terms, a coalition of civil rights groups pushing for better content moderation on Facebook, said the platform keeps creating features “without vetting them for their implications for use by hate groups or, in this case, Border Patrol agents acting in hateful ways.”

Posts in hidden groups have incited incidents of violence in the real world, most famously against Rohingya Muslims in Myanmar and at the 2017 white supremacist march in Charlottesville, Virginia. The military launched an investigation of a secret Facebook group in 2017 after Marines shared naked pictures of female service members. Facebook has acknowledged the problem and has made some efforts to address it with new initiatives, such as a proposed independent review board and consultations with a group of 90 organizations, most focusing on civil rights.

ProPublica’s Border Patrol story came out the day after Facebook released an audit of civil rights issues on the platform. Recommendations included strengthening hate speech policies around national origin, enforcing a stricter ban on the promotion of white supremacy, and removing an exemption that had allowed humorous posts that contained offensive content.

Facebook did not say whether it will make all of the recommended changes. But in a blog post, Chief Operating Officer Sheryl Sandberg wrote, “We will continue listening to feedback from the civil rights community and address the important issues they’ve raised so Facebook can better protect and promote the civil rights of everyone who uses our services.”

Jessica Gonzalez, vice president of strategy and senior counsel at Free Press and co-founder of Change the Terms, said that, even after the back-and-forth with auditors, she was not surprised that the hateful posts in 10-15 were not flagged.

“What Facebook released on Sunday is an improvement,” she said, “but I think Facebook has engaged in this all along in an appeasement strategy. They’ll do what they need to do to get the bad publicity off [their] backs.”

The civil rights audit also called for better transparency about civil rights issues on Facebook’s advertising portal, which became a priority for the company after multiple ProPublica investigations and lawsuits by civil rights groups.

Bhaskar Chakravorti, dean of global business at Tufts University’s Fletcher School, said the new emphasis on privacy is part of Facebook’s attempt to keep users on the platform while reassuring investors.

“So to the extent that Facebook provides shelter to groups of all kinds—whether they are people who are sharing hateful messages or messages for the good of the world—it benefits their business model.”

This post originally appeared on ProPublica as “Civil Rights Groups Have Been Warning Facebook About Hate Speech In Secret Groups For Years” and is republished here under a Creative Commons license.
