Facebook says it will build a system to prevent advertisers from buying credit, housing, or employment ads that exclude viewers by race.
By Julia Angwin
The Facebook sign and logo are seen in Menlo Park, California, on November 4th, 2016. (Photo: Josh Edelson/AFP/Getty Images)
Facing a wave of criticism for allowing advertisers to exclude anyone with an “affinity” for African-American, Asian-American, or Hispanic people from seeing ads, Facebook said it would build an automated system that would let it better spot ads that discriminate illegally.
Federal law prohibits ads for housing, employment, and credit that exclude people by race, gender, and other factors.
Facebook said it would build an automated system to scan advertisements and determine whether they offer services in these categories. Facebook will prohibit the use of its “ethnic affinity” targeting for such ads.
Facebook said its new system should roll out within the next few months. “We are going to have to build a solution to do this. It is not going to happen overnight,” said Steve Satterfield, privacy and public policy manager at Facebook.
He said that Facebook would also update its advertising policies with “stronger, more specific prohibitions” against discriminatory ads for housing, credit, and employment.
In October, ProPublica purchased an ad that targeted Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American, or Hispanic people. When we showed the ad to a civil rights lawyer, he said it seemed like a blatant violation of the federal Fair Housing Act.
After ProPublica published an article about its ad purchase, Facebook was deluged with criticism. Four members of Congress wrote Facebook demanding that the company stop giving advertisers the option of excluding by ethnic group.
The federal agency that enforces the nation’s fair housing laws said it was “in discussions” with Facebook to address what it termed “serious concerns” about the social network’s advertising practices.
And a group of Facebook users filed a class-action lawsuit against Facebook, alleging that the company’s ad-targeting technology violates the Fair Housing Act and the Civil Rights Act of 1964.
Facebook’s Satterfield said that the new changes are the result of “a lot of conversations with stakeholders.”
Facebook said the new system would not only scan the content of ads but also display pop-up notices alerting buyers when they attempt to purchase ads that might violate the law or Facebook’s ad policies.
“We’re glad to see Facebook recognizing the important civil rights protections for housing, credit, and employment,” said Rachel Goodman, staff attorney with the racial justice program at the American Civil Liberties Union. “We hope other online advertising platforms will recognize that ads in these areas need to be treated differently.”