Facebook Says It Will Stop Allowing Some Advertisers to Exclude Users by Race

Facebook announced today that, as promised, it has launched an automated system that will prevent advertisers from using racial categories in ads for housing, employment and credit. The system will also warn advertisers to comply with the law when using other protected categories, such as age, sex or medical condition.

______________

Facing a wave of criticism for allowing advertisers to exclude anyone with an “affinity” for African-American, Asian-American or Hispanic people from seeing ads, Facebook said it would build an automated system that would let it better spot ads that discriminate illegally.

Federal law prohibits ads for housing, employment and credit that exclude people by race, gender and other factors.

Facebook said it would build an automated system to scan advertisements and determine whether they are for housing, employment or credit. Facebook will prohibit the use of its “ethnic affinity” categories for such ads.

Facebook said its new system should roll out within the next few months. “We are going to have to build a solution to do this. It is not going to happen overnight,” said Steve Satterfield, privacy and public policy manager at Facebook.

He said that Facebook would also update its advertising policies with “stronger, more specific prohibitions” against discriminatory ads for housing, credit and employment.

In October, ProPublica purchased an ad that targeted Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American or Hispanic people. When we showed the ad to a civil rights lawyer, he said it seemed like a blatant violation of the federal Fair Housing Act.

After ProPublica published an article about its ad purchase, Facebook was deluged with criticism. Four members of Congress wrote Facebook demanding that the company stop giving advertisers the option of excluding by ethnic group.

The federal agency that enforces the nation’s fair housing laws said it was “in discussions” with Facebook to address what it termed “serious concerns” about the social network’s advertising practices.

And a group of Facebook users filed a class-action lawsuit against Facebook, alleging that the company’s ad-targeting technology violates the Fair Housing Act and the Civil Rights Act of 1964.

Facebook’s Satterfield said that today’s changes are the result of “a lot of conversations with stakeholders.”

Facebook said the new system would not only scan the content of ads but also display pop-up notices alerting buyers when they attempt to purchase ads that might violate the law or Facebook’s ad policies.

“We’re glad to see Facebook recognizing the important civil rights protections for housing, credit and employment,” said Rachel Goodman, staff attorney with the racial justice program at the American Civil Liberties Union. “We hope other online advertising platforms will recognize that ads in these areas need to be treated differently.”

ProPublica is a Pulitzer Prize-winning investigative newsroom. This article is republished with permission under a Creative Commons license.