Facebook has been promising to fix discrimination problems with its ads platform ever since a ProPublica report last fall found that its targeting tools had the potential to violate civil rights laws.
A year later, those efforts don't seem to have amounted to much.
A follow-up investigation from ProPublica on Tuesday found that Facebook still approved housing ads that excluded users based on their religion, gender, or "multicultural affinity" (Facebook's thinly veiled stand-in for race, previously called "ethnic affinity"). Each of those demographics is considered a protected group under the Fair Housing Act.
After ProPublica's report first broke last year, the social network attempted to ease the controversy with some bare-minimum changes to its platform and an internal probe a week later. The following February, it rolled out more tools meant to curb discrimination on the platform, including a machine learning system to detect when combinations of targeting categories might be discriminatory, along with warnings for businesses placing housing and employment ads.
Even so, the news site reported that each of the recent ads it took out to test Facebook's systems was approved within a few minutes. The only exception was an ad centered on users interested in Islam, which took close to half an hour.
Facebook blamed the problem on "a technical failure."
"This was a failure in our enforcement and we're disappointed that we fell short of our commitments," Facebook's VP of product management, Ami Vora, said in a statement. "We don't want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. Our systems continue to improve but we can do better."
The company also announced that it would begin expanding its warning system to all ads that are set up to exclude certain groups. The system previously applied only to housing, job, and credit ads.
Facebook also claims that its new software and additional human reviewers have managed to flag millions of discriminatory ads since they were deployed earlier this year.
The report follows another investigation by CNBC last week that found that Facebook was rife with posts advertising the illegal sale of prescription opioids and other dangerous drugs.
Reports like these, as well as the ongoing congressional investigation into the role Facebook played in a Kremlin-aligned campaign to influence the U.S. presidential election, have poked many holes in Facebook's claims that it strictly polices its advertising platform for nefarious actors.
This post has been updated to include a statement from Facebook.