New study suggests gender bias in Facebook’s ad delivery system
A study by researchers at the University of Southern California found that Facebook’s advertising system discriminates against women, showing them different ads than men and excluding women from seeing some ads.

“Facebook’s ad delivery may result in a skew of job ad delivery by gender beyond what can be legally justified by possible differences in qualifications,” the researchers wrote in their report, “thus strengthening the previously raised arguments that Facebook’s ad delivery algorithms may violate anti-discrimination laws.”

The research team bought ads on Facebook for delivery driver job listings that had similar qualification requirements but were for different companies. The ads did not specify any demographic targeting. One advertised Domino’s pizza delivery jobs, the other Instacart delivery jobs. According to the researchers, Instacart’s existing driver pool skews female, while Domino’s skews male. The study found that Facebook showed the Instacart delivery job to more women and the Domino’s delivery job to more men.

The researchers ran a similar experiment on LinkedIn, where they found that the platform’s algorithm showed the Domino’s ad to about as many women as the Instacart ad.

Two more pairs of similar job postings the researchers tested on Facebook showed the same pattern: an ad for a software engineer at Nvidia and a listing for a car salesperson were shown to more men, while a listing for a Netflix software engineer and an ad for a jewelry salesperson were shown to more women. It is unclear whether this means the algorithm had picked up on the current demographics of each job when optimizing ad delivery. Facebook says little about how its ad delivery system works.

“Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report,” Facebook spokesman Tom Channick said in an email to The Verge. “We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today. We’re continuing to work closely with the civil rights community, regulators, and academics on these important matters.”

This isn’t the first time research has found that Facebook’s ad targeting system discriminates against some users. A 2016 ProPublica investigation found that Facebook’s “ethnic affinity” tool could be used to prevent Black or Hispanic users from seeing certain ads. If those ads were for housing or job opportunities, that targeting could violate federal law. Facebook responded that it would strengthen its anti-discrimination efforts, but a second ProPublica report in 2017 found the same problems persisted.

And in 2019, the U.S. Department of Housing and Urban Development sued Facebook for housing discrimination after determining there were reasonable grounds to believe Facebook had served ads in violation of the Fair Housing Act.

HUD said in its complaint that Facebook’s targeting tools were reminiscent of redlining practices: they let advertisers exclude men or women from seeing certain ads, and a map tool could “exclude people who live in a specified area from seeing an ad by drawing a red line around that area,” according to the complaint. Facebook settled the lawsuit and said in 2019 that it would drop those ad targeting options for housing and job listings.