Meta rolls out AI ad-targeting tech in an effort to reduce discrimination

The company promised the system as part of a settlement.

Meta is acting on its vow to reduce ad discrimination through technology. The company is rolling out a Variance Reduction System (VRS) in the US that ensures the real audience for an ad more closely matches the eligible target audience — that is, it shouldn’t skew unfairly toward certain cultural groups. Once enough people have seen an ad, a machine learning system compares the aggregate demographics of viewers with those the marketers intended to reach. It then tweaks the ad’s auction value (that is, the likelihood you’ll see the ad) to display it more or less often to certain groups.
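The mechanism described above can be sketched as a simple feedback loop: compare the aggregate demographics of people who have already seen the ad with the demographics of the eligible audience, then nudge per-group auction multipliers toward under-served groups. This is a minimal illustration of the idea only; the function name, the update rule, and the learning rate are assumptions, not Meta's actual system.

```python
def adjust_auction_multipliers(eligible, viewers, learning_rate=0.5):
    """Return a per-group multiplier on the ad's auction value.

    eligible: {group: share of the eligible target audience} (sums to 1)
    viewers:  {group: share of actual viewers so far} (sums to 1)

    Groups under-represented among viewers get a multiplier > 1
    (shown more often); over-represented groups get < 1.
    NOTE: hypothetical sketch, not Meta's implementation.
    """
    multipliers = {}
    for group, target in eligible.items():
        observed = viewers.get(group, 0.0)
        gap = target - observed  # positive => group is under-served
        # Scale the correction by the relative gap; never go below zero.
        multipliers[group] = max(0.0, 1.0 + learning_rate * gap / max(target, 1e-9))
    return multipliers

# Example: the eligible audience is split evenly, but group B has seen
# the ad less often, so its auction value is boosted.
eligible = {"A": 0.5, "B": 0.5}
viewers = {"A": 0.7, "B": 0.3}
m = adjust_auction_multipliers(eligible, viewers)
# m["B"] > 1 > m["A"], so delivery shifts back toward group B
```

Run continuously over an ad's lifetime, as the article says VRS does, a loop like this would keep pulling the observed audience back toward the eligible one.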

VRS keeps working throughout an ad run. And yes, Meta is aware of the potential privacy issues. It stresses that the system can’t see an individual’s age, gender or estimated ethnicity. Differential privacy tech also introduces “noise” that prevents the AI from learning individual demographic info over time.
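The "noise" Meta mentions is the standard differential-privacy approach: perturb aggregate counts so no individual's demographic data can be inferred from them. A common instance is the Laplace mechanism, sketched below; the function name and parameters are illustrative assumptions, not a description of Meta's internals.

```python
import random

def noisy_count(true_count, epsilon=1.0, rng=random):
    """Add Laplace noise with scale 1/epsilon to an aggregate count.

    A Laplace(1/epsilon) variable can be drawn as the difference of two
    independent Exponential(epsilon) variables. Smaller epsilon means
    more noise and stronger privacy.
    NOTE: illustrative sketch of the Laplace mechanism, not Meta's code.
    """
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return true_count + noise

# Example: report a demographic viewer count without exposing exact numbers.
reported = noisy_count(1042, epsilon=0.5)
```

Because the noise is zero-mean, averages over many reports remain accurate, which is what lets the system steer aggregate delivery without ever learning any one viewer's age, gender, or estimated ethnicity.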

The anti-discrimination method will initially apply to the housing ads that prompted the settlement. Meta says VRS will extend to credit and employment ads in the US over the following year. Read More

#diversity

Discriminating Systems — Gender, Race, and Power in AI

There is a diversity crisis in the AI sector across gender and race.
The AI sector needs a profound shift in how it addresses the current diversity crisis.
The overwhelming focus on ‘women in tech’ is too narrow and likely to privilege white women over others.
Fixing the ‘pipeline’ won’t fix AI’s diversity problems.
The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation.

The diversity problem is not just about women. It’s about gender, race, and most fundamentally, about power. It affects how AI companies work, what products get built, who they are designed to serve, and who benefits from their development. Read More

#diversity