Tech

Meta will change algorithm that allowed people to be racist while advertising housing

The government must agree to the changes.

Jacob Seitz

Meta agreed to change its advertising algorithm to be less discriminatory, according to a press release from the government.

The company told the Department of Justice (DOJ) it would change its algorithm as a part of a settlement agreement resolving allegations that Meta violated the Fair Housing Act (FHA). According to the DOJ, Meta’s housing advertising system illegally discriminated against Facebook users based on their “race, color, religion, sex, disability, familial status, and national origin.” 

The settlement resolved the DOJ’s first lawsuit challenging algorithmic discrimination under the Fair Housing Act. The suit alleged that Meta used algorithms that relied, in part, on characteristics protected under the FHA to determine which Facebook users received ads for housing.

Under the settlement, Meta agreed to “develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads,” but the new algorithm must be approved by the DOJ before it is implemented. If the DOJ determines the algorithm is insufficient to resolve its complaint, the settlement agreement will be terminated.

The suit hinged on three key aspects of Meta’s ad targeting and delivery system, alleging that the company “enabled and encouraged” advertisers to decide whether users were eligible to receive housing ads by relying on protected identity attributes. The suit said Facebook then developed a tool called “Lookalike Audience” that used machine learning to find Facebook users who shared similarities with those deemed eligible for housing ads, also using FHA-protected characteristics. Finally, the suit alleged that Meta used those characteristics to determine which subset of an advertiser’s targeted audience would receive the ads. 

Meta was also fined $115,000.

The agreement states that Facebook must stop using the “Lookalike Audience” tool (now called “Special Ad Audience”) by January 2023, and sets the same timeline for the company to develop a new algorithm to determine housing ad selection. 

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke. “The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

Meta is not the only tech company to come under fire for violating the Fair Housing Act. In April, real-estate giant Redfin agreed to pay a $4 million fine and implement a new internal monitoring system as a result of a lawsuit from the National Fair Housing Alliance. Zillow, another real-estate tech behemoth, has several times been accused of hosting listings that violate the FHA.
