Meta settles claims that ads violated fair housing laws
Meta Platforms Inc. will change its ad delivery system to address concerns that it violates the Fair Housing Act by discriminating against users, as part of a settlement with a federal regulator.
The accord resolves a lawsuit by the U.S. Department of Housing and Urban Development alleging that the algorithms used in Meta’s advertising systems allowed marketers to violate fair housing laws by limiting or blocking certain groups of people from seeing housing ads on the service.
“Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” Manhattan U.S. Attorney Damian Williams said in a statement.
Meta said Tuesday that it has built machine learning technology to ensure that the people an ad reaches reflect the overall potential audience for that ad, not just a subset of that group.
In a blog post, Meta wrote that it will “work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” The company will also pay a penalty of $115,054, just over the six-figure mark.
Meta said it will use the new technology for employment and credit ads as well as for housing.
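In rough terms, the goal Meta describes amounts to comparing the demographic mix of the people an ad actually reached with the mix of everyone eligible to see it, and shrinking any gap. The sketch below is purely illustrative of that comparison, assuming simple group labels and a total-variation-style distance; the function names and the metric are hypothetical and do not represent Meta’s actual system.

```python
from collections import Counter

def demographic_mix(audience):
    """Return the share of an audience falling into each demographic group."""
    counts = Counter(audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_variance(delivered, eligible):
    """Total variation distance between the delivered audience's mix and the
    eligible audience's mix (0 = identical shares, 1 = completely disjoint)."""
    d_mix = demographic_mix(delivered)
    e_mix = demographic_mix(eligible)
    groups = set(d_mix) | set(e_mix)
    return 0.5 * sum(abs(d_mix.get(g, 0.0) - e_mix.get(g, 0.0)) for g in groups)

# Hypothetical example: the eligible audience is split 50/50 across two
# estimated groups, but delivery has skewed 80/20 -- the kind of gap the
# described system would work to reduce.
eligible = ["A"] * 500 + ["B"] * 500
delivered = ["A"] * 80 + ["B"] * 20
print(delivery_variance(delivered, eligible))  # 0.3
```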
Meta’s ad targeting capabilities have come under fire in recent years. In some cases, the company’s highly specific targeting options may have enabled marketers to exclude certain groups from seeing ads for things like housing. In other cases, Meta’s targeting options were tied to a person’s protected characteristics, such as race or religion.
In the HUD complaint, the U.S. alleged Meta’s algorithm allowed advertisers to find users who share similarities with groups of other individuals.
Meta hopes to get the new system up and running by the end of the year, said Roy Austin, the company’s vice president of civil rights. Austin added that Meta will also seek feedback on these changes from civil rights groups in the coming months. Many civil rights groups have been critical of the company’s use of personal data for targeting and how it can lead to discrimination.
The case is U.S. v. Meta Platforms Inc., 22-cv-5187, U.S. District Court, Southern District of New York.