
Department of Justice Takes Steps Against Discrimination in AI-Based Tenant-Screening Practices


The U.S. Department of Justice recently took another step against racism in algorithm-based tenant-screening practices. 

In a public statement released on Jan. 9, the department announced that it had filed a Statement of Interest in Louis et al. v. SafeRent et al., an ongoing lawsuit over discrimination in the rental process. The statement explains how the Fair Housing Act applies to screening systems that use technology to select tenants. According to the department, the act protects people from housing discrimination on the basis of religion, race, sex and color.

“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities,” said Kristen Clarke, Assistant Attorney General of the Department’s Civil Rights Division, per the statement. “This filing demonstrates the Justice Department’s commitment to ensuring that the Fair Housing Act is appropriately applied in cases involving algorithms and tenant screening software.”

Filed in May of last year by tenants Mary Louis and Monica Douglas, Louis et al. v. SafeRent alleges that the two plaintiffs were denied rental housing because of a screening service known as SafeRent Solutions. Founded by entrepreneur Frank Yankovich, the algorithm-based service is designed to help landlords decide which tenants to accept by assigning each applicant a score.

The service combines information from previous rental history, damage to former properties, credit history and more into a single score the algorithm uses to rank prospective tenants.

After being denied rental housing, the two women filed a lawsuit alleging that SafeRent Solutions discriminates against Black and Hispanic applicants. Louis contends that she was denied housing because of her race: her score factored in debt unrelated to housing and did not account for a housing voucher she intended to use to cover 70% of her rent.

“We are bringing these claims against SafeRent because its actions interfere with our ability to stabilize low-income families in housing with their vouchers,” said the Executive Director of the agency involved in the plaintiffs’ lawsuit, David Gibbs, per a statement. “The tenant screening software makes it almost impossible for us to place families in many developments because these otherwise qualifying applicants often have non-tenant consumer debt.”

“This is especially frustrating because our clients always prioritize paying their rent to keep a roof over their heads,” he added. 
