
Class action lawsuit on AI-related discrimination reaches final settlement

Nov 21, 2024 - financialpost.com
A class action lawsuit alleging AI-related discrimination has reached a final settlement, with the company behind the algorithm, SafeRent Solutions, agreeing to pay over $2.2 million and to modify parts of its screening products. The lawsuit was led by Mary Louis, a Black woman who was denied tenancy by a third-party service that used an algorithm to score rental applicants. The lawsuit claimed the algorithm discriminated on the basis of race and income, as it did not consider housing vouchers and relied heavily on credit information.

The settlement does not include any admission of fault by SafeRent Solutions, which maintains that its scores comply with all applicable laws. The case is one of the first of its kind and could set a precedent for AI accountability. Under the settlement, SafeRent cannot include its score feature on its tenant screening reports in certain cases and must have any new screening score validated by a third party agreed upon by the plaintiffs.

Key takeaways:

  • A class action lawsuit alleging AI-related discrimination in tenant screening has reached a final settlement, with the company behind the algorithm agreeing to pay over $2.2 million and roll back certain parts of its screening products.
  • The lawsuit alleged that the algorithm, developed by SafeRent Solutions, discriminated on the basis of race and income, and did not take into account the benefits of housing vouchers.
  • The settlement stipulates that SafeRent can't include its score feature on its tenant screening reports in certain cases, including if the applicant is using a housing voucher.
  • This case is one of the first of its kind and could lay the groundwork for AI accountability in the future.