The case highlights growing concern over the use of unregulated AI systems in high-stakes decisions such as job applications, home loans, and medical care. SafeRent argued it should not be held liable for discrimination because it did not make the final tenancy decisions, but the judge ruled that the company could still be held accountable because its algorithm played a role in determining access to housing. The settlement requires SafeRent to stop using its score feature in certain cases and mandates third-party validation of any new screening score it develops.
Key takeaways:
- Mary Louis, a Black woman, was denied tenancy based on a score from a third-party screening algorithm, leading to a class action lawsuit alleging discrimination based on race and income.
- SafeRent Solutions, the company behind the algorithm, agreed to a settlement of over $2.2 million and to roll back the parts of its screening products alleged to be discriminatory.
- The lawsuit alleged that SafeRent's algorithm failed to account for the financial stability that housing vouchers provide, discriminating against low-income applicants, and that it relied too heavily on credit history, disproportionately harming Black and Hispanic applicants who use vouchers.
- Despite the settlement, AI systems used to screen or score Americans remain largely unregulated, and lawsuits like Louis's are beginning to establish a foundation for AI accountability.