Final Settlement Reached in Class Action Lawsuit Over AI Discrimination
In the spring of 2021, Mary Louis, a Black woman, was filled with joy at the prospect of moving into a new apartment in Massachusetts. However, her excitement quickly turned to disappointment when she received an email indicating that a “third-party service” had denied her tenancy. This service employed an algorithm designed to evaluate rental applicants, prompting Louis to spearhead a class action lawsuit alleging that this algorithm was discriminatory based on both race and income.

On Wednesday, a federal judge approved a settlement in this landmark case, under which the company behind the algorithm agreed to pay over $2.2 million and amend certain aspects of its screening tools that were alleged to be discriminatory. Importantly, the settlement does not include any admission of wrongdoing by SafeRent Solutions, which maintains that it complies with all relevant laws and said it settled because litigation is costly and time-consuming.

While this type of lawsuit may be relatively novel, the utilization of algorithms and artificial intelligence (AI) systems for screening or scoring individuals is not new. For many years, such technologies have quietly influenced significant decisions affecting the lives of citizens, whether they are applying for jobs, home loans, or medical services. Unfortunately, these algorithms remain largely unregulated, despite evidence of their potential to discriminate.

Todd Kaplan, one of Louis’ attorneys, remarked that management companies and landlords should now be aware that the systems they assume to be trustworthy are subject to scrutiny and potential legal challenge. The lawsuit contended that SafeRent’s algorithm failed to consider the advantages of housing vouchers, an important factor in determining a renter’s ability to afford monthly payments, thus unfairly disadvantaging low-income applicants who qualify for such assistance.

Additionally, the suit charged that SafeRent’s model placed excessive weight on credit history. Critics argued this approach does not accurately reflect an applicant’s ability to reliably pay rent and unjustly penalizes Black and Hispanic applicants who often have lower median credit scores due to historical disparities.

Speaking on behalf of the plaintiffs, attorney Christine Webber pointed out that even if an algorithm isn’t explicitly coded to discriminate, the data it relies on or the way it weighs that data can lead to outcomes that are just as harmful as intentional discrimination.

When Louis learned of her application’s denial, she attempted to appeal the decision, providing references from two landlords to demonstrate her reliable payment history over 16 years, despite having a less-than-stellar credit profile. Meanwhile, she was under pressure as she had already notified her previous landlord of her move and was responsible for her granddaughter’s care.

However, the management company responded that they could not entertain appeals, as they had to adhere strictly to their tenant screening results. This left Louis feeling disheartened; she expressed that the algorithm did not account for her personal story. “Everything is based on numbers. You don’t get the individual empathy from them,” she lamented, fearing that the system was insurmountable.

While legislatures have shown little appetite for aggressive regulation of AI technologies, lawsuits like Louis’ are paving the way for accountability in this realm. SafeRent’s defense argued that it should not be held liable for discrimination because it did not make the final decision on renting; it merely provided scores for landlords’ consideration.

Louis’ legal team countered that SafeRent played a direct role in controlling access to housing, and the U.S. Department of Justice weighed in to support the plaintiffs. The judge ultimately denied SafeRent’s motion to dismiss.

Under the settlement agreement, SafeRent is prohibited from including a score in tenant screening reports for applicants using housing vouchers. Moreover, if SafeRent develops a new screening score, it must be validated by an independent third party approved by the plaintiffs.

Louis’ son eventually found her a new apartment through Facebook Marketplace, although it was $200 more expensive and located in an area she liked less. Nevertheless, she maintains a resilient outlook: “I’m not optimistic that I’m going to catch a break, but I have to keep on keeping, that’s it. I have too many people who rely on me.”