A 2020 investigation by The Markup and ProPublica found that tenant-screening algorithms are prone to errors such as mistaken identity, especially for people of color with common last names. A ProPublica assessment last year of algorithms made by the Texas-based company RealPage suggested they can drive up rents.
A second case against SafeRent under the Fair Housing Act concluded in federal court in Connecticut in November and awaits a judge’s decision. It was brought by Carmen Arroyo and others, who say the company’s CrimSAFE algorithm deemed a shoplifting charge that was later dropped “disqualifying,” leading to the denial of a housing application for her disabled son, who is unable to speak or walk. The case alleges the system discriminated on the basis of disability, national origin, and race.
In response to the brief filed by the DOJ and HUD, Andrew Soukup, an attorney for SafeRent, said the company aims to supply property managers and landlords with predictions to help them make good decisions but does not itself make housing decisions. “SafeRent does not decide whether to approve anyone’s application for housing. Those decisions are made by property managers and landlords,” he said in a statement.
The Department of Justice’s intervention in the SafeRent case is part of a recent push by the US government to enforce civil rights law against algorithms that make important decisions about people’s lives. On the same day, the department announced the terms of a settlement agreement with Meta for selling ads that allegedly violated the Fair Housing Act. The company has developed a system to reduce discrimination in Facebook ads and will remain under federal supervision until 2026.
“Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws,” Clarke, the head of the Department of Justice’s civil rights division, said in a statement. Last year she worked with the Equal Employment Opportunity Commission to issue guidance for businesses that use hiring algorithms on how to avoid violating the Americans with Disabilities Act.
Together, those interventions suggest the DOJ is determined to enforce federal antidiscrimination law to protect people’s rights in the era of algorithms. “Obviously, advertising is different than tenant screening, but it puts these different industries on notice that they can’t hide behind a lack of transparency anymore and that there is going to be greater accountability,” said Gilman, the University of Baltimore law professor. She has represented low-income clients for 25 years, and in the past few years has encountered more cases in which she suspects an algorithm working in the background denied a client housing. But whether existing antidiscrimination law is adequate to protect against harmful algorithms, or whether new legislation is needed, remains unresolved.
The signal the Department of Justice sent to the housing sector this week is in line with the Biden administration’s other pronouncements on the role AI can play in human rights abuses. Last year, the White House proposed a Blueprint for an AI Bill of Rights, a set of principles intended to protect citizens from algorithms in critical areas of their lives such as housing, health care, finance, and government benefits. The Trump administration, by contrast, had attempted to make it more difficult to prosecute, under the Fair Housing Act, landlords who use tenant-screening algorithms.