RENT Magazine Q3 '24

THE FAIR HOUSING ACT AND INDUSTRY TOOLS - HOW DOES AI IMPACT THIS?

To better understand these new notices, it is helpful to briefly review what the Fair Housing Act has to say about advertising and resident screening. The most important objective is for a property to avoid any policy or practice that effectively discriminates against individuals on the basis of their protected category(ies). This means ensuring the language or imagery used to advertise doesn't show a preference for, or alienate, individuals on the basis of race, color, religion, sex, national origin, disability, or familial status. (Depending on where your property is located, state and local protected categories would also apply.) When it comes to screening a prospective resident, this means that decisions based on credit, eviction, and criminal history also need to be handled carefully to avoid a fair housing violation.

So, how does the use of AI or algorithmic programs potentially conflict with this objective? HUD's new notices address this, stating that any property utilizing these tools is now at risk of violating the Fair Housing Act if their use results in discriminatory outcomes. Since HUD has vastly expanded the scope of what may constitute a discriminatory outcome, the takeaway is that if your property has been utilizing such tools, you are now at risk of liability and should take proactive steps to ensure you are in compliance.

But with so many different tools available, how do you know where to start to identify potential violations and prevent them? The notices listed a few common tools used in the property management industry. Let's review the risks, identify some ways to limit liability, and build a plan toward consistently supervising and revisiting the tools.

