Avoid Discrimination When Using AI-Based Tenant Screening Services
HUD warns landlords about their liability risks.
Tenant screening has become a $1 billion industry. Like many landlords across America, you may look to third-party screening companies to gather and analyze key information about rental prospects and issue a report assessing how likely they are to pay rent and obey the key terms of their lease. But while enabling you to steer clear of problem tenants and make sounder rental decisions, relying on outside reports to decide whether to accept or reject applicants carries potential fair housing risks.
This is especially true when the screening company uses artificial intelligence and other machine learning technologies (which we’ll refer to collectively as “AI”) to vet applicants. Fair housing and privacy advocates have been warning that the opaque algorithms on which AI screening is based enable digital discrimination and amplify existing biases in an already unequal housing market. And now, the U.S. Department of Housing and Urban Development (HUD) has weighed in on the issue.
On May 2, 2024, the agency issued important new guidance (Guidance) warning that use of AI for tenant screening may lead to discriminatory rental decisions that expose not only the screening company but also the landlord to liability risk under fair housing laws.
While digital discrimination risk isn’t a new issue, the Guidance raises the stakes. With HUD looking over their shoulder, it’s become imperative for landlords that rely on AI-based tenant screening to be on the lookout for and guard against hidden biases and other digital dysfunctions that can taint their rental decisions.
This month’s lesson will help you tackle that challenge. First, we’ll explain how AI tenant screening works and how it can get you into fair housing hot water. After we point out the potential pitfalls in AI screening, we’ll list 12 best practices you can follow to sidestep them and minimize your liability risks for digital discrimination. To reinforce and test your understanding of the material, you can take the Coach’s Quiz at the end of the lesson, enabling you to apply the principles and analysis to real-life situations that are likely to arise if your tenant screening company uses AI.
Avoid Digital Discrimination in Advertising
On the same day that it published the Guidance, HUD issued accompanying guidance addressing digital discrimination in the context of housing advertising and marketing. For a lesson analyzing and explaining how to comply with that guidance, see Fair Housing Coach, “HUD to Landlords: Make Sure Your Digital Advertising Doesn’t Discriminate,” June 2024.
WHAT DOES THE LAW SAY?
The starting point is the federal Fair Housing Act (FHA), which bans discrimination in the sale, rental, and financing of a dwelling because of race, color, religion, sex, national origin, familial status, or disability.
Intentional vs. Unintentional Discrimination. Intentionally deciding not to lease to members of a protected group, such as families with young children, is an obvious violation. But most of the discrimination that occurs in the real world is indirect and more subtle. It occurs when a landlord does something that appears neutral and nondiscriminatory on its face but has the effect of excluding protected groups. Result: A well-intentioned landlord that bases rental decisions on the reports of a tenant screening company whose AI contains hidden biases against people of certain races, religions, or other protected classes could be liable for indirect discrimination.
Direct vs. Vicarious Liability. There are two bases for holding a landlord that engages in indirect digital discrimination liable. As the Guidance notes, under principles of direct liability, landlords are responsible for ensuring that their rental decisions comply with the FHA. This is true even if they outsource the task of screening applicants to a tenant screening company. While the screening company may also be responsible, landlords “retain authority over screening practices and decisions at their properties.”
Risk of vicarious liability also comes into play to the extent the tenant screening company is deemed the landlord’s agent. Explanation: As the Guidance explains, landlords are vicariously liable for discrimination committed by their agents while acting within the scope of their agency, regardless of whether the landlord actually knew or should have known of the agent’s discriminatory conduct.
Potential Problems with Using AI Technology to Screen. Fundamentally, AI-based tenant screening technology isn’t wired for fair housing compliance. One problem is that the technology wasn’t designed for, and hasn’t until recently been widely used for, making rental decisions. As a result, its findings may have little bearing on whether a rental applicant will actually comply with their lease.
The Guidance also cautions that the technology may use bad or incomplete data. Thus, in compiling eviction history, there’s a tendency to include all eviction court records regardless of case disposition, meaning an eviction filing may count against an applicant even if the case was decided in the tenant’s favor. The same tendency to include all records regardless of ultimate disposition also applies to criminal record data, including administrative citations, bench warrants, and traffic tickets, along with misdemeanors and felonies.
Potential to Reduce Rental Process Transparency. Transparency in the rental process is crucial to compliance. Misunderstandings, disputes, and lawsuits are much less likely to occur when you furnish applicants with clear information about your rental policies, processes, and standards. But AI screening technology can undermine the transparency of the process, the Guidance cautions. Screening companies “tend not to disclose” how the automated software they use to generate screening reports works, including the extent to which it relies on AI.
AI technology also tends to hide the precise reasons for a denial, a crucial piece of information in the fair housing compliance context. Some screening reports include a simple recommendation to accept or deny, or assign the applicant a numerical score or grade without an explanation. Some reports detail the records found, while others simply state whether the applicant “passed” or “failed” in various areas.
12 BEST PRACTICES TO PREVENT AI-BASED DISCRIMINATION
To sum up, you’re liable for the discriminatory housing decisions you make, even when you base those decisions on the findings of an outside tenant screening company. Using screening companies that rely on AI can distort your rental decisions and undermine the transparency of your rental process. The question then becomes what to do to manage those risks.
The answer, according to the Guidance, is to develop clear and transparent policies and practices to ensure that all denials reflect your own sound judgment, and to use only tenant screening services that will help you implement those policies. Here are 12 best practices HUD suggests you use to achieve those objectives.
Best Practice #1: Select the Right Tenant Screening Company
There are more than 100 companies offering tenant screening services in the U.S., and they’re not all the same. So, the first thing you need to do is select the right company.
Compliance Strategy: According to the Guidance, in selecting a tenant screening company, landlords “should inquire into the ways in which the company ensures its screenings are accurate and nondiscriminatory.” Specifically, select companies that:
- Allow you to customize the screening criteria (which we’ll discuss more in the next section);
- Frequently update their data;
- Monitor for unjustified discriminatory effects;
- Report clear and specific reasons for denials;
- Allow individuals to correct inaccuracies;
- Publicly disclose key details about their screening systems; and
- Comply with all applicable federal, state, and local laws.
Best Practice #2: Ensure Use of Proper Screening Criteria
The primary compliance imperative is to ensure that screening companies screen rental applicants only for information relevant to whether they’re likely to comply with their tenancy obligations. The unwillingness of screening companies to disclose how their technology works may make this hard to verify. Consequently, it’s generally inadvisable to purchase an “off-the-shelf” product.
Compliance Strategy: Look for a screening company that allows you to not only see but, if necessary, customize the screening criteria to the extent you believe that those criteria may disproportionately exclude applicants of certain races or other protected classes. We’ll talk more about overseeing screening companies below.
Best Practice #3: Disregard Irrelevant Criteria
The Guidance also sets out general principles to follow in weighing different screening criteria, including criteria to disregard.
Compliance Strategy: Key recommendations from the Guidance (a code sketch applying these rules follows the list):
- Don’t base denials on past actions unrelated to tenancy and past incidents unlikely to recur, such as eviction due to job loss or family or medical emergency;
- Waive criteria that may be okay for most applicants but are irrelevant to the individual circumstances of the particular applicant you’re screening, such as a minimum income requirement for applicants whose rent will be paid by somebody else;
- Be mindful that some records are more relevant than others—for example, give more weight to recent than older records;
- Afford no weight to records without a negative outcome, such as a record of an eviction proceeding where the tenant/applicant won; and
- Also disregard a court record that doesn’t provide enough information to determine who won, unless you get additional information about the outcome of the case.
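For landlords whose screening is automated, these relevance rules can be translated into a simple weighting step. Here’s a minimal Python sketch; the record fields, causes, and the seven-year taper are illustrative assumptions, not terms drawn from the Guidance:

```python
from datetime import date

def record_weight(record, today=None):
    """Weight a single screening record from 0.0 (disregard) to 1.0
    (full weight), following the relevance principles above.
    All field names and thresholds here are hypothetical."""
    today = today or date.today()

    # Afford no weight to records without a negative outcome, or where
    # the outcome can't be determined from the record itself (absent
    # additional information about how the case was resolved).
    if record["disposition"] in ("tenant_prevailed", "dismissed", "unclear"):
        return 0.0

    # Disregard past incidents unlikely to recur, such as an eviction
    # caused by job loss or a family or medical emergency.
    if record.get("cause") in ("job_loss", "medical_emergency", "family_emergency"):
        return 0.0

    # Give more weight to recent records than older ones: full weight
    # under 2 years, tapering to zero at 7 years (an assumed cutoff).
    age_years = (today - record["date"]).days / 365.25
    if age_years >= 7:
        return 0.0
    return 1.0 if age_years < 2 else (7 - age_years) / 5
```

A record that scores 0.0 under a scheme like this never reaches the decision stage, which is the practical effect HUD’s recommendations call for.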
Best Practice #4: Use Only Accurate Records
The Guidance cautions that datasets used for tenant screenings are often incomplete, lacking key personal identifiers, or infrequently updated. Automated systems might miscategorize records with missing or unclear information if they aren’t programmed to account for those scenarios.
Compliance Strategy: Ensure that screening records are accurate and use specific information in queries to avoid discriminatory screenings. Recognize that inaccuracies in screening records are a common cause of discriminatory rental decisions, especially inaccuracies that disproportionately affect members of certain demographic groups. For example, the problem of attributing records of people bearing the same or similar names to the wrong person is more common for last names that are prevalent among Latino, Asian, or Black individuals.
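One practical safeguard is a matching rule that refuses to attribute a public record to an applicant on name alone. The sketch below is illustrative; the fields and the one-corroborating-identifier threshold are assumptions, not a standard from the Guidance:

```python
def record_matches_applicant(record, applicant):
    """Attribute a public record to an applicant only when identifiers
    beyond the name agree, reducing same-name misattribution.
    All field names are hypothetical."""
    if record["full_name"].strip().lower() != applicant["full_name"].strip().lower():
        return False

    # A matching name alone is not enough: require at least one
    # corroborating identifier before the record counts.
    corroborating = 0
    if record.get("date_of_birth") and record["date_of_birth"] == applicant.get("date_of_birth"):
        corroborating += 1
    if record.get("prior_address") and record["prior_address"] in applicant.get("address_history", []):
        corroborating += 1
    return corroborating >= 1
```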
Best Practice #5: Be Careful About Basing Denials on Credit History
The Guidance cites three types of screenings that are particularly likely to result in indirect exclusion of groups the fair housing laws protect. The first is use of credit history, or the credit scores that national credit bureaus assign to consumers based on data indicating how likely they are to default on a loan. Most tenant screening companies incorporate this information into their own screening models. But credit scores don’t measure a consumer’s risk of not paying rent. In addition, use of credit scores creates potential fair housing problems because some protected groups have lacked access to equitable credit and homeownership opportunities.
Median Individual FICO Credit Scores by Ethnicity (as of August 2021)
| Group | Median FICO Credit Score |
| --- | --- |
| Black individuals | 627 |
| Hispanic individuals | 667 |
| Native American individuals | 612 |
| White individuals | 727 |
As the Guidance explains, Black and Brown persons are more likely to have inaccurate credit reports or experiences resulting in low or no credit scores. Black and Brown people are also disproportionately represented among those who are “credit invisible,” i.e., have minimal or no credit history.
Another example of a credit-invisible applicant is someone who recently immigrated to the U.S. This person may not have any history that suggests they’re a credit risk, let alone a rental risk. Their credit record simply lacks information regardless of their financial history in their country of origin. Use of credit scores might also have discriminatory effects against the disabled and victims of domestic violence, the vast majority of whom are women.
Compliance Strategy: Be aware of these discrimination risks and seek to de-emphasize use of credit scores for screening, especially when more relevant financial information is available. In general, HUD says landlords should avoid denials based on credit scores or history if:
- An applicant’s financial background is of little relevance, such as when the applicant has a cosigner who meets the landlord’s financial screening criteria;
- The negative credit history is due to an event that’s unlikely to recur, such as a family or medical emergency; or
- Minimal or poor credit history is due to domestic violence, dating violence, sexual assault, or stalking that’s not the victim’s fault and that doesn’t bear upon the likelihood of their paying rent on time in the future.
Landlords can implement this policy, HUD adds, by manually disregarding the applicant’s credit score when it’s not relevant or programming an automated screening model to do so.
Coach’s Tip: Having no or limited credit history is even less relevant than having poor credit history. Accordingly, the Guidance suggests that landlords adopt a policy of admitting applicants so long as they don’t have a negative credit history rather than requiring them to have a positive credit history.
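For automated screening, both ideas (disregarding credit when it’s irrelevant, and checking for the absence of negative history rather than the presence of a positive score) can be captured in a few lines. This is a sketch only; the applicant fields and the 620 default are invented for illustration:

```python
def credit_check(applicant, policy_min_score=620):
    """Return (passes, reason). All fields are hypothetical."""
    # Credit is of little relevance when the rent is secured by someone
    # who meets the financial criteria (cosigner, voucher, third party).
    if applicant.get("qualified_cosigner") or applicant.get("rent_paid_by_third_party"):
        return True, "credit disregarded: rent secured by qualified third party"

    # Don't penalize 'credit invisible' applicants: screen for the
    # absence of negative history, not for a positive score.
    if applicant.get("credit_score") is None:
        has_negatives = bool(applicant.get("collections") or applicant.get("defaults"))
        return not has_negatives, "no credit history; checked negative records only"

    # Otherwise compare the score to the written policy minimum.
    return applicant["credit_score"] >= policy_min_score, "score vs. policy minimum"
```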
Best Practice #6: Beware of Basing Denials on Eviction History
The second form of potentially discriminatory screening practice is rejecting applicants because they have a history of being evicted. This practice is so common that many tenant screening companies have built private databases from court records of eviction cases. The problem is that eviction disproportionately affects tenants who belong to protected classes. For example, over half of all eviction cases are filed against Black tenants even though fewer than one in five tenants are Black. Hispanic renters, women, families with children, and the disabled are also targeted for eviction at disproportionate rates.
In addition, the Guidance notes that court eviction records are highly unreliable, citing a large study in which 22 percent of the eviction records evaluated either contained ambiguous information on how the case was resolved or falsely represented a tenant’s eviction history.
Compliance Strategy: Be aware that the quality of eviction records in screening company databases varies and that overbroad screenings for eviction history may have an unjustified discriminatory effect. To counteract these risks, the Guidance says that landlords shouldn’t base denials on eviction records that are old, incomplete, irrelevant, or where a better measure of an applicant’s behavior is available. Specific recommendations (a code sketch applying several of them follows this list):
- Don’t use an eviction record if information about the record couldn’t be known before screening, unless you give applicants the chance afterwards to have the record corrected or disregarded;
- Don’t base a denial on eviction proceedings where the tenant prevailed, settlement was reached, or the matter was dropped;
- Disregard unjustified evictions, such as evictions against a tenant in retaliation for asserting their legal rights or because they were, through no fault of their own, the victim of domestic violence;
- Accord less weight to “no fault” evictions in jurisdictions where they’re allowed; and
- Be prepared to make accommodations to the screening policy if the eviction was related to the applicant’s disability—for example, an eviction for late payment of rent because of the timing of an SSI or SSDI payment or a medical emergency.
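Here’s what those recommendations might look like as an automated pre-filter. The dispositions, flags, and seven-year cutoff are assumptions for illustration, not specifics from the Guidance:

```python
from datetime import date

def eviction_record_weight(rec, today=None):
    """Weight an eviction court record from 0.0 (disregard) to 1.0
    (full weight). Field names are hypothetical."""
    today = today or date.today()

    # Tenant prevailed, case settled, or case dropped: never disqualifying.
    if rec["disposition"] in ("tenant_prevailed", "settled", "dropped"):
        return 0.0

    # Unjustified evictions (retaliation, or the applicant was a
    # domestic-violence victim through no fault of their own): disregard.
    if rec.get("retaliatory") or rec.get("dv_victim_no_fault"):
        return 0.0

    # Old records carry little relevance; 7 years is an assumed cutoff.
    if (today - rec["filed_date"]).days > 7 * 365:
        return 0.0

    # 'No fault' evictions, where jurisdictions allow them, get less weight.
    return 0.5 if rec.get("no_fault") else 1.0
```

Disability-related cases (such as a late payment caused by the timing of an SSI or SSDI check) still require individualized review and possible accommodation; they shouldn’t be left to an automated rule.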
Best Practice #7: Beware of Basing Denials on Criminal History
Studies show that individuals with disabilities and Black and Brown persons have historically been on the wrong end of the U.S. criminal justice system at disproportionate rates. Criminal record discrimination is a complex subject for which HUD has issued separate guidance. (See “Performing Criminal Records Checks on Rental Applicants Without Committing Discrimination,” Fair Housing Coach, February 2021). But since the issue is also relevant to AI screening, it’s covered in the Guidance.
Compliance Strategy: Criminal records screening is discriminatory only when it’s broader than it needs to be to accomplish a landlord’s security purpose and less discriminatory alternatives are available. You can avoid crossing the line into overbreadth by ensuring that the criminal records screening methods and algorithms your screening company uses, and the grades and recommendations it provides:
- Differentiate between offenses based on their nature, severity, and how long ago they occurred;
- Consider only records that result in a conviction rather than merely an arrest; and
- Give the applicant an opportunity to provide evidence of rehabilitation or other mitigating factors.
You may also have to waive your criminal records policies or make other reasonable accommodations when the applicant has a disability. For example, accommodation may be required if the disability renders the applicant unable or unlikely to commit a repeat offense, such as a record of assault by someone who has since developed a severe mobility impairment.
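To see how the bulleted criteria above might be enforced in an automated screen, consider this sketch. The offense categories and lookback windows are invented for illustration; HUD doesn’t prescribe specific numbers:

```python
from datetime import date

# Illustrative lookback windows by offense category (years).
SEVERITY_LOOKBACK_YEARS = {"felony_violent": 7, "felony_other": 5, "misdemeanor": 3}

def conviction_may_count(rec, today=None):
    """Whether a criminal record may factor into a denial at all.
    Mitigating evidence (rehabilitation, disability) still requires
    human review before any denial. Hypothetical fields throughout."""
    today = today or date.today()

    # Consider only convictions, never mere arrests, citations, or warrants.
    if rec["outcome"] != "conviction":
        return False

    # Differentiate offenses by nature and severity; categories outside
    # the policy (e.g., traffic tickets) are ignored entirely.
    lookback = SEVERITY_LOOKBACK_YEARS.get(rec["category"])
    if lookback is None:
        return False

    # Differentiate by recency: outside the window, the record is stale.
    return (today - rec["conviction_date"]).days <= lookback * 365.25
```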
Best Practice #8: Ensure Screening Company Sticks to Your Screening Policy
The screening company’s screening process should consider only the records listed in your stated screening policy. For example, the screening company shouldn’t screen for misdemeanors or civil violations if your policy is to screen for felony convictions.
Compliance Strategy: Verify that your screening company’s eviction, rental, credit, and criminal history screening standards and processes are in line with your own. Recognize that, as we noted above, disconnects can happen where records are inaccurately categorized. The Guidance suggests that landlords who use automated screenings consider not asking applicants any questions about their history, even if those questions are within the scope of their policies. “Such questions can confuse or discourage applicants while not giving the housing provider any information beyond that which they will learn from the automated screening,” according to HUD.
Best Practice #9: Verify Denial Recommendations Against Your Own Criteria
The potential for discrimination glitches in AI screening makes it risky to accept a screening company’s denial recommendation at face value.
Compliance Strategy: The Guidance recommends that landlords who receive a denial recommendation from a screening company make an independent determination of whether the information in the screening report is actually disqualifying under their own screening policies. If not, HUD says the landlord should accept the applicant despite the denial recommendation and consider contacting the screening company to adjust the grounds for denial recommendations going forward.
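A minimal sketch of that independent check follows; the shapes of the report and policy objects are assumptions, since screening companies’ actual report formats vary:

```python
def review_denial(report, policy):
    """Re-check a screening company's denial recommendation against the
    landlord's own written screening policy. Hypothetical data shapes."""
    disqualifying = [
        rec for rec in report["records"]
        if rec["type"] in policy["disqualifying_types"]
        and rec["age_years"] <= policy["lookback_years"]
    ]
    if not disqualifying:
        # Nothing in the report is disqualifying under our own policy:
        # accept the applicant and flag the mismatch to the vendor.
        return "accept", "denial recommendation unsupported by our policy"
    return "deny", disqualifying
```

The point isn’t the code itself but the workflow: the vendor’s recommendation is an input to your decision, never the decision.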
Best Practice #10: Be Transparent with Rental Applicants
The Guidance advises landlords to take steps to ensure transparency so that applicants know how they’ll be screened beforehand and why they were denied afterward. Providing this information early can also reduce the number of unqualified applicants, saving landlords and applicants time and expense.
Compliance Strategy: Put your screening policies in writing and make them public and readily available to potential applicants either in hard copy or via a link to your website. Your policies should contain enough detail for applicants to determine whether they’re likely to qualify, including what records you consider, which incidents are disqualifying, and how far back the screening goes. Also let applicants know how they can contest an inaccurate, incomplete, or irrelevant record; submit evidence of mitigating circumstances; and request reasonable accommodations for a disability.
Provide denial letters that contain as much detail as possible about the reasons for denial, including each specific standard the applicant failed to meet and how they fell short of it.
Bad: “You were denied because of your credit score.”
Good: “You were denied because we require a credit score of XXX and you have a credit score of YYY, according to ZZZ service.”
Attach screening reports and all records that you relied on to the denial letter and instruct applicants how to submit an appeal if a record is inaccurate, incomplete, or irrelevant; a mitigating circumstance exists; or a reasonable accommodation for a disability is needed.
Best Practice #11: Let Rental Applicants Challenge Negative Information
The Guidance says landlords should allow applicants to challenge any potentially disqualifying information.
Compliance Strategy: Ensure denied applicants get the actual opportunity to:
- Dispute the accuracy or completeness of any negative information—for example, by demonstrating that a record belongs to another person with a similar name or omits a court decision in their favor; and/or
- Show that they’ll comply with their lease obligations, even if a negative record is accurate.
One form of the latter is by providing evidence of “mitigating circumstances” indicating that any negative behavior is unlikely to recur—for example, the applicant’s successful completion of a rehab or financial literacy program, a positive reference from a social services provider, or a new job. Applicants may also choose to contest the relevance of a standard to their particular circumstance. Example: An applicant with a Housing Choice Voucher may challenge a denial based on minimum income requirements.
Best Practice #12: Ensure Screening Company Tests Its Model for FHA Compliance
It may be hard to ensure that you and your screening company are complying with the FHA when the latter relies on a complex model that lacks “interpretability.” After all, how can you justify automated denials that have a discriminatory effect without knowing the precise “reasoning” behind those decisions?
Compliance Strategy: If your tenant screening company uses AI, ensure that it programs complex models in accordance with best practices for nondiscriminatory model design and with attention to aspects likely to pose fair housing concerns. Ask if it trains the model on demographically representative data to help ensure that it doesn’t erroneously learn to screen out particular protected classes at higher rates. In addition, verify that it performs ongoing monitoring for these issues to ensure that changes over time, such as demographic shifts, don’t cause a dataset to become unrepresentative or incomplete.
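One widely used statistical check, which you can ask the screening company to run (or run yourself on testing data), is the adverse impact ratio, a rule of thumb borrowed from employment law rather than anything HUD mandates. A minimal sketch:

```python
def adverse_impact_ratios(outcomes):
    """outcomes maps group name to (accepted, total) counts from test data.
    Returns each group's acceptance rate relative to the highest-rate
    group; ratios below roughly 0.8 (the 'four-fifths' rule of thumb)
    warrant investigation for discriminatory effect."""
    rates = {group: accepted / total
             for group, (accepted, total) in outcomes.items() if total}
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Example with made-up numbers:
# adverse_impact_ratios({"group_a": (80, 100), "group_b": (55, 100)})
# -> {"group_a": 1.0, "group_b": 0.6875}  # group_b flagged for review
```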
Take The Quiz Now
July 2024 Coach's Quiz