HUD to Landlords: Make Sure Your Digital Ads Don’t Discriminate

Follow our five best practices when using AI-based digital ad platforms.

Savvy use of digital media for marketing purposes can be a game changer for landlords. But it can also get them into fair housing trouble. The same artificial intelligence (AI) and machine learning algorithmic technologies that empower you to streamline and target your marketing can also be used, whether deliberately or inadvertently, to exclude groups the fair housing laws protect.

The U.S. Department of Housing and Urban Development (HUD) has long recognized and warned against this. In 2019, it acted on its suspicions by bringing a discriminatory advertising lawsuit against Facebook and its corporate owner, Meta. After settling that case in 2023, HUD is now doubling down.

On May 2, 2024, HUD issued important new guidance making it clear that it’s keeping an eye out and intends to hold housing providers accountable for digital discrimination. With HUD looking over their shoulder, it’s become imperative for landlords who rely on AI platforms to ensure that their marketing and advertising practices and messages meet fair housing standards without having discriminatory effects on any protected groups.

That isn’t as simple as it sounds. Digital discrimination can be subtle and difficult to detect, especially for landlords who aren’t well versed in how AI and algorithmic advertising platforms (which, for simplicity’s sake, we’ll refer to collectively as “AI” platforms) work.

This month’s lesson will help you tackle the challenge of keeping your digital marketing compliant. First, we’ll explain the fair housing advertising laws and how using digital platforms for marketing can run afoul of them. Then, we’ll outline three kinds of AI tools that can get you into trouble when applied to housing. Finally, we’ll lay out a set of five best practices you can follow to minimize your liability risks for discriminatory digital advertising. After the lesson, we’ll give you a quiz so you can apply what you’ve learned to real-life situations.

WHAT DOES THE LAW SAY?

The federal Fair Housing Act (FHA) bans discrimination in the sale, rental, and financing of a dwelling because of race, color, religion, sex, national origin, familial status, or disability. Among other things, you’re not allowed “[t]o make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” In addition to intentional discrimination, the ban applies to practices that aren’t deliberately discriminatory but have an “unjustified discriminatory effect.”

Potential liability for discriminatory advertising is very broad in scope. The new HUD guidance says that an “advertiser” at risk includes not just the operator of an ad platform—that is, a website, mobile application, or other channel—but also “entities or individuals placing advertisements” for rental housing on the platform. That would include housing owners, landlords, and management companies.

How digital marketing can lead to fair housing liability. The fundamental problem is that AI platforms may be wired to do things that violate these rules, including targeting advertising for housing toward some groups and away from others that the discrimination laws protect. While this can be done deliberately, it may also happen as a result of automated systems that are designed not for fair housing compliance but to make ad delivery more efficient in meeting an advertiser’s objectives.

Example: Because of the way it’s programmed, a system may conclude that men are more likely than women to click on certain housing ads. Accordingly, it may direct those ads only to people it believes to be men in violation of fair housing laws banning discrimination on the basis of sex.
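
To see how this can happen under the hood, consider the deliberately simplified Python sketch below. It illustrates the general logic only; it isn’t any real platform’s code, and all of the names and numbers are made up:

  # Hypothetical illustration: a naive delivery rule that optimizes
  # click-through rate (CTR) and, as a side effect, targets by sex.
  historical_clicks = {
      # inferred sex -> (clicks, impressions) on past housing ads
      "male":   (120, 1000),  # 12% CTR
      "female": (60, 1000),   #  6% CTR
  }

  def choose_audience(click_stats):
      """Deliver only to the group with the highest observed CTR."""
      ctr = {group: clicks / imps for group, (clicks, imps) in click_stats.items()}
      return max(ctr, key=ctr.get)

  # The "efficient" choice is also a fair housing violation: the ad is
  # shown only to people the system believes to be men.
  print(choose_audience(historical_clicks))  # -> "male"

Notice that nothing in that logic mentions discrimination. The disparate treatment falls out of the optimization objective itself, which is precisely the risk HUD is describing.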

The HUD guidance lists other ways that ad targeting may constitute illegal discrimination against protected groups, including by:

  • Denying them information about available housing opportunities;
  • Discouraging or deterring them from applying;
  • Quoting them different and less favorable prices or rental conditions;
  • Steering them to particular neighborhoods or buildings; and
  • Targeting them for predatory products or services.  

As HUD acknowledges, the kind of fair housing transgressions that AI platforms commit can happen “without the advertiser’s direction or knowledge, and can even frustrate an advertiser’s intention that an ad be distributed more broadly.” Even so, as advertisers, landlords are legally responsible for these violations. In its parallel guidance on tenant screening services, issued the same day, HUD notes that “housing providers are responsible for ensuring their rental decisions comply with the FHA even if they have largely outsourced the task of screening applicants to a tenant screening company.” The same principles apply to marketing and advertising.

3 KINDS OF DIGITAL ADVERTISING TECHNOLOGIES TO BEWARE OF

The HUD guidance discusses the discrimination risks associated with different tools that digital platforms typically use to enable advertisers to target the audience for their ads, as well as the systems they use to deliver them.

1. Audience Categorization Tools

Audience categorization tools allow for the segmentation and selection of potential audiences for an ad by gender, age, income, location, interests, activities, connections, and other categories. As the guidance explains, categorization tools come in different forms, such as drop-down menus, toggle buttons, search boxes, or maps. For example, the ad placement interface may display a toggle button prompting the advertiser to select “men” or “women” as the potential audience. Such tools may allow for both inclusion and exclusion of categories.
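
As a rough mental model, an audience categorization tool boils down to inclusion and exclusion filters. The Python sketch below uses hypothetical field names (no actual platform’s interface looks exactly like this) to show how a single line of targeting configuration can exclude a protected group, and how a basic guard could flag it:

  # Hypothetical targeting spec: each key is a segmentation category.
  ad_targeting = {
      "location":  {"include": ["Metro City"], "exclude": []},
      "interests": {"include": ["apartment hunting"], "exclude": []},
      # One line like this is all it takes to exclude a protected group:
      "familial_status": {"include": [], "exclude": ["households with children"]},
  }

  # Illustrative list of FHA-protected characteristics (not legal advice).
  PROTECTED_CATEGORIES = {
      "race", "color", "religion", "sex", "national_origin",
      "familial_status", "disability",
  }

  def flag_risky_targeting(spec):
      """Return the targeting categories that segment on protected traits."""
      return sorted(set(spec) & PROTECTED_CATEGORIES)

  print(flag_risky_targeting(ad_targeting))  # -> ['familial_status']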

The guidance warns that using tools to segment and select audiences for ads on the basis of protected characteristics would violate fair housing laws by limiting protected groups’ access to information about housing opportunities. Such would be the case, for example, where families with children or people with service animals are excluded from the target audience or the ad targets predominantly white neighborhoods. Landlords would be liable even if the exclusion is based on “purportedly ‘benign’ purposes—such as an advertiser’s belief that a particular property is inappropriate for children because it has potentially hazardous features like stairs or pools,” the guidance cautions.

Using categorization tools to show different content to different groups on the basis of protected characteristics could also “result in steering, pricing discrimination, or other discriminatory outcomes.” Framing an ad one way for one group and a different way for another is common when advertising non-housing products and services like movies, but it becomes problematic in housing advertising to the extent it provides “meaningfully different information about housing opportunities” on the basis of race, gender, religion, or other protected characteristics.

Example: A landlord runs separate versions of a similar ad in predominantly Black and predominantly white neighborhoods. The difference is that the version running in Black neighborhoods includes prominent content stating “no criminal records!” and “good credit only!” The discrepancies between the versions might be seen as a means of encouraging or discouraging applicants on the basis of race.

DEEP DIVE

The Mechanics of Digital Discrimination

The HUD guidance explains how digital ad platforms generate the personal data they need to categorize people by race, gender, or other personal characteristics. In some cases, consumers self-identify and disclose this personal information when signing up for a product, making a purchase, or signing into their browser. In others, ad platforms infer personal characteristics from available data about a consumer’s purchase and browsing history, activities, movements, and the people with whom the consumer interacts. HUD cites the example of a platform inferring that a consumer is female based on the person’s past purchases of women’s clothing.
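
HUD’s clothing-purchase example can be reduced to a toy inference rule. The sketch below is purely illustrative (real platforms use statistical models trained on thousands of signals), but it shows how a protected characteristic gets attached to a consumer who never disclosed it:

  # Hypothetical illustration of attribute inference from purchase history.
  purchase_history = ["women's clothing", "groceries", "women's shoes"]

  def infer_sex(purchases):
      """Crude inference: count gendered purchase categories."""
      female_signals = sum(1 for p in purchases if p.startswith("women's"))
      male_signals = sum(1 for p in purchases if p.startswith("men's"))
      if female_signals > male_signals:
          return "female"
      if male_signals > female_signals:
          return "male"
      return "unknown"

  # The platform never asked the consumer's sex; it inferred it, and the
  # inferred label can then drive ad targeting and delivery.
  print(infer_sex(purchase_history))  # -> "female"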

2. Custom & Mirror Audience Tools

The HUD guidance also cautions landlords to be careful when using platforms that enable advertisers to deliver ads only to “custom” or “mirror” audiences.

Custom audience tools allow landlords and other advertisers to deliver ads only to people included in their customer databases. Landlords typically obtain these lists from a data broker or other source and then upload them to the ad platform. Some custom audience tools instead identify consumers who’ve taken a specific action tracked by an advertiser or ad platform, such as visiting a particular website or making a particular purchase.

Mirror audience tools enable advertisers to find consumers who are similar to or mirror consumers on a customized list, called the “source audience.” 

Using custom audience tools to advertise housing may violate the FHA when the uploaded list is compiled on the basis of protected characteristics. Similarly, using mirror audience tools becomes problematic when mirroring is designed “to introduce, replicate or enhance” discriminatory exclusions or limitations in the source audience.

Example: A landlord of a community with all white tenants posts an ad and furnishes a source list of its current tenants to an ad platform with an audience mirroring application. The platform then generates an expanded list of other people who are similar to the source list based on their online behavior. Result: The mirror list includes only white people, thus denying non-white consumers access to information about the housing opportunity.
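
Here’s a stripped-down sketch of how mirroring replicates that skew. The similarity measure and data are invented for illustration; real lookalike systems model vast numbers of behavioral signals, but the principle is the same:

  # Hypothetical mirror-audience expansion: select the consumers most
  # similar to the source list based on behavioral features.
  # Feature vector: (luxury purchases, suburban activity, income proxy)
  source_audience = [(0.9, 0.8, 0.9), (0.6, 0.8, 0.5)]
  candidates = {
      "consumer_a": (0.9, 0.8, 0.9),  # behaviorally similar to source
      "consumer_b": (0.2, 0.1, 0.3),  # behaviorally different
  }

  def similarity(u, v):
      """Inverse of Euclidean distance (a stand-in for a real model)."""
      dist = sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
      return 1 / (1 + dist)

  def mirror_audience(source, pool, threshold=0.7):
      """Keep candidates whose best match to the source exceeds a cutoff."""
      return [name for name, vec in pool.items()
              if max(similarity(vec, s) for s in source) >= threshold]

  # Behavioral features correlate with race through housing and income
  # patterns, so the expanded list mirrors the source list's makeup.
  print(mirror_audience(source_audience, candidates))  # -> ['consumer_a']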

3. Algorithmic Delivery Functions

In addition to audience selection, using AI to advertise housing creates potential fair housing problems stemming from how the advertising content is actually delivered. Explanation: There are more digital ads that could reach a particular end user than there is space to show them. Ad platforms rely on sophisticated algorithms to determine which of the eligible ads to actually deliver, passing along the ads they believe are most likely to achieve the advertiser’s objective. The problem is that these algorithmic delivery functions may deliver ads based on a consumer’s race, sex, religion, or other protected characteristics.

Example: Aware of the discrimination risks involved, a landlord wants to advertise evenly across a multi-racial metropolitan area. But instead of delivering the ad to every consumer in the selected area, the ad platform delivery system filters down the recipients to those it determines will be most likely to engage with the ad based on their income, profession, and past interaction with rental housing ads. Result: In spite of the landlord’s intentions, the ad gets delivered only to white consumers in the city’s wealthiest neighborhoods.
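
A simplified sketch of that filtering step (hypothetical data and scoring, not any platform’s actual model) shows how a facially neutral “engagement” rule can reproduce racial skew through income and location:

  # Hypothetical delivery filter: the advertiser targeted the whole metro,
  # but the platform delivers only to "high engagement" consumers.
  recipients = [
      {"id": 1, "zip": "10001", "income": 150_000, "past_rental_clicks": 9},
      {"id": 2, "zip": "10002", "income": 45_000,  "past_rental_clicks": 2},
      {"id": 3, "zip": "10001", "income": 120_000, "past_rental_clicks": 7},
  ]

  def engagement_score(person):
      """Made-up scoring rule standing in for a trained delivery model."""
      return person["income"] / 100_000 + person["past_rental_clicks"] / 10

  # Only top-scoring consumers actually see the ad. Where income and ZIP
  # code track race, the delivered audience ends up skewed even though
  # the advertiser selected the entire metro area.
  delivered = [p["id"] for p in recipients if engagement_score(p) > 1.5]
  print(delivered)  # -> [1, 3]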

The guidance notes that discrimination issues may also arise where ad platforms charge different prices to show ads on the basis of protected characteristics, such as charging more to deliver ads to women because of their purchasing history. A landlord that thinks it’s just opting for lower rates may then inadvertently end up directing its ads to men and away from women.

5 BEST PRACTICES TO PREVENT DIGITAL DISCRIMINATION

While they may be perfectly suitable for advertising most kinds of goods and services, digital and AI technologies can lead to discriminatory results when they’re used to advertise housing. This is true even when landlords act with good intentions and the discrimination is totally accidental.

So, where do we go from here?

To start, it needs to be stressed that nobody is suggesting that landlords totally avoid using digital platforms to advertise their properties. The point of the HUD guidance is simply to ensure that landlords are aware of their responsibility for the content of their advertising and aware of the glitches that can cause digital ads to violate the FHA.

The good news is that as long as you understand the risks, there are things you can do to manage them. The name of the game is to ensure that whatever digital advertising technology you use isn’t wired to limit or exclude groups on the basis of protected characteristics. Here are five best practices you can follow to achieve that objective and keep your digital ads compliant. 

Best Practice #1: Select the Right Digital Platform

It’s not you, it’s the technology. More precisely, it’s how AI and machine learning systems are wired that can cause problems when applied to the advertisement of housing. And because those systems come from the platform, it’s imperative to select platform providers that you can trust. According to HUD, “before using an ad platform, [landlords] should ensure that they obtain necessary information and disclosures from the ad platform regarding how the platform mitigates risks” of liability for discriminatory advertising.

Specifically, select a platform that runs ads for housing in a separate and specialized interface that’s designed to guard against discrimination in audience selection and ad delivery and offers audience targeting options for housing ads that don’t directly describe, relate to, or serve as proxies for FHA-protected characteristics. Also verify that the ad platform provider:

  • Performs regular end-to-end testing of its systems to ensure effectiveness in detecting discriminatory outcomes—for example, by running multiple ads for equivalent housing opportunities at the same time and comparing the demographics of the delivery audiences (a simplified version of this check appears after this list);
  • Proactively identifies and adopts less discriminatory alternatives for AI models and algorithmic systems;
  • Takes steps to ensure that algorithms are similarly predictive across protected groups and makes necessary adjustments to correct for disparities;
  • Ensures that ad delivery systems aren’t generating differential charges on the basis of protected characteristics or charging more to deliver ads to a nondiscriminatory audience; and
  • Documents, retains, or publicly releases in-depth information about ad targeting functions and internal auditing.
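
For reference, here’s a simplified Python illustration of the end-to-end test described in the first bullet above, with hypothetical delivery logs standing in for the platform’s real data:

  # Simplified end-to-end delivery audit: run equivalent ads at the same
  # time, then compare the demographic makeup of each delivery audience.
  from collections import Counter

  delivery_logs = {
      "ad_A": ["white", "white", "white", "black", "white", "white"],
      "ad_B": ["white", "black", "black", "white", "black", "white"],
  }

  def delivery_shares(log):
      """Fraction of impressions reaching each demographic group."""
      counts = Counter(log)
      total = sum(counts.values())
      return {group: round(n / total, 2) for group, n in counts.items()}

  for ad, log in delivery_logs.items():
      print(ad, delivery_shares(log))
  # ad_A {'white': 0.83, 'black': 0.17}  <- skew worth investigating
  # ad_B {'white': 0.5, 'black': 0.5}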

Best Practice #2: Ensure Audience Selection Tools Don’t Discriminate

If the digital platform offers audience categorization functionality, ensure that those tools don’t segment audiences based on consumers’ protected characteristics or on “close proxies” that enable the platform to infer those characteristics. Consider using platforms, such as Meta and Google, that have taken measures to flag housing ads and ensure that housing advertisers aren’t offered audience categorization tools that pose potential FHA problems. Ask the platform provider whether it has this flagging function and regularly audits it.
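
A “close proxy” is a targeting attribute that statistically stands in for a protected characteristic. As a crude illustration (real audits use far more rigorous statistical methods), screening for one can be thought of as measuring how often a candidate attribute co-occurs with a protected trait:

  # Crude proxy check: how often does a targeting attribute line up with
  # a protected characteristic? (Data is invented for illustration.)
  audience = [
      {"likes_spanish_language_tv": True,  "hispanic_origin": True},
      {"likes_spanish_language_tv": True,  "hispanic_origin": True},
      {"likes_spanish_language_tv": False, "hispanic_origin": False},
      {"likes_spanish_language_tv": False, "hispanic_origin": False},
  ]

  def agreement_rate(records, attr, protected):
      """Share of records where the attribute matches the protected trait."""
      matches = sum(1 for r in records if r[attr] == r[protected])
      return matches / len(records)

  # A rate near 1.0 suggests the attribute is a close proxy and shouldn't
  # be used to include or exclude audiences for a housing ad.
  print(agreement_rate(audience, "likes_spanish_language_tv",
                       "hispanic_origin"))  # -> 1.0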

Best Practice #3: Ensure Custom & Mirror Audience Tools Don’t Discriminate

HUD recommends that landlords “carefully consider the source, and analyze the composition, of audience datasets used for custom and mirror audience tools for housing-related ads.” In other words, ensure that custom and mirror data isn’t compiled on the basis of protected characteristics. The guidance also advises making “considered use of any tools provided by the ad platform for evaluating the projected demographics of a targeted audience.” Seek out platform providers that are aware of the risks and have taken steps to disable these functions to guard against potential liability.
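
What might that composition analysis look like? Here’s a minimal sketch, with hypothetical percentages, that compares a source audience’s demographics against the market it was drawn from:

  # Sketch of a composition analysis: compare the source audience's
  # demographics with the market it's drawn from. (Numbers are made up.)
  market_demographics = {"white": 0.55, "black": 0.25, "hispanic": 0.20}
  source_demographics = {"white": 0.95, "black": 0.03, "hispanic": 0.02}

  def representation_ratios(source, market):
      """Ratio of each group's share in the source list vs. the market."""
      return {g: round(source.get(g, 0) / market[g], 2) for g in market}

  # Ratios far below 1.0 mean a group is badly underrepresented in the
  # source list; a mirror audience built from it will inherit that gap.
  print(representation_ratios(source_demographics, market_demographics))
  # -> {'white': 1.73, 'black': 0.12, 'hispanic': 0.1}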

Best Practice #4: Make Sure Ad Platform Knows You’re Advertising Housing

Ad platforms wired to minimize FHA liability risks should have some kind of mechanism to alert the system that a proposed ad relates to housing and requires special treatment. So, be careful to follow the ad platform’s instructions to ensure that this mechanism and the accompanying risk mitigation measures actually activate.
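
What this looks like varies by platform; Meta, for example, requires housing ads to be declared as a special ad category. The sketch below uses an entirely hypothetical API, invented for illustration, to show the general shape of the step that’s easy to skip:

  # Hypothetical ad-platform client; "AdPlatformClient" and its fields
  # are invented for illustration and don't match any real API.
  class AdPlatformClient:
      def create_campaign(self, name, ad_category=None):
          if ad_category == "HOUSING":
              # Declaring the category is what switches the platform into
              # its restricted, fair-housing-aware targeting interface.
              print(f"{name}: restricted housing targeting enabled")
          else:
              print(f"{name}: WARNING - housing risk-mitigation "
                    "measures are NOT active")

  client = AdPlatformClient()
  client.create_campaign("Maple Court Apts.")                         # risky
  client.create_campaign("Maple Court Apts.", ad_category="HOUSING")  # right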

Best Practice #5: Monitor Your Own Housing Ads & Campaigns

If possible, landlords should monitor their own ads and advertising campaigns “to identify and mitigate discriminatory outcomes.” For example, a digital ad to which only white prospects from wealthy neighborhoods respond could be a signal that something is amiss in the audience selection or ad delivery function.
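
One low-tech way to do that is to compare who responds to an ad against the demographics of the market you advertised in. The sketch below is a rough screening heuristic with made-up numbers, not a legal test:

  # Rough campaign self-audit: compare respondent demographics with the
  # demographics of the advertised market. (Numbers are hypothetical.)
  market = {"white": 0.55, "black": 0.25, "hispanic": 0.20}
  respondents = {"white": 48, "black": 1, "hispanic": 1}

  total = sum(respondents.values())
  for group, market_share in market.items():
      observed = respondents.get(group, 0) / total
      ratio = observed / market_share
      flag = "  <- investigate" if ratio < 0.5 else ""
      print(f"{group}: market {market_share:.0%}, responses {observed:.0%}{flag}")
  # A group responding at a small fraction of its market share signals
  # that audience selection or ad delivery may be skewed.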

Take The Quiz Now

June 2024 Coach's Quiz