The Guru Is In

Drawing a Red Line Around That Area

By David A. Smith

If my instincts are correct, a seven-page complaint filed on March 28, 2019, HUD v. Facebook, may one day be seen as an industry-disrupting legal event on par with U.S. v. Microsoft (1998) and U.S. v. IBM (1969). HUD accuses Facebook of violating the Fair Housing Act’s prohibitions on discrimination:

Ads for housing and housing-related services are shown to large audiences that are severely biased based on characteristics protected by the Act, such as audiences of tens of thousands of users that are nearly all men or nearly all women.

Even making due allowance for this being plaintiff’s opening salvo, the evidence it mounts, taken directly from Facebook’s terms of service, makes grim reading. The exclusionary discrimination is done by Facebook’s search and selection algorithms for deciding which Facebook users are shown, or denied, which ads:

[Facebook] collects millions of data points about its users, draws inferences about each user based on this data, and then charges advertisers for the ability to microtarget ads to users based on [Facebook’s] inferences about them.
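
For those of us who write software, the pipeline the complaint describes is familiar: behavioral signals in, inferred traits out, microtargeted access sold on top. A deliberately crude Python sketch — the hard-coded rules stand in for the learned models a real platform would use, and every name in it is invented:

```python
# Deliberately crude sketch of "drawing inferences" from collected data.
# Hard-coded rules stand in for learned models; every name is invented.

def infer_traits(signals: dict) -> set:
    """Map raw signals (likes, languages, check-ins) to inferred traits."""
    traits = set()
    likes = set(signals.get("pages_liked", []))
    if "Mortgage Calculator Tips" in likes:
        traits.add("likely_homebuyer")
    if "Retirement Planning" in likes:
        traits.add("likely_over_50")
    if signals.get("primary_language", "en") != "en":
        traits.add("non_english_speaker")  # an easy proxy for national origin
    return traits

def microtarget(traits: set, advertiser_filter: set) -> bool:
    """What the platform sells: access to users whose inferred traits match."""
    return advertiser_filter <= traits

user = {"pages_liked": ["Mortgage Calculator Tips"],
        "primary_language": "es"}
print(microtarget(infer_traits(user), {"likely_homebuyer"}))  # True
```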

Facebook also tells its users it’ll be doing this:

[Facebook] “use[s] location-related information, such as your current location, where you live, the places you like to go and the businesses and people you’re near to provide, personalize and improve our Products, including ads, for you and others.”

To hear HUD tell it, Facebook does this to make money, charging more based on the effectiveness of its discrimination:

[Facebook] charges advertisers different prices to show the same ad to different users. The price to show an ad to a given user is based, in large part, on how likely [Facebook] believes that user is to interact with the particular ad.
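
The pricing mechanics are just as easy to sketch. Here is a toy engagement-weighted auction, assuming a generalized second-price design; the numbers and names are invented, not Facebook's actual system, but they show how the same ad comes to cost different amounts for different users:

```python
# Illustrative sketch only: a toy engagement-weighted ad auction.
# Assumes a generalized second-price design with a predicted-click
# model; names and numbers are invented, not Facebook's actual system.

from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    bid_per_click: float  # what the advertiser offers per click
    p_click: float        # platform's predicted chance this user clicks

def run_auction(bids: list) -> tuple:
    """Rank bids by expected revenue (bid x predicted engagement).

    Assumes at least two bids. The winner's price depends on how likely
    this particular user is to interact -- the dynamic the complaint
    describes: the same ad costs different amounts to show to
    different users.
    """
    ranked = sorted(bids, key=lambda b: b.bid_per_click * b.p_click,
                    reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    # Generalized second price: pay just enough to beat the runner-up.
    price = runner_up.bid_per_click * runner_up.p_click / winner.p_click
    return winner.advertiser, round(price, 2)

# The same two ads, auctioned for two users with different predicted
# engagement:
user_a = [Bid("HousingCo", 2.00, 0.08), Bid("OtherCo", 1.50, 0.05)]
user_b = [Bid("HousingCo", 2.00, 0.02), Bid("OtherCo", 1.50, 0.05)]
print(run_auction(user_a))  # ('HousingCo', 0.94): cheap; user likely to click
print(run_auction(user_b))  # ('OtherCo', 0.8): HousingCo priced out here
```

The division by the winner's predicted click rate is the tell: the less likely the model thinks a user is to engage, the more it costs to reach that user at all.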

Not only is gender explicitly a selection criterion, but ethnicity or race can almost certainly be deduced or profiled:

As [Facebook] explains, its advertising platform enables advertisers to “[r]each people based on… zipcode…age and gender… specific languages…the interests they’ve shared, their activities, the Pages they’ve like[d]…[their] purchase behaviors or intents, device usage and more.”

If you want to discriminate, one click does the trick:

[Facebook] has provided a toggle button that enables advertisers to exclude men or women from seeing an ad, a search-box to exclude people who do not speak a specific language from seeing an ad…

The most user-friendly of all, and most advertiser-selectable, is by location:

…and a map tool to exclude people who live in a specified area from seeing an ad by drawing a red line around that area.

When they hit that phrase, the Fair Housing and Equal Opportunity attorneys must have chortled.
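
The irony lands harder because, to a developer, the feature is utterly mundane: a map-drawn exclusion zone reduces to a point-in-polygon test. A minimal sketch using the standard ray-casting algorithm — the coordinates and users are, of course, invented:

```python
# Illustrative sketch: how a map-drawn exclusion zone reduces to a
# point-in-polygon test. Purely hypothetical -- not Facebook's code.

def inside(point, polygon):
    """Ray-casting test: does `point` fall inside `polygon`?

    `polygon` is a list of (x, y) vertices; `point` is (x, y),
    e.g. (longitude, latitude) for a user's inferred home location.
    """
    x, y = point
    crossings = 0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges a rightward ray from the point would cross.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                crossings += 1
    return crossings % 2 == 1

# The "red line" an advertiser draws becomes an exclusion predicate:
red_line = [(0, 0), (4, 0), (4, 4), (0, 4)]        # hypothetical area
audience = [("user1", (2, 2)), ("user2", (7, 1))]  # (id, location)
eligible = [uid for uid, loc in audience if not inside(loc, red_line)]
print(eligible)  # ['user2'] -- user1 lives inside the red line, excluded
```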

Several of the defenses Facebook will surely mount appear belied by its own guide:

It’s not about housing. But it is.

[Facebook] promotes its advertising platform with “success stories,” including stories from a housing developer, a real estate agency, a mortgage lender, a real-estate-focused marketing agency and a search tool for rental housing.

It’s not us, it’s the advertisers. 

[Facebook] alone, not the advertiser, determines which users will constitute the “actual audience” for each ad…Even if an advertiser tries to target an audience that broadly spans protected class groups, [Facebook’s] ad delivery system will not show the ad to a diverse audience if the system considers users with particular characteristics most likely to engage with the ad. If the advertiser tries to avoid this problem by specifically targeting an unrepresented group, the ad delivery system will still not deliver the ad to those users, and it may not deliver the ad at all.
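
That passage rewards a developer's translation. In the toy model below, the advertiser targets a broad, mixed audience, but delivery fills a limited number of impressions with whoever a hypothetically skewed engagement model scores highest; the scores are invented, the narrowing is the point:

```python
# Illustrative sketch of the dynamic the complaint alleges: even with
# broad targeting, engagement-optimized delivery can concentrate an ad
# on one group. Toy model only; the scores below are invented.

def deliver(ad_slots, targeted_users, p_engage):
    """Fill limited ad slots with the users the model scores highest."""
    ranked = sorted(targeted_users, key=lambda u: p_engage[u], reverse=True)
    return ranked[:ad_slots]

# The advertiser targets everyone (a broad, mixed audience)...
targeted = ["m1", "m2", "m3", "w1", "w2", "w3"]

# ...but suppose the learned model scores one group's engagement higher,
# e.g. because past impressions of similar ads skewed that way:
p_engage = {"m1": 0.09, "m2": 0.08, "m3": 0.07,
            "w1": 0.03, "w2": 0.02, "w3": 0.01}

actual = deliver(ad_slots=3, targeted_users=targeted, p_engage=p_engage)
print(actual)  # ['m1', 'm2', 'm3'] -- the ad never reaches w1..w3
```

Note the feedback loop, too: today's skewed delivery generates tomorrow's training data.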

We have clean hands. Except that the better Facebook’s customers discriminate, the more ad revenue Facebook makes.

It’s not intentionally discriminatory. That’ll be tough to win against the backdrop of disparate impact and affirmatively furthering fair housing.

We can fix this easily. That requires overturning Facebook’s entire business model, which is founded upon machine-learned discrimination – and what happens to the profits then?

Trust us, we’re the good guys.  This is not a good time for Facebook to mention trust.

Although Facebook will put a brave face on this, the auguries are unpropitious. U.S. v. IBM, filed in 1969, took 13 years to litigate, with 950 witnesses and 30,000,000 pages of documents, and by the time it was withdrawn in 1982 IBM had lost its primacy. Ironically, the loss came at the hands of the very upstart, Microsoft, that 16 years later would become entangled in its own seven-year government litigation. In the face of threats to break the company up, Microsoft agreed to a Department of Justice settlement a year after Bill Gates stepped down as CEO.

David A. Smith is founder and CEO of the Affordable Housing Institute, a Boston-based global nonprofit consultancy that works around the world (60 countries so far) accelerating affordable housing impact via program design, entity development and financial product innovations. Write him at dsmith@affordablehousinginstitute.org.