
Emerging Fair Housing Issues: AI and Technology


Key Takeaways

  • Algorithms discriminate through training data bias, proxy variables, and decision opacity.
  • Landlords bear liability for every tool they use, including third-party screening algorithms; compliance cannot be outsourced to the vendor.
  • The FCRA requires an articulable reason for any denial; "the algorithm decided" is not a permissible explanation.
  • Compliance requires due diligence, transparency, override capability, and quarterly disparate impact monitoring.
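The monitoring takeaway above can be made concrete. One common screening benchmark is the four-fifths (80%) rule, which flags a group whose approval rate falls below 80% of the highest group's rate; it originates in EEOC employment guidance but is often borrowed as a rough first-pass check in housing analytics. The sketch below is illustrative only: the group labels, counts, and the choice of the 80% threshold are assumptions, not part of this lesson.

```python
# Hypothetical sketch of a quarterly disparate impact check using the
# four-fifths (80%) rule as a rough screening benchmark.
# Group names and approval counts below are illustrative.

def adverse_impact_ratios(outcomes):
    """outcomes: {group: (approved, applied)} -> {group: approval-rate ratio
    relative to the highest-rate group}."""
    rates = {g: approved / applied for g, (approved, applied) in outcomes.items()}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# One quarter of (approved, applied) counts per applicant group.
quarterly = {
    "group_a": (90, 100),   # 90% approval rate
    "group_b": (60, 100),   # 60% approval rate
}

ratios = adverse_impact_ratios(quarterly)
for group, ratio in ratios.items():
    # Ratios below 0.8 warrant a closer disparate impact review.
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: ratio={ratio:.2f} ({flag})")
```

A ratio below 0.8 is not proof of unlawful discrimination; it is a trigger for the kind of closer review and override process the takeaways describe.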

Test Your Knowledge

1. How can algorithms used in tenant screening violate fair housing law?

2. What emerging regulatory approach addresses algorithmic bias in housing decisions?

3. What is the landlord's liability when a third-party screening service's algorithm produces discriminatory outcomes?