NFHTA Forum: Mining the Data: Algorithmic Bias in Housing-Related Transactions
Algorithmic data systems are increasingly relied on to screen prospective rental applicants, underwrite home mortgage loans, insure residential properties, and perform other housing-related functions. Data sets and algorithms that are unrepresentative, insufficient, or biased perpetuate discriminatory outcomes, including denial of housing opportunity, unfair terms and pricing, and continued housing segregation. Understanding how technological bias manifests and how it affects members of protected classes is essential to enforcing fair housing laws effectively and to counteracting emerging tools that can perpetuate discrimination. The National Fair Housing Training Academy (NFHTA) and the National Fair Housing Alliance (NFHA) hosted a public forum examining bias in data and technology and solutions for eliminating injustice in algorithmic systems. Over 575 fair housing partners joined us in this critical conversation.