FOR IMMEDIATE RELEASE
April 28, 2023
Contact: Izzy Woodruff | iwoodruff@nationalfairhousing.org
NFHA Supports Federal Government’s Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems
Washington, D.C. — This week, the Equal Employment Opportunity Commission (EEOC), the Federal Trade Commission (FTC), the Consumer Financial Protection Bureau (CFPB), and the Department of Justice's (DOJ) Civil Rights Division issued a joint statement on enforcement efforts to protect against discrimination and bias in automated systems, including machine learning, statistical models, and artificial intelligence (AI) systems.
“The National Fair Housing Alliance applauds these agencies for their efforts to address the use of artificial intelligence in housing and financial services, not only recognizing the potential benefits of AI but also emphasizing the need to avoid discrimination and other harms through principles, such as transparency, fairness, and accountability, that should guide the development and deployment of AI systems. As innovative technologies become embedded in housing and lending processes, it is crucial that technologists resolve to develop and deploy responsible innovation that complies with federal civil rights, fair competition, consumer protection, and equal opportunity laws,” said Lisa Rice, President and CEO of the National Fair Housing Alliance (NFHA).
“While the agencies recognize data, models, and their application as the main gateways to algorithmic bias, it is essential that companies developing automated systems and our federal agencies recognize post-deployment monitoring as an effective practice for mitigating algorithmic bias,” said Dr. Michael Akinwumi, NFHA’s Chief Tech Equity Officer.
NFHA has developed a Purpose, Process, and Monitoring (PPM) framework aimed at mitigating consumer harm in automated systems. NFHA has called on all companies using or developing automated systems to adopt its framework, the National Institute of Standards and Technology's (NIST) AI Risk Management Framework (AI RMF), and the White House Office of Science and Technology Policy's (OSTP) Blueprint for an AI Bill of Rights as a way forward to effectively avoid consumer harms associated with automated systems.
“It is necessary for all relevant stakeholders — industry partners, policymakers, researchers, and advocates — to collaborate to ensure that AI is used in a way that promotes equal opportunity, fairness, and accountability,” said Snigdha Sharma, NFHA’s Senior Tech Equity Analyst.
###
The National Fair Housing Alliance (NFHA) is the country’s only national civil rights organization dedicated solely to eliminating all forms of housing and lending discrimination and ensuring equal opportunities for all people. As the trade association for over 170 fair housing and justice-centered organizations and individuals throughout the U.S. and its territories, NFHA works to dismantle longstanding barriers to equity and build diverse, inclusive, well-resourced communities.