
NFHA’s Chief Responsible AI Officer To Testify Before U.S. Commission on Civil Rights Regarding Use of Facial Recognition Technology


March 8, 2024
Press Contact: Janelle Brevard | jbrevard@nationalfairhousing.org

Washington, D.C. — Today, Michael Akinwumi, Chief Responsible AI Officer for the National Fair Housing Alliance (NFHA), will testify before the U.S. Commission on Civil Rights regarding civil rights implications of the federal use of facial recognition technology (FRT).

Facial recognition technology, when used responsibly and ethically, may offer enhanced security and efficiency in housing. However, it also poses serious civil rights risks, including racial bias in misidentification, invasion of privacy, and unwarranted surveillance, which can lead to housing discrimination and unjust law enforcement practices. In his testimony, Akinwumi urges the Commission to conduct a thorough investigation into the application of FRT in public housing, its associated risks, and its impact on communities of color.

Recent advances in face recognition, detection, and analysis are largely driven by Artificial Intelligence (AI) technologies. In particular, much of FRT's progress comes from deep learning algorithms, a subset of machine learning that uses layered (or “deep”) neural networks to analyze various forms of data, enabling computers to learn from experience and understand the world in terms of a hierarchy of concepts. Though there are significant risks of bias and discrimination in AI systems, including the face recognition models behind FRT, the risks are not insurmountable. Akinwumi, on behalf of NFHA, urges the Commission to ask Congress to enact comprehensive legislation to advance Responsible AI through a robust framework emphasizing fairness, privacy, equality, and equity, so that AI technologies benefit society without causing harm.
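To illustrate the mechanism behind the misidentification risk described above: deep-learning FRT systems typically reduce each face image to an embedding vector, then declare a "match" when two embeddings clear a fixed similarity threshold. The sketch below is a hypothetical simplification (the function names, vectors, and threshold value are illustrative, not any vendor's actual system); when the underlying model is trained on unrepresentative data, embeddings for some demographic groups cluster more tightly, so a single global threshold can yield higher false-match rates for those groups.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(emb_a, emb_b, threshold=0.6):
    """Declare a 'match' when similarity clears a fixed global threshold.

    A threshold calibrated on unrepresentative training data is one way
    demographic differences in error rates can arise.
    """
    return cosine_similarity(emb_a, emb_b) >= threshold
```

Auditing requirements of the kind NFHA recommends would examine, among other things, how such thresholds perform across demographic groups rather than only in aggregate.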

NFHA urges the Commission to advocate for legislative action to address and mitigate the risks of FRT bias, promoting the establishment of fairer and more equitable technological frameworks by:

  1. Ensuring Strong Civil and Human Rights Protections;
  2. Ensuring Compliance with Existing Civil Rights and Consumer Protection Laws;
  3. Integrating the Review of Equity into the Algorithm’s Lifecycle and Establishing Auditing Requirements for AI, including FRT, in Housing and Lending;
  4. Promoting Effective Training for the Federal Workforce;
  5. Ensuring Equitable Digital Access, Public Data Access, Transparency and Explainability;
  6. Ensuring Technologies Developed Outside of the U.S. Adhere to U.S. Rules and Regulations;
  7. Improving Consumers’ Ability to Have Agency Over Their Data; and
  8. Investigating the Impact of FRT-equipped Surveillance Technologies in Public Housing.

Technological innovations can significantly benefit people, society, and the economy. However, deploying facial recognition systems without proper rights-preserving protocols, testing, and oversight can disenfranchise underserved groups by unfairly denying them access to housing, credit, and other high-stakes opportunities and services. In light of the economic and social repercussions of racial inequalities exacerbated by FRT, Akinwumi advocates for the Commission to recommend comprehensive guidelines to ensure technological advancements benefit all citizens, safeguarding civil rights and fostering economic growth.

Akinwumi’s full testimony can be found here.

The National Fair Housing Alliance (NFHA) is the country’s only national civil rights organization dedicated solely to eliminating all forms of housing and lending discrimination and ensuring equal opportunities for all people. As the trade association for over 170 fair housing and justice-centered organizations and individuals throughout the U.S. and its territories, NFHA works to dismantle longstanding barriers to equity and build diverse, inclusive, well-resourced communities.