Tech Equity Initiative

Building Equity and Upward Mobility by Making Sure the Technologies That Impact Our Lives Are Fair

Many of the technologies used in housing and financial services are not fair to women and people of color, yet they impact every area of our lives. From determinations about whether people can get loans, or how much they will pay for them, to whether a sick patient can get the healthcare they need, algorithms drive important decisions. A mathematical formula can dictate whether a new college graduate can get a job, a single mom can rent an apartment, or a family can get insurance for their new home. Algorithms can be life-altering. That’s why it is so important that they are fair and do not inject bias into the process.

Many factors contribute to unfair and disparate outcomes in tech, including structural barriers like residential segregation, the racial wealth gap, and the dual credit market. But adopting solutions that reduce or eliminate discriminatory impacts on consumers can help advance equitable opportunities for everyone. Click the button below to learn more about the structural barriers that drive unfair outcomes in tech.

This Initiative Focuses on Five Main Goals:

  • Developing solutions for removing bias from the technologies that shape our lives
  • Increasing transparency and explainability for AI tools
  • Advancing research to help reduce bias in tech
  • Developing policies that promote more effective oversight for AI tools
  • Supporting efforts to increase diversity, equity, and inclusion in the tech field

Upcoming Events

Attend our national conference on October 6, 2020 to hear from leading tech, fair lending, and civil rights experts about implementing solutions for advancing tech equity. Click the button below to register and learn more about the conference.

Join Our Team

If you are interested in being part of a dynamic team working to eliminate bias, increase transparency and explainability, and promote responsible policies in the tech space, check out our career opportunities.