Sexist AI? What to do about gender-based algorithmic bias in the financial sector

November 11, 2020

By Sonja Kelly, Director of Research and Advocacy, Women’s World Banking

Bias happens. It is widely discussed across the world as different industries use machine learning and artificial intelligence to increase efficiency in their processes. I’m sure you’ve seen the headlines. Amazon’s hiring algorithm systematically screened out women candidates. Microsoft’s Twitter bot grew so racist it had to leave the platform. Smart speakers do not understand people of color as well as they understand white people. Algorithmic bias is all around us, so it is no surprise that Women’s World Banking is finding evidence of gender-based bias in credit-scoring algorithms. With funding from the Visa Foundation, we are starting a workstream describing, identifying, and mitigating gender-based algorithmic bias that affects prospective women borrowers in emerging markets.

Categorizing people as “creditworthy” and “not creditworthy” is nothing new. The financial sector has always used proxies for assessing applicant risk. With the increased availability of big data and alternative data, lenders have more information from which to make decisions. Enter artificial intelligence and machine learning: tools that help sort through massive amounts of data and determine which factors are most important in predicting creditworthiness. Women’s World Banking is exploring the application of these technologies in the digital credit space, focusing mostly on smartphone-based services that have proliferated globally in recent years. For these companies, available data might include an applicant’s list of contacts, GPS information, SMS logs, app download history, phone model, available storage space, and other data scraped from mobile phones.
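To make this concrete, here is a minimal sketch in Python of how a digital lender might score creditworthiness from phone-derived signals. Everything in it is an illustrative assumption on our part: the feature names, the synthetic data, and the simple logistic regression stand in for whatever a real provider actually uses.

```python
# A minimal sketch of a smartphone-data credit scorer. All features and
# data are illustrative assumptions, not any provider's real pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=42)
n = 5_000

# Synthetic applicants: each column mimics a signal scraped from a phone.
X = np.column_stack([
    rng.poisson(150, n),              # number of contacts
    rng.poisson(40, n),               # SMS messages per week
    rng.integers(5, 120, n),          # apps installed
    rng.normal(32, 20, n).clip(0),    # free storage in GB
])

# Assumed ground truth: repayment loosely tied to two of the signals.
logits = 0.004 * X[:, 0] + 0.01 * X[:, 1] - 0.2
repaid = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, repaid, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]   # predicted repayment probability
print(f"AUC on held-out applicants: {roc_auc_score(y_test, scores):.3f}")
```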

Digital credit offers promise for women. Women-owned businesses make up one-third of SMEs in emerging markets but receive a disproportionately small share of available credit. Getting available credit to women is a challenge: loan officers approve smaller loans for women than they do for men, and women incur harsher penalties for mistakes like missed payments. Digital credit assessment can take this human bias out of the equation. Deployed well, it can reach thin-file customers and women previously rejected because of human bias.

“Deployed well,” however, is not so easily achieved. Maria Fernandez-Vidal from CGAP and data scientist consultant Jacobo Menajovsky emphasize that, “Although well-developed algorithms can make more accurate predictions than people because of their ability to analyze multiple variables and the relationships between them, poorly developed algorithms or those based on insufficient or incomplete data can easily make decisions worse.” To this we can add the element of time: bias can be amplified as algorithms iterate on what they learn, as the toy simulation below illustrates. In the best-case scenario, digital credit offers promise for women consumers. In the worst-case scenario, the exclusive use of artificial intelligence and machine learning systematically excludes underrepresented populations, in particular women.
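Here is a toy simulation, entirely of our own construction, of one such feedback loop: a lender retrains its model each round using repayment records from approved applicants only. The gender proxy feature, the 25 percent label-bias rate, and the 0.6 approval threshold are all assumptions chosen to make the dynamic visible, not estimates from any real portfolio.

```python
# Toy feedback loop (illustrative assumptions only): rejected applicants
# never generate repayment data, so an initial gap in approvals shrinks
# women's share of the training data round after round.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def cohort(n=2_000):
    """One round of synthetic applicants."""
    woman = rng.integers(0, 2, n)
    signal = rng.normal(0, 1, n)                  # ability signal the lender sees
    proxy = woman * -0.8 + rng.normal(0, 1, n)    # feature correlated with gender
    repaid = (signal + rng.normal(0, 0.7, n) > -0.4).astype(int)
    # Assumed label bias: harsher recorded penalties mean some repayments
    # by women are logged as defaults.
    repaid[(woman == 1) & (rng.random(n) < 0.25)] = 0
    return np.column_stack([signal, proxy]), repaid, woman

# Seed the labeled pool with one fully observed historical cohort.
X_lab, y_lab, w_lab = cohort()

for rnd in range(5):
    model = LogisticRegression().fit(X_lab, y_lab)
    X, y, w = cohort()
    approved = model.predict_proba(X)[:, 1] > 0.6
    # Only approved applicants produce repayment labels for retraining.
    X_lab = np.vstack([X_lab, X[approved]])
    y_lab = np.concatenate([y_lab, y[approved]])
    w_lab = np.concatenate([w_lab, w[approved]])
    print(f"round {rnd}: women approved {approved[w == 1].mean():.0%} "
          f"vs men {approved[w == 0].mean():.0%}; "
          f"women are {w_lab.mean():.0%} of training data")
```

The point is not the specific numbers but the direction: because the model never observes how rejected applicants would have repaid, it has no data with which to correct its view of the group it under-approves.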

It is easy to see this problem and jump to regulatory conclusions. But as Women’s World Banking explores this topic, we are starting first with the business case for mitigating algorithmic bias. This project on gender-based algorithmic bias seeks to understand the following:

  1. Setting up an algorithm: How does bias emerge, and how does it grow over time?
  2. Using an algorithm: What biases do classification methods introduce?
  3. Maintaining an algorithm: What are ways to mitigate bias?

Our working assumption is that fairer algorithms may bring increased profits over the long term. If algorithms can help digital credit companies serve previously unreached customers, new businesses can grow, consumers can access larger loans, and the industry can expand into new markets. Digital credit, with more inclusive algorithms, can provide credit to the elusive “missing middle” SMEs, a third of which are women-owned.

How are we investigating this topic? First, we are (and have been, with thanks to those who have already participated!) conducting a series of key informant interviews with fintech innovators, thought leaders, and academics. This is a new area for Women’s World Banking, and we want to ensure that our work builds on the insights of others both within and outside the financial services industry. Next, we are constructing a synthetic dataset modeled on the data typically scraped from smartphones and applying off-the-shelf algorithms to understand how various approaches change the balance between fairness and efficiency, both at a single point in time and across time as an algorithm continues to learn and grow. Finally, we are synthesizing these findings in a report and an accompanying dynamic model that demonstrates bias, coming in the next couple of months.
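As a preview of the kind of analysis the report will contain, the sketch below, again on synthetic data with assumed parameters, trains a scorer on labels that record harsher penalties for women and then compares a single global approval threshold against group-specific thresholds that approve the same share of each group, reporting the approval-rate gap (fairness) alongside accuracy against true repayment (efficiency).

```python
# Fairness-versus-efficiency comparison on synthetic data. All parameters
# (proxy strength, 25% label bias, thresholds) are assumed for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
n = 10_000
woman = rng.integers(0, 2, n)                        # synthetic gender flag
ability = rng.normal(0, 1, n)                        # same distribution for all
true_repaid = (ability + rng.normal(0, 0.7, n) > -0.4).astype(int)

# Assumed label bias: some repayments by women are recorded as defaults.
label = true_repaid.copy()
label[(woman == 1) & (rng.random(n) < 0.25)] = 0

# Lender's features: a noisy ability signal plus a gender-correlated proxy.
X = np.column_stack([ability + rng.normal(0, 0.5, n),
                     woman * -0.8 + rng.normal(0, 1, n)])

model = LogisticRegression().fit(X, label)           # trained on biased labels
scores = model.predict_proba(X)[:, 1]

def report(name, approve):
    gap = approve[woman == 1].mean() - approve[woman == 0].mean()
    acc = accuracy_score(true_repaid, approve.astype(int))
    print(f"{name}: approval gap (women - men) = {gap:+.3f}, accuracy = {acc:.3f}")

# Efficiency-only baseline: one global threshold on the biased scores.
report("global threshold", scores > 0.5)

# One mitigation: per-group thresholds approving the same share of each
# group, a demographic-parity-style adjustment.
cutoffs = {g: np.quantile(scores[woman == g], 0.5) for g in (0, 1)}
report("group thresholds", scores > np.array([cutoffs[g] for g in woman]))
```

Equalized approval rates are only one of several fairness definitions, and the right mitigation depends on context; the sketch simply shows how such trade-offs can be measured.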

We’d love to hear from you—if you want to have a chat with us about this workstream, or if you just want to be kept in the loop as we move forward, please feel free to reach out to me, Sonja Kelly, at sk@womensworldbanking.org.