Algorithmic Bias, Financial Inclusion, and Gender

January 18, 2024

By Sonja Kelly, Director of Research and Advocacy, and Mehrdad Mirpourian, Senior Data Analyst

Discussion of artificial intelligence (AI) as a driving force for the economy and society has intensified, as evidenced by the more than two dozen AI-focused sessions at the 2024 World Economic Forum in Davos. In 2020, we began a journey to understand algorithmic bias as it relates to women’s financial inclusion. What is it? Why does it matter especially now? Where does it emerge? How might it be mitigated? This topic is especially important as we speed into a digital finance future. Women are less likely than men to own a phone, less likely to own a smartphone, and less likely to access the internet. Under these conditions, there is no guarantee that digital credit underwriting will keep women’s digital constraints in mind. We focused our inquiry on the risks of algorithm-based underwriting to women customers. Today, we’re sharing what we’ve learned and where this research is taking Women’s World Banking in the future.

In Algorithmic Bias, Financial Inclusion, and Gender: A primer on opening up new credit to women in emerging economies, we emphasize that finding bias is not as simple as finding a decision to be “unfair.” In fact, there are dozens of definitions of gender fairness, from keeping gendered data out of credit decisions to ensuring equal likelihood of granting credit to men and women. We began by defining fairness because financial services providers first need to articulate what they mean when they say they pursue it.
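To make the point concrete, here is a minimal sketch in Python that contrasts two common fairness definitions: demographic parity, which compares raw approval rates across genders, and equal opportunity, which compares approval rates only among applicants who actually repaid. The data and column names are entirely hypothetical, chosen for illustration.

```python
import pandas as pd

# Hypothetical loan decisions: gender, actual repayment outcome ('repaid'),
# and the model's decision ('approved'). All values are illustrative.
df = pd.DataFrame({
    "gender":   ["F", "F", "F", "M", "M", "M", "F", "M"],
    "repaid":   [1,   1,   0,   1,   0,   1,   1,   1],
    "approved": [1,   0,   0,   1,   1,   1,   0,   1],
})

# Demographic parity: are overall approval rates equal across genders?
approval_rates = df.groupby("gender")["approved"].mean()
parity_gap = approval_rates["M"] - approval_rates["F"]

# Equal opportunity: among applicants who actually repaid,
# are approval rates equal across genders?
repaid = df[df["repaid"] == 1]
true_positive_rates = repaid.groupby("gender")["approved"].mean()
opportunity_gap = true_positive_rates["M"] - true_positive_rates["F"]

print(f"Approval-rate gap (M - F): {parity_gap:.2f}")
print(f"True-positive-rate gap (M - F): {opportunity_gap:.2f}")
```

The two gaps can disagree with each other, which is exactly why an institution must choose and articulate its fairness definition before it can measure progress against it.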

Pursuing fairness starts with recognizing where biases emerge. One source of bias is the inputs used to create the algorithms: the data itself. Even if an institution does not use gender as an input, the data might be biased. Looking at the data that app-based digital credit providers collect gives us a picture of what biased data might include. Our analysis shows that the top digital credit companies in the world collect data on GPS location, phone hardware and software specifications, contact information, storage capacity, and network connections. All of these data sources might contain gender bias: as noted above, women are less likely to own a smartphone or to be connected to the internet, and they also carry more unpaid care responsibilities. Bias can also enter through the model specification itself, through parameters set by data scientists or developers. We heard from practitioners in our interview sample about mistakes that coders make, whether through inexperience or through subconscious biases, that all but guarantee bias in the model outputs. Finally, the model itself might introduce or amplify biases over time as it continues to learn from its own outputs.
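One inexpensive way to check whether supposedly gender-neutral inputs carry gender signal is to audit each feature’s correlation with gender in historical data. The sketch below is illustrative only; the feature names, values, and the 0.5 threshold are assumptions, not findings from our research.

```python
import pandas as pd

# Hypothetical feature table from an app-based lender. Even if 'gender'
# is excluded from the model, features such as device price or contact
# list size can correlate with gender and act as proxies for it.
df = pd.DataFrame({
    "gender_f":         [1, 1, 1, 0, 0, 0],  # held out of the model, kept for auditing
    "device_price_usd": [60, 45, 80, 220, 180, 250],
    "contacts_count":   [90, 120, 75, 310, 280, 350],
})

# Flag features whose absolute correlation with gender exceeds a chosen
# threshold. The 0.5 cutoff is illustrative; set it to match your risk appetite.
correlations = df.drop(columns="gender_f").corrwith(df["gender_f"]).abs()
proxies = correlations[correlations > 0.5]
print("Potential gender proxies:\n", proxies)
```

Pairwise correlation is a blunt instrument: a set of features can jointly encode gender even when no single correlation is large, so this kind of screen is a starting point rather than a complete audit.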

For institutions wanting to better approximate and understand their own biases in decision-making, Women’s World Banking has produced a guide for lenders navigating a rapidly changing credit landscape. Policymakers and data scientists alike can walk through its recommendations for detecting and mitigating bias, ensuring credit scoring methods are inclusive and do not unintentionally exclude women. Download the free guide here.

Many bias mitigation strategies are straightforward for financial institutions to implement, and they apply to algorithm developers and institutional management alike. For developers, mitigating algorithmic bias may mean de-biasing the data, creating audits or checks to sit alongside the algorithm, or running post-processing calculations to consider whether outputs are fair. For institutional management, it may mean asking for regular reports in plain language, working to be able to explain and justify gender-based discrepancies in the data, or setting up an internal committee to systematically review algorithmic decision-making. Mitigating bias requires intentionality at all levels, but it does not have to be time consuming or expensive.
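As one example of the kind of post-processing calculation a developer might run, the sketch below applies per-group score thresholds so that approval rates match a chosen target, rather than relying on a single global cutoff. The scores and the target rate are made up for illustration.

```python
import numpy as np

# Hypothetical model scores for two groups of applicants. A simple
# post-processing step: pick per-group thresholds so that each group's
# approval rate hits a target, instead of using one global cutoff.
scores_f = np.array([0.41, 0.55, 0.62, 0.70, 0.48])
scores_m = np.array([0.58, 0.66, 0.73, 0.81, 0.60])

target_approval_rate = 0.4  # illustrative policy choice

# Per-group threshold = the score quantile that approves the target share.
thr_f = np.quantile(scores_f, 1 - target_approval_rate)
thr_m = np.quantile(scores_m, 1 - target_approval_rate)

print(f"Threshold (women): {thr_f:.2f}, approval rate: {(scores_f >= thr_f).mean():.2f}")
print(f"Threshold (men):   {thr_m:.2f}, approval rate: {(scores_m >= thr_m).mean():.2f}")
```

Whether group-aware thresholds are appropriate depends on the jurisdiction and on the fairness definition the institution has adopted; the broader point is that checks like this are cheap to run alongside an existing model.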

Addressing prospective biases in lending is an urgent task for the financial services industry, and if institutions do not take it on themselves, future regulation will determine what bias mitigation looks like. If other industries provide a roadmap, financial services should be open and transparent about the biases that technology may either amplify or introduce. We should be forward thinking and reflective as we confront these new global challenges, even as we continue to actively leverage digital finance for financial inclusion.

Women’s World Banking remains committed to being part of the solution. The next phase of our work involves developing a curriculum for data scientists, specifically designed to help them detect and mitigate bias against rejected credit applicants in algorithms. Additionally, because no training program available today equips regulators to make sure financial and regulatory technologies work for women, we have developed a multi-month inclusive fintech program for regulators. Participants will gain an understanding of the key risks and opportunities posed by emerging technologies like AI, the tech trends impacting women’s financial inclusion, and the skills and support network to stay at the cutting edge of inclusive policy innovation. If you’re interested in supporting this work, click here. If you would like updates on our programs, sign up for our mailing list.