In a world of gender bias, Lendingkart’s AI-based credit model stands apart

March 11, 2022

By Sonja Kelly, Director of Research and Advocacy, Women’s World Banking

While undoubted progress has been made in some areas of gender equality, examples of everyday gender bias are still so prevalent that they almost go unnoticed. In the corporate world there is unequal pay and boardroom bias, and even emerging technologies like AI and voice recognition seem to be getting in on the bias act. For example, Women's World Banking research has uncovered that the way financial services providers lend money through artificial intelligence is slanted towards men, which explains, at least in part, the $1.7 trillion financing gap between male- and female-owned small and medium-sized enterprises (SMEs).

This is why our finding that Indian digital credit provider Lendingkart’s credit scoring model does not differentiate between men and women is both interesting and welcome, and points to a possible future of gender parity in financial services.

Lendingkart was founded on the goal of making it easier for entrepreneurs to access working capital to set up and grow their businesses, largely through unsecured loans, that is, loans that do not require any collateral. This is crucial in the world of women-owned businesses, where women are less likely than men to own assets in their own names.

Women's World Banking, itself a 40-year-old non-profit that works to include more women in the formal financial system, partnered with the University of Zurich to undertake an extensive audit of Lendingkart's credit scoring system. The team created criteria to assess "fairness," such as likelihood of approval, loan terms, and repayment rate. They then used advanced statistical techniques to test Lendingkart's underwriting model against these criteria, controlling for additional variables. Using the fairness criteria, Women's World Banking and Lendingkart could assess the likelihood of a hypothetical woman and a similar man proceeding through various points of the loan approval process. The result was parity. Where there was a slight gender imbalance, it was explained by a low volume of women SME credit applicants, not the scoring methodology itself (an important finding in its own right, as it reinforces the belief that women business owners are less likely to apply for loans than men).
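To make the approach concrete, here is a minimal, hypothetical sketch in Python of the kind of test this describes: train a scoring model on applicant data and then check whether its output changes when only the gender flag of an otherwise identical applicant is flipped. The data, feature names, and model below are invented for illustration; they are not Lendingkart's data or methodology.

```python
# A hypothetical sketch of a counterfactual fairness check: does the score of
# an otherwise identical applicant change when only the gender flag changes?
# All data, features, and the model are invented for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Synthetic, de-identified-style applicant data (all columns are made up).
applicants = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "monthly_revenue": rng.uniform(1.0, 50.0, n),    # hypothetical units
    "years_in_business": rng.integers(1, 15, n),
    "prior_defaults": rng.integers(0, 3, n),
})
# Placeholder approval outcome driven only by business variables, not gender.
logit = 0.05 * applicants["monthly_revenue"] - 0.8 * applicants["prior_defaults"]
applicants["approved"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

features = ["female", "monthly_revenue", "years_in_business", "prior_defaults"]
model = LogisticRegression(max_iter=1000)
model.fit(applicants[features], applicants["approved"])

# Counterfactual comparison: score every applicant as-is, then again with the
# gender flag flipped while everything else stays the same. Probabilities that
# barely move indicate the score does not depend on gender.
flipped = applicants.copy()
flipped["female"] = 1 - flipped["female"]
p_actual = model.predict_proba(applicants[features])[:, 1]
p_flipped = model.predict_proba(flipped[features])[:, 1]
print("Mean absolute change in approval probability when flipping gender:",
      np.abs(p_actual - p_flipped).mean())
```

A regression of outcomes on gender while controlling for business variables would complement this counterfactual comparison, which is broadly the "controlling for additional variables" idea described above.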

The findings were notable in two ways. The first is that achieving this level of fairness in a relatively new credit scoring model is rare; it usually takes time and iteration for a model to reach fairness, so achieving gender parity this early on is remarkable. The second is that accuracy and fairness go hand in hand, making the business case for gender fairness. Lendingkart focuses on making its credit scoring model as accurate as possible, and an outcome of that accuracy is gender parity. There is therefore a double upside for lenders: better decisions yielding better and more diverse customers.

As Lendingkart explains: “We actively train our credit scoring model to be as accurate as possible. The emphasis on accuracy has also translated into fairness across the most important and impactful dimensions. We are proud of the ways in which our credit scoring model treats women applicants with the same consideration it treats men applicants.”

The bias audit builds on Women's World Banking's recent study, Algorithmic Bias, Financial Inclusion, and Gender, which offers insights on where biases in AI emerge, how they are amplified, and the extent to which they work against women. The bias audit used advanced statistical techniques and reject inference analysis on de-identified borrower data, and concluded the following (a short illustrative sketch of these group-level checks appears after the list):

  • On average, women were about as likely to be approved for a loan as men were.
  • The credit scoring algorithm gave similar scores to men and women.
  • Gender had nearly no effect on loan terms, including loan size and interest rate.
  • Men and women customers of Lendingkart had the same repayment rate, unlike the market average, where men customers account for nearly twice the non-performing assets (NPA) that women customers do (7 percent NPA versus 4 percent NPA).
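As a rough illustration of what such group-level checks look like on de-identified records, the sketch below compares approval rate, average score, loan terms, and NPA rate by gender; similar values across the two groups are what parity looks like. The columns and numbers are entirely made up for illustration and are not the audit's data.

```python
# A hypothetical sketch of group-level parity comparisons by gender:
# approval rate, model score, loan terms, and repayment (NPA) rate.
# The records and values below are invented purely for illustration.
import pandas as pd

# De-identified-style records: one row per applicant (made-up data).
records = pd.DataFrame({
    "female":        [1, 0, 1, 0, 1, 0, 1, 0],
    "approved":      [1, 1, 0, 1, 1, 0, 1, 1],
    "credit_score":  [0.71, 0.69, 0.44, 0.73, 0.68, 0.41, 0.70, 0.72],
    "loan_size":     [3.0, 3.2, None, 3.1, 2.9, None, 3.0, 3.3],   # hypothetical units
    "interest_rate": [0.18, 0.18, None, 0.17, 0.18, None, 0.18, 0.17],
    "defaulted":     [0, 0, None, 0, 0, None, 0, 1],
})

# Group-level comparison: values that match closely across the two rows
# suggest parity on approvals, scores, terms, and repayment.
summary = records.groupby("female").agg(
    approval_rate=("approved", "mean"),
    mean_score=("credit_score", "mean"),
    mean_loan_size=("loan_size", "mean"),
    mean_interest_rate=("interest_rate", "mean"),
    npa_rate=("defaulted", "mean"),
)
print(summary)
```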

Setting aside any sort of moral, ethical, or "CSR" conversation for a moment, the financial numbers do not lie. Gender bias is an economic anchor and business inhibitor, so why does the financial industry persist in excluding and ignoring women? One overarching reason is that lenders do not look at their own data. Lendingkart has shown that it is possible to remove bias from credit scoring, so our call to action to lenders everywhere is to look at your data by gender and build fairness into your algorithms. We give practical tips on how to do that in our research paper Algorithmic Bias, Financial Inclusion, and Gender.