How mortgage algorithms perpetuate racial inequality in home lending



Lenders are more likely to deny mortgages to people of color than to white applicants with comparable financial profiles, according to a new report from the nonprofit newsroom The Markup. The racial disparity persisted even after reporters controlled for factors such as income and area of residence, as well as factors that lenders have previously said would explain the imbalance: debt-to-income ratio and combined loan-to-value ratio.

Reporters were unable to control for credit scores because of public data restrictions, but government regulators have determined that credit scores alone do not account for racial disparities in lending.

After analyzing more than two million conventional mortgage applications from 2019, reporters Emmanuel Martinez and Lauren Kirchner found that, nationwide, lenders were 40% more likely to deny Hispanic applicants, 50% more likely to deny Asian/Pacific Islander applicants, 70% more likely to deny Native American applicants, and 80% more likely to deny Black applicants than financially comparable white applicants. The disparity was even starker in some cities, such as Waco, Texas, where Hispanic applicants were 200% more likely to be denied than their white counterparts.
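Reporting of this kind typically rests on odds ratios from a statistical model that controls for financial characteristics. As a minimal sketch only, with made-up counts (the real analysis controlled for income, debt-to-income ratio, loan-to-value ratio, and other factors; nothing below is from The Markup's actual data), the raw version of the "X% more likely to be denied" metric looks like this:

```python
# Toy illustration of a denial odds ratio: the odds of denial for one group
# relative to a white reference group. All counts are hypothetical.

def denial_odds(denied: int, approved: int) -> float:
    """Odds of denial: denials per approval."""
    return denied / approved

def relative_denial_odds(group: tuple[int, int], reference: tuple[int, int]) -> float:
    """Odds ratio of denial for `group` vs. `reference`, each as (denied, approved)."""
    return denial_odds(*group) / denial_odds(*reference)

# Hypothetical counts: (denied, approved) per group.
white = (100, 900)      # odds of denial: 100/900
hispanic = (140, 900)   # odds of denial: 140/900

ratio = relative_denial_odds(hispanic, white)
print(f"Odds ratio: {ratio:.2f}")  # 1.40, i.e., "40% more likely to be denied"
```

An odds ratio of 1.0 would mean parity; the actual analysis computes this kind of ratio after holding the financial control variables constant, which is what makes the comparison "financially comparable."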

In an interview with Marketplace’s David Brancaccio, Martinez said biased algorithms are a major contributor to inequality.

“There is certainly a human element in all of this, but more and more, [lending] is controlled by algorithms. Loan officers collect all of a potential borrower’s financial characteristics and then enter them into the algorithm. But when you consider things like wealth and assets, research shows that white families have eight times the wealth of Black families. So when you use that as a barrier to entry, it will disproportionately affect Black families and other families of color,” Martinez said.

Below is an edited transcript of the interview.

Emmanuel Martinez: I found that people of color are more likely to be denied than their white counterparts, even when they look the same financially. And [that includes] the factors the lending industry has said would explain those disparities. For context, I’ve been researching this topic for the last four years, and I’ve worked with this data a lot. The first time I published this analysis, I found the same thing. Lenders told me they don’t deny people of color because of their race; they deny them because of things like debt-to-income ratio and combined loan-to-value ratio. And now, in this analysis, I have those two ratios, and I still find that people of color are being turned down at higher rates, even accounting for those two important financial characteristics.

David Brancaccio: Just so we’re clear: two people who look about the same on paper, with the only difference being whether they are white or a person of color, and that alone changes the odds of getting the loan or being turned down?

Martinez: Yes. I found that nationwide, Hispanic applicants are 40% more likely to be denied. And Black applicants fare worst of all: they are 80% more likely to be denied than their white counterparts, even with the same financial characteristics.

Brancaccio: You also found regional differences: Hispanics in Waco, Texas, are 200% more likely to be denied.

Martinez: Yes, I found that Waco, Texas, was the worst place for Hispanic applicants by my statistics. That’s different from, say, Boston, where they are 70% more likely to be denied. So it depends on where a person of color lives. In some places they are closer to parity, while in others, such as Waco, there is a large gap between Hispanics and their white counterparts.

Brancaccio: Yes, and you found that in Chicago, Black borrowers are 150% more likely to be denied a loan, so it really varies.

Martinez: Yes, it depends heavily on the region. Looking at America’s largest cities, Chicago is the worst for Black applicants, and Chicago is particularly interesting because so much of the history of redlining traces back to Chicago. But when you look at places like Denver, Colorado, or Sacramento, California, [Black applicants are] 60% more likely to be denied. So there are definitely places with inequality, but there are also places where the inequality is much worse.

Brancaccio: I saw you quoted the American Bankers Association. They said the data you can get is still limited, and they don’t believe you have proven that the system is discriminatory.

Martinez: Right. Their updated statement is that it still comes down to credit scores and credit histories; that if I had that data, it would explain the disparities. But when the CFPB, when the government, looked at that particular metric, it found that even holding credit scores constant, people of color are still denied at higher rates than their white counterparts. So credit scores do not fully explain the imbalances.

Brancaccio: You have been studying this for many years. Do you have a sense of what is causing it? Is it loan officers who are personally biased? Is it something else? Is it many factors?

Martinez: There are many factors. There is certainly a human element in all of this, but more and more, lending is controlled by algorithms. Loan officers collect all of a potential borrower’s financial characteristics and then enter them into the algorithm. But when you consider things like wealth and assets, research shows that white families have eight times the wealth of Black families. So when you use that as a barrier to entry, it will disproportionately affect Black families and other families of color. Another variable that Freddie’s and Fannie’s algorithms weigh, and that has a disproportionate effect, is the gig economy being cited as a primary source of income. People of color are more likely to report that their main source of income is the gig economy. And lenders don’t like that; they like a steady income. That is another factor that disproportionately affects people of color.
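Martinez’s point about wealth acting as a barrier to entry can be made concrete with a toy example. The cutoff and asset figures below are entirely hypothetical, not drawn from any lender’s actual criteria; the sketch only shows how a facially neutral asset threshold passes two groups at different rates when their wealth distributions differ:

```python
# Toy illustration of disparate impact: the same asset cutoff applied to two
# groups with different (hypothetical) wealth distributions.

ASSET_CUTOFF = 20_000  # hypothetical minimum-assets requirement, in dollars

# Hypothetical applicant asset levels reflecting a wealth gap.
white_assets = [5_000, 15_000, 25_000, 40_000, 80_000, 120_000]
black_assets = [1_000, 3_000, 8_000, 12_000, 25_000, 40_000]

def pass_rate(assets: list[int], cutoff: int) -> float:
    """Share of applicants who clear the asset cutoff."""
    return sum(a >= cutoff for a in assets) / len(assets)

print(f"White pass rate: {pass_rate(white_assets, ASSET_CUTOFF):.0%}")  # 67%
print(f"Black pass rate: {pass_rate(black_assets, ASSET_CUTOFF):.0%}")  # 33%
```

The rule itself never mentions race, yet it filters the two groups at very different rates, which is the structural mechanism advocates describe.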

Brancaccio: Some critics of the system say this is structural racism: the way algorithms are designed and deployed can exacerbate long-standing inequalities.

Martinez: Exactly. For lenders, everything revolves around risk, and they do not want to lend to risky borrowers. And here is the philosophical debate advocates are trying to raise: risk should not rule everything; the conversation should be more nuanced.

Brancaccio: And proponents of change are not suggesting lenders give up the notion of risk. Aren’t they trying to find innovative ways of assessing who is likely to repay and who is not, without factors that play on historical discrimination and historical bias?

Martinez: Yes, and I think this is an important point to grasp. They are not saying we should throw risk out; they are saying we should take a more nuanced view. For example, if someone pays their rent consistently and on time, wouldn’t that also be a signal that the person is likely to pay off a mortgage? But rent is not something many of these algorithms considered until recently. As we were reporting this story, Fannie announced that it is starting to consider rent payment history as part of its decision-making process, beginning next month, in September. That is the kind of nuanced conversation about looking at more variables that are fairer to people.

Brancaccio: So, some room for change. But Fannie Mae, for all its power, still has a tradition of using the so-called classic FICO credit score. That has been around for a long time, and it is a little unclear what exactly goes into your credit score. They don’t tell you.

Martinez: Yes, there is a lot of mystery around credit scores. For example, the credit score that is available to you and me through a banking app or a credit bureau will not necessarily be the one used when a lender decides to approve or deny your mortgage. There is a different formula used to calculate your mortgage score. And Freddie and Fannie have stuck with an algorithm from 15 years ago, even though fairer algorithms exist. The government, Congress, advocates, and even FICO itself have tried to convince Fannie and Freddie to use fairer credit scoring algorithms. FICO has new, updated credit scoring models and recommended that Freddie and Fannie use them, but Freddie and Fannie have resisted.

Brancaccio: But the reason we don’t really know what goes into a credit score, the companies say, is that they worry we borrowers would start gaming the system and spoil its predictive quality.

Martinez: Exactly. Even with the algorithms Freddie and Fannie use, the decisions are recorded and collected by the government but are not subject to disclosure. So I can’t see what decisions these algorithms are making. And the reason Freddie and Fannie gave the government for keeping [the public data] secret is that they didn’t want anyone to reverse-engineer the algorithms. So there is a lot of mystery around these things. There is some information about how FICO builds its algorithm and what weight income versus credit or debt carries, but we know it only at a very superficial level. The public does not have detailed information on how it all works.

