Brennan Johnson, left, and Trevor McIntosh, who secured a mortgage for their home through a digital lending platform, in front of their home in Wheat Ridge, Colo., August 18, 2020. (Benjamin Rasmussen/The New York Times)

In 2015, Melany Anderson’s 6-year-old daughter came home from a play date and asked her mother a heartbreaking question: Why did all her friends have their own bedrooms?

Anderson, 41, a pharmaceutical benefits consultant, was recently divorced, living with her parents in West Orange, New Jersey, and sharing a room with her daughter. She longed to buy a home, but the divorce had emptied her bank account and wrecked her credit. She was working hard to improve her financial profile, but she couldn’t imagine submitting herself to the scrutiny of a mortgage broker.

“I found the idea of going to a bank completely intimidating and impossible,” she said. “I was a divorced woman and a Black woman. And also being a contractor — I know it’s frowned upon, because it’s looked at as unstable. There were so many negatives against me.”

Then, last year, Anderson was checking her credit score online when a pop-up ad announced that she was eligible for a mortgage, listing several options. She ended up at a digital lending platform that promised to help her secure a mortgage without ever setting foot in a bank or, if she so desired, even talking to another human.

In the end, she estimated, she conducted about 70% of the mortgage application and approval process online. Her fees totaled $4,000, about half the national average. In November 2019, she and her daughter moved into a two-bedroom home not far from her parents with a modern kitchen, a deck and a backyard. “We adapted to the whole COVID thing in a much easier way than if we were still living with my parents,” Anderson said this summer. “We had a sense of calm, made our own rules.”

Getting a mortgage can be a harrowing experience for anyone, but for those who don’t fit the middle-of-last-century stereotype of homeownership — white, married, heterosexual — the stress is amplified by the heightened probability of getting an unfair deal. In 2019, African Americans were denied mortgages at a rate of 16% and Hispanics were denied at 11.6%, compared with just 7% for white Americans, according to data from the Consumer Financial Protection Bureau. An Iowa State University study published the same year found that LGBTQ couples were 73% more likely to be denied a mortgage than heterosexual couples with comparable financial credentials.

Digital mortgage websites and apps represent a potential improvement. Without showing their faces, prospective borrowers can upload their financial information, get a letter of preapproval, customize loan criteria (like the size of the down payment) and search for interest rates. Software processes the data and, if the numbers check out, approves a loan. Most of the companies offer customer service via phone or chat, and some require that applicants speak with a loan officer at least once. But often the process is fully automated.

Last year, 98% of mortgages originated by Quicken Loans, the country’s largest lender, used the company’s digital platform, Rocket Mortgage. Bank of America recently adopted its own digital platform. And so-called fintech startups like Roostify and Blend have licensed their software to some of the nation’s other large banks.

Reducing — or even removing — human brokers from the mortgage underwriting process could democratize the industry. From 2018 to 2019, Quicken reported a rise in first-time and millennial homebuyers. Last year, the company said, it saw significant increases in traditionally underrepresented homebuyers, including people of color, single women, LGBTQ couples and customers with student loan debt.

“Discrimination is definitely falling, and it corresponds to the rise in competition between fintech lenders and regular lenders,” said Nancy Wallace, chair in real estate capital markets at Berkeley’s Haas School of Business. A study that Wallace co-authored in 2019 found that fintech algorithms discriminated 40% less on average than face-to-face lenders in loan pricing and did not discriminate at all in accepting and rejecting loans.

If algorithmic lending does reduce discrimination in home lending in the long term, it would cut against a troubling trend of automated systems — such as AI-based hiring platforms and facial recognition software — that turn out to perpetuate bias. Faulty data sources, software engineers’ unfamiliarity with lending law, profit motives and industry conventions can all influence whether an algorithm picks up discrimination where humans leave off. Digital mortgage software is far from perfect; the Berkeley study found that fintech lenders still charged Black and Hispanic borrowers higher interest rates than white borrowers. (Lending law requires mortgage brokers to collect borrowers’ race as a way to identify possible discrimination.)

“The differential is smaller,” Wallace said. “But it should be zero.”

The persistence of gatekeepers

The company started in 2016 and is licensed to underwrite mortgages in 44 states. This year, it has underwritten about 40,000 mortgages and funds roughly $2.5 billion in loans each month. After a COVID-19 slump in the spring, its loan volume for June was five times what it was a year ago.

With $270 million in venture funding, the company generates revenue by selling mortgages to about 30 investors in the secondary loan market, like Fannie Mae and Wells Fargo. The company attracts customers as it did Anderson: buying leads from sites like Credit Karma and NerdWallet and then marketing to those customers through ads and targeted emails.

In 2019, the company saw a 532% increase in Hispanic clients between the ages of 30 and 40 and a 411% increase in African Americans in the same age bracket. Its married LGBTQ client base increased tenfold. “With a traditional mortgage, customers feel really powerless,” said Sarah Pierce, the company’s head of operations. “You’ve found a home you love, and you’ve found a rate that’s good, and somebody else is making the judgment. They’re the gatekeeper or roadblock to accessing financing.” Of course, the company is making a judgment too, but it’s a numerical one. There’s no gut reaction based on a borrower’s skin color or whether they live with a same-sex partner.

Trevor McIntosh, 35, and Brennan Johnson, 31, secured a mortgage for their Wheat Ridge, Colorado, home through the platform in 2018. “We’re both millennials and we need to immediately go online for anything,” said Johnson, a data analyst. “It seemed more modern and progressive, especially with the tech behind it.”

Previously, the couple had negative home buying experiences. One homeowner, they said, outright refused to sell to them. A loan officer also dropped a bunch of surprise fees just before closing. The couple wasn’t sure whether prejudice — unconscious or otherwise — was to blame, but they couldn’t rule it out. “Trevor and I have experienced discrimination in a variety of forms in the past, and it becomes ingrained in your psyche when interacting with any institution,” said Johnson. “So starting with digital, it seemed like fewer obstacles, at least the ones we were afraid of, like human bias.” (The company introduced me to Anderson, McIntosh and Johnson, and I interviewed them independently.)

Digital lenders say that they assess risk using the same financial criteria as traditional banks: borrower income, assets, credit score, debt, liabilities, cash reserves and the like. These guidelines were laid out by the Consumer Financial Protection Bureau after the last recession to protect consumers against predatory lending or risky products.

These lenders could theoretically use additional variables to assess whether borrowers can repay a loan, such as rental or utility payment history, or even assets held by extended family. But generally, they don’t. To fund their loans, they rely on the secondary mortgage market, which includes the government-backed entities Freddie Mac and Fannie Mae, and which became more conservative after the 2008 crash. With some exceptions, if you don’t meet the standard CFPB criteria, you are likely to be considered a risk.

Fair housing advocates say that’s a problem, because the standard financial information puts minorities at a disadvantage. Take credit scores — a number between 300 and 850 that assesses how likely a person is to repay a loan on time. Credit scores are calculated based on a person’s spending and payment habits. But landlords often don’t report rental payments to credit bureaus, even though these are the largest payments that millions of people make on a regular basis, including more than half of Black Americans.

For mortgage lending, most banks rely on the credit scoring model invented by the Fair Isaac Corp., or FICO. Newer FICO models can include rental payment history, but the secondary mortgage market doesn’t require them. Neither does the Federal Housing Administration, which specializes in loans for low- and moderate-income borrowers. What’s more, systemic inequality has created significant salary disparities between Black and white Americans.

“We know the wealth gap is incredibly large between white households and households of color,” said Alanna McCargo, the vice president of housing finance policy at the Urban Institute. “If you are looking at income, assets and credit — your three drivers — you are excluding millions of potential Black, Latino and, in some cases, Asian minorities and immigrants from getting access to credit through your system. You are perpetuating the wealth gap.”

For now, many fintech lenders have largely affluent customers. The company’s average client earns over $160,000 a year and has a FICO score of 773. As of 2017, the median household income among Black Americans was just over $38,000, and only 20.6% of Black households had a credit score above 700, according to the Urban Institute. This discrepancy makes it harder for fintech companies to boast about improving access for the most underrepresented borrowers.

Ghost in the machine

Software has the potential to reduce lending disparities by processing enormous amounts of personal information — far more than the CFPB guidelines require. Looking more holistically at a person’s financials as well as their spending habits and preferences, banks can make a more nuanced decision about who is likely to repay their loan. On the other hand, broadening the data set could introduce more bias. How to navigate this quandary, said McCargo, is “the big AI machine learning issue of our time.”

Under the Fair Housing Act of 1968, lenders cannot consider race, religion, sex or marital status in mortgage underwriting. But many factors that appear neutral could double for race. “How quickly you pay your bills, or where you took vacations, or where you shop or your social media profile — some large number of those variables are proxying for things that are protected,” Wallace said.

She said she didn’t know how often fintech lenders ventured into such territory, but it happens. She knew of one company whose platform used the high schools clients attended as a variable to forecast consumers’ long-term income. “If that had implications in terms of race,” she said, “you could litigate, and you’d win.”
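The proxy effect Wallace describes can be shown in a toy simulation (this sketch is illustrative only and is not from the article; the feature, groups and numbers are all invented). A pricing rule that never sees the protected attribute still quotes one group higher rates, because a seemingly neutral variable is correlated with group membership:

```python
import random

random.seed(0)

def simulate_borrower(group):
    # "Neutral" feature: a price index for the stores where the borrower
    # shops. In this invented world it is shifted by group membership,
    # so it quietly acts as a proxy for the protected attribute.
    store_price_index = random.gauss(100 + 15 * group, 5)
    return {"group": group, "store_price_index": store_price_index}

def quoted_rate(borrower):
    # The pricing rule never reads borrower["group"] -- only the proxy.
    return 3.0 + 0.02 * (borrower["store_price_index"] - 100)

borrowers = [simulate_borrower(g) for g in (0, 1) for _ in range(5000)]

def mean_rate(group):
    rates = [quoted_rate(b) for b in borrowers if b["group"] == group]
    return sum(rates) / len(rates)

# Group 1 ends up quoted roughly 0.3 percentage points more on average,
# even though the rule is formally blind to group membership.
print(f"group 0 mean rate: {mean_rate(0):.2f}%")
print(f"group 1 mean rate: {mean_rate(1):.2f}%")
```

This is the litigation risk Wallace points to: the model's inputs are facially neutral, but its outputs differ systematically by protected class.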

Lisa Rice, the president and chief executive of the National Fair Housing Alliance, said she was skeptical when mortgage lenders said their algorithms considered only federally sanctioned variables like credit score, income and assets. “Data scientists will say, if you’ve got 1,000 bits of information going into an algorithm, you’re not possibly only looking at three things,” she said. “If the objective is to predict how well this person will perform on a loan and to maximize profit, the algorithm is looking at every single piece of data to achieve those objectives.”

Fintech startups and the banks that use their software dispute this. “The use of creepy data is not something we consider as a business,” said Mike de Vere, the chief executive of Zest AI, a startup that helps lenders create credit models. “Social media or educational background? Oh, lord no. You shouldn’t have to go to Harvard to get a good interest rate.”

In 2019, ZestFinance, an earlier iteration of Zest AI, was named a defendant in a class-action lawsuit accusing it of evading payday lending regulations. In February, Douglas Merrill, the former chief executive of ZestFinance, and his co-defendant, BlueChip Financial, a North Dakota lender, settled for $18.5 million. Merrill denied wrongdoing, according to the settlement, and no longer has any affiliation with Zest AI. Fair housing advocates say they are cautiously optimistic about the company’s current mission: to look more holistically at a person’s trustworthiness, while simultaneously reducing bias.

By entering many more data points into a credit model, Zest AI can observe millions of interactions between these data points and how those relationships might inject bias into a credit score. For instance, if a person is charged more for an auto loan — which Black Americans often are, according to a 2018 study by the National Fair Housing Alliance — they could be charged more for a mortgage.

“The algorithm doesn’t say, ‘Let’s overcharge Lisa because of discrimination,’” said Rice. “It says, ‘If she’ll pay more for auto loans, she’ll very likely pay more for mortgage loans.’”

Zest AI says its system can pinpoint these relationships and then “tune down” the influences of the offending variables. Freddie Mac is currently evaluating the startup’s software in trials.
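Zest AI’s actual method is proprietary; one generic way to picture “tuning down” an offending variable (a sketch under that assumption, with invented data) is to shrink the variable’s weight in the pricing rule until the between-group gap in quoted rates falls under a tolerance, trading some predictive signal for fairness:

```python
import random

random.seed(1)

# Toy data: one protected attribute and one correlated "neutral" feature.
borrowers = []
for group in (0, 1):
    for _ in range(5000):
        feature = random.gauss(100 + 15 * group, 5)
        borrowers.append({"group": group, "feature": feature})

def rate(borrower, weight):
    # `weight` controls how strongly the proxy feature moves the rate.
    return 3.0 + weight * (borrower["feature"] - 100)

def group_gap(weight):
    # Difference in mean quoted rate between the two groups.
    means = {}
    for g in (0, 1):
        rates = [rate(b, weight) for b in borrowers if b["group"] == g]
        means[g] = sum(rates) / len(rates)
    return means[1] - means[0]

# "Tune down": halve the feature's weight until the disparity is small.
weight = 0.02
while group_gap(weight) > 0.05:
    weight *= 0.5

print(f"final weight: {weight:.4f}, remaining gap: {group_gap(weight):.3f}")
```

Real systems use far more sophisticated machinery than a single halved coefficient, but the trade-off is the same one the article describes: less influence for the offending variable, smaller disparity in the output.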

Fair housing advocates worry that a proposed rule from the Department of Housing and Urban Development could discourage lenders from adopting anti-bias measures. A cornerstone of the Fair Housing Act is the concept of “disparate impact,” which says lending policies without a business necessity cannot have a negative or “disparate” impact on a protected group. HUD’s proposed rule could make it much harder to prove disparate impact, especially stemming from algorithmic bias, in court.

“It creates huge loopholes that would make the use of discriminatory algorithmic-based systems legal,” Rice said.

HUD says that its proposed rule aligns the disparate impact standard with a 2015 Supreme Court ruling and that it does not give algorithms greater latitude to discriminate.

A year ago, the corporate lending community, including the Mortgage Bankers Association, supported HUD’s proposed rule. After COVID-19 and Black Lives Matter forced a national reckoning on race, the association and many of its members wrote new letters expressing concern.

“Our colleagues in the lending industry understand that disparate impact is one of the most effective civil rights tools for addressing systemic and structural racism and inequality,” Rice said. “They don’t want to be responsible for ending that.”

The proposed HUD rule on disparate impact is expected to be published this month and go into effect shortly thereafter.

‘Humans are the ultimate black box’

Many loan officers, of course, do their work equitably, Rice said. “Humans understand how bias is working,” she said. “There are so many examples of loan officers who make the right decisions and know how to work the system to get that borrower who really is qualified through the door.”

But as Zest AI’s former executive vice president, Kareem Saleh, put it, “Humans are the ultimate black box.” Intentionally or unintentionally, they discriminate. When the National Community Reinvestment Coalition sent Black and white “mystery shoppers” to apply for Paycheck Protection Program funds at 17 different banks, including community lenders, Black shoppers with better financial profiles frequently received worse treatment.

Since many clients still choose to talk with a loan officer, the company says it has prioritized staff diversity. Half of its employees are female, 54% identify as people of color and most loan officers are in their 20s, compared with the industry average age of 54. Unlike many of their competitors, the loan officers don’t work on commission. They say this eliminates a conflict of interest: When they tell you how much house you can afford, they have no incentive to sell you the most expensive loan.

These are positive steps. But fair housing advocates say government regulators and banks in the secondary mortgage market must rethink risk assessment: accept alternative credit scoring models, consider factors like rental payment history and ferret out algorithmic bias. “What lenders need is for Fannie Mae and Freddie Mac to come out with clear guidance on what they will accept,” McCargo said.

For now, digital mortgages might be less about systemic change than borrowers’ peace of mind. Anderson in New Jersey said that police violence against Black Americans this summer had deepened her pessimism about receiving equal treatment.

“Walking into a bank now,” she said, “I would have the same apprehension — if not more than ever.”

This article originally appeared in The New York Times.

© 2020 The New York Times Company
