Fair Appraisals Matter—To the White House, and to Us

Have you heard about PAVE? Most people haven’t. It stands for Property Appraisal and Valuation Equity.

When it comes to homeownership, fair appraisals are not just a matter of numbers—they’re a cornerstone of equity and financial prosperity. Enter PAVE. This initiative, championed by the White House, is more than just an acronym; it’s a pivotal step toward addressing long-standing biases in home valuations that disproportionately affect Black and Brown communities. While many are still unaware of PAVE, its impact could reshape the landscape of real estate fairness. Two years after its introduction, it’s crucial to examine the Interagency Task Force’s progress and understand why fair home appraisals are not just beneficial but essential for a more equitable society.

Why? We believe fair home appraisals can help promote good selling and buying experiences for more people. In short, PAVE is good for deed holders, good for title transactions, and helpful for ensuring fairness when deeds change hands.

Appraisal Bias Contributes to a “Sprawling Racial Wealth Gap”

Everyone who buys a home should get a fair shot at building wealth through homeownership. That doesn’t seem controversial. But here’s the thing.

“For far too long,” writes the White House, “bias in home valuations has limited the ability of Black and brown families to enjoy the financial returns associated with homeownership, thereby contributing to the already sprawling racial wealth gap.”

Lip service isn’t enough. California’s efforts to ensure fair appraisals, both within the state and nationwide, have pushed this point.

So the Biden administration introduced its plan to promote fair appraisals. Before checking in on that plan, let’s take a look at what happens when the government takes no action.

What Appraisal Discrimination Looks Like: The Case of Connolly v. Lanham

Earlier this year, the federal government weighed in on a lawsuit alleging the discriminatory use of an appraisal in violation of federal law. The lawsuit, Connolly v. Lanham, was filed in the U.S. District Court for the District of Maryland.

Here’s the gist of it. A Black couple applied to refinance their Baltimore home, hoping to benefit from the super-low mortgage interest rates of 2021. The loan was denied, based on a lowball appraised value. The owners were forced to try a new mortgage lender. For the second appraisal, one of their white colleagues met the appraiser on their behalf. That appraisal reflected the home’s appropriate and expected value.

Lending institutions must treat applicants with respect, professionalism, and fairness, no matter their race, ethnicity, country of origin, sex or sexual orientation, or family status. Lenders have a duty to change course if they have disproportionately withheld loans for reasons that aren’t relevant to the ability to repay the loans. And the federal government is making sure this duty is met.

Now, Lenders Must Heed the Federal PAVE Action Plan

What headway has the PAVE initiative made so far?

  • Transparency. Homeowners have new tools to check up on, and address, biased appraisals. Now, the Federal Housing Administration (FHA), Fannie Mae, and Freddie Mac are publishing their appraisal data. An open data system lets homeowners, appraisers, and researchers investigate appraisal results. People whose homes are appraised, take note: you’ll have channels to question a lower-than-expected valuation.
  • Dismantling barriers. The federal government is taking action to make the appraisal profession look like the population. It’s asking the states to facilitate entry for people who’d like to become appraisers. Requiring college degrees, for example, doesn’t produce better appraisals. But it does make it harder to recruit new professionals from minority backgrounds.
  • Trend spotting. The administration is using data analysis to find patterns of appraisal bias so it can address recurring issues. This means the federal government has a new use for artificial intelligence (AI) tech.

We live in a digital age now. Companies rely on algorithms to underwrite loans. The software will be under scrutiny, the administration has indicated.

What About the Software? Bias In, Bias Out

Automation is becoming prevalent in appraisals and loan underwriting. Now, a half-dozen agencies are collaborating on a proposed rule regarding Automated Valuation Models (AVMs). These are the software systems that evaluate the worth of properties. Under the proposed rule, lenders’ systems would undergo random checks.
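To make the “bias in, bias out” idea concrete, here is a minimal sketch in Python, assuming an AVM is, at its simplest, a model trained on past appraisal data. Everything in it is hypothetical and invented for illustration, not any lender’s or agency’s actual system: it shows how a model trained on historically marked-down appraisals quietly reproduces the same markdown.

```python
# A minimal, purely illustrative sketch of "bias in, bias out" for an
# automated valuation model (AVM). This is NOT any lender's or agency's
# actual model; the features and numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical inputs: square footage and a flag for a historically
# redlined neighborhood.
sqft = rng.uniform(1_000, 3_000, n)
flagged_area = rng.integers(0, 2, n)  # 1 = historically redlined

# The home's "true" value depends only on the home itself...
true_value = 150 * sqft + rng.normal(0, 10_000, n)

# ...but the historical appraisals used as training labels carry a 15%
# markdown in the redlined neighborhood. That is the bias going in.
past_appraisals = true_value * np.where(flagged_area == 1, 0.85, 1.0)

# Train an ordinary least-squares model on the biased labels.
X = np.column_stack([sqft, flagged_area, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, past_appraisals, rcond=None)

# Ask the model to value the exact same 2,000 sq ft house in each area.
same_house = np.array([[2_000, 0, 1], [2_000, 1, 1]])
outside, inside = same_house @ coef
print(f"Model's value outside the flagged area: ${outside:,.0f}")
print(f"Model's value inside the flagged area:  ${inside:,.0f}")
# The second number comes out roughly 15% lower: the bias comes back out.
```

This is the kind of pattern that quality checks on lenders’ automated systems would be looking for.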

In late October, President Biden issued an Executive Order on artificial intelligence, announcing new standards for safety, security, privacy, and civil rights protections. Building on prior work to protect people as AI advances, the new Executive Order:

  • Directs the Department of Justice and federal civil rights agencies to spot and address algorithmic discrimination.
  • Positions the Justice Department’s Civil Rights Division at the forefront of vetting AI and other innovations.

The Justice Department, along with the White House, affirms that technology is beneficial when used intentionally. Without well-channeled intent, artificial intelligence can repeat the biased patterns of the past.

In Austin, the DOJ Talks About Bias With the Lenders Themselves

This month, Assistant Attorney General Kristen Clarke, who leads the Justice Department’s Civil Rights Division, spoke on lending discrimination. The audience? A room full of lenders.

Clarke discussed the Justice Department’s Combating Redlining Initiative.

Read up on redlining — a highly harmful form of mortgage discrimination.

Clarke explained that the FHA actually drew red lines on maps to keep government financing off-limits to Black communities. The Fair Housing Act of 1968 came into being to redress that practice, but redlining persisted, she noted, in cities including Houston, Memphis, Philadelphia, Camden, Wilmington, Los Angeles, Columbus, and Tulsa. Even today, white households make up the vast majority of homeowners.

“And the gaps in homeownership rates,” said Clarke, “contribute to staggering differences in family wealth.”

Clarke went on to describe several of the Department’s recent lawsuits and settlements:

  • The DOJ pressed Ameris Bank to resolve an alleged pattern of redlining in Florida. “Ameris knew of its redlining risk in Black and Hispanic communities in Jacksonville for years,” said Clarke. Now, Ameris Bank has agreed to invest $9 million in real estate financing in these communities.
  • In January 2023, a redlining settlement with Los Angeles-based City National Bank resulted in a $31 million payment. The DOJ later settled for $9 million with Rhode Island’s Washington Trust Company to resolve similar allegations.

These cases can transform cities. Back in 2011, Midwest BankCentre had to resolve federal redlining allegations. It responded by opening a branch in a neighborhood that had no bank at all, and by working with minority-owned businesses. Today, Midwest runs five branches in minority neighborhoods, and small businesses there are thriving.

Fair Lending, in a Nutshell

The DOJ says fair lending means:

  • Ensuring that fair lending analyses reach every level of the lending institution.
  • Being proactive: looking for unserved communities that can reasonably be served.
  • Learning from, and engaging with, these communities.

We’ll give Attorney Kristen Clarke herself the final word:

You can reach historically marginalized communities and provide them new opportunities, all while increasing your institution’s overall lending activity – making this work a win-win for everyone.

Supporting References

U.S. Department of Justice, Office of Public Affairs via Justice.gov: Remarks by Assistant Attorney General Kristen Clarke at the Annual Community Reinvestment Act and Fair Lending Colloquium (Austin, Texas; Nov. 13, 2023; updated Nov. 14).

White House Fact Sheet via WhiteHouse.gov: Biden-Harris Administration Takes Sweeping Action to Address Racial Bias in Home Valuations (Jun. 1, 2023).

White House Fact Sheet via WhiteHouse.gov: President Biden Issues Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence (Oct. 30, 2023).

And as linked.

More on topics: Race and fairness in housing, Fair home appraisals

Photo credits: Greta Hoffman and Cottonbro Studio, via Pexels.