Algorithms, artificial intelligence, and fairness in home appraisals

Today, the CFPB is taking another step toward accountability for automated systems and models, sometimes marketed as artificial intelligence (AI). The CFPB is proposing a rule to make home appraisals computed by algorithms fairer and more accurate. This initiative is one of many steps we are taking to ensure that algorithms and AI comply with existing law.

In conjunction with the Board of Governors of the Federal Reserve System, Federal Deposit Insurance Corporation, Federal Housing Finance Agency, National Credit Union Administration, and Office of the Comptroller of the Currency, the CFPB is proposing a rule that would, if finalized, ensure that automated home valuations are fair and nondiscriminatory. The proposed rule’s safeguards are not a panacea, but represent a recognition of the risks posed by algorithmic appraisals. The proposed rule supports our work as a member of the Interagency Task Force on Property Appraisal and Valuation Equity (PAVE) to promote accurate appraisals, rather than ones computed by models where bias is baked into the equation.

Algorithmic appraisals that use so-called automated valuation models can serve as a check on a human appraiser or take the place of an appraisal entirely. Unlike an appraisal or broker price opinion, where an individual person looks at the property and assesses the comparability of other sales, automated valuations rely on mathematical formulas and number-crunching machines to produce an estimate of value.
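To make the contrast concrete, here is a minimal sketch of the kind of formula an automated valuation model might apply. This is purely illustrative, not any lender's or agency's actual model; real AVMs use far richer data and statistical methods, and the comparable sales, square footages, and adjustment logic below are invented for the example.

```python
# Illustrative sketch of a simple automated valuation model (AVM).
# The comparables and the price-per-square-foot adjustment are hypothetical.

def estimate_value(subject_sqft, comps):
    """Estimate a home's value from comparable sales.

    comps: list of (sale_price, sqft) tuples for recently sold,
    similar properties. Each comp's price is scaled by the ratio of
    the subject's square footage to the comp's, then averaged.
    """
    if not comps:
        raise ValueError("need at least one comparable sale")
    adjusted = [price * (subject_sqft / sqft) for price, sqft in comps]
    return sum(adjusted) / len(adjusted)

# Example: a 2,000 sq ft home and three hypothetical comparable sales.
comps = [(300_000, 1_900), (330_000, 2_100), (315_000, 2_000)]
print(round(estimate_value(2_000, comps)))  # -> 315025
```

No person ever looks at the property: the estimate is entirely a function of the data fed in, which is exactly why the quality and fairness of that data matter.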

While machines crunching numbers might seem capable of taking human bias out of the equation, they can’t. Based on the data they are fed and the algorithms they use, automated models can embed the very human bias they are meant to correct. And the design and development of the models and algorithms can reflect the biases and blind spots of the developers. Indeed, automated valuation models can make bias harder to eradicate in home valuations because the algorithms used cloak the biased inputs and design in a false mantle of objectivity.

Inaccurate or biased algorithms can lead to serious harm. A home valued too high can lock a homeowner into an unaffordable mortgage and increase the risk of foreclosure. A home valued too low can deprive homeowners of access to their equity and limit the mobility of sellers. In addition to harming homeowners, systemic biases in valuations, either too low or too high, hurt neighborhoods, distort the housing market, and impact the tax base. When it comes to buying or selling a home, we all need and deserve fair and nondiscriminatory home valuations.

The proposed rule would, if finalized, create basic safeguards to mitigate the risks associated with automated valuation models. Covered institutions that employ these models to help make home value decisions would have to take steps to boost confidence in valuation estimates and protect against data manipulation. The proposed rule would also require companies to have policies and processes in place to avoid conflicts of interest, to conduct random sample testing and reviews, and to comply with nondiscrimination laws.

This proposal complements recent work by the CFPB to stay ahead of the proliferation of AI-marketed technologies that can result in discriminatory outcomes and threaten families' financial stability. For example, we have issued guidance making clear that a lender cannot use an AI model if the lender does not understand how the model generates decisions and, as a result, cannot specify the reasons why an applicant received an adverse decision, as existing federal law requires. Similarly, we joined with other federal agencies earlier this year to make clear that automated systems and AI models must be developed and used in accordance with existing federal laws and to highlight the potentially harmful uses of automated systems.

Emerging AI-marketed technologies can negatively impact civil rights, fair competition, and consumer protection. Because technology has the power to transform our lives, we must ensure that AI-marketed technologies do not become an excuse for evasion of the law and consumer harm. It is critical that these emerging technologies comply with the law.

We will continue to monitor the development and use of automated systems, and work closely with our federal and state partners to enforce federal consumer financial protection laws. This includes protecting the rights of American homeowners, buyers, and sellers, regardless of whether legal violations occur by a human or by a machine.

Read the proposed rule, Quality Control Standards for Automated Valuation Models.

Comments must be received within 60 days after publication of the proposed rule in the Federal Register.
