In conjunction with the Board of Governors of the Federal Reserve System, Federal Deposit Insurance Corporation, Federal Housing Finance Agency, National Credit Union Administration, and Office of the Comptroller of the Currency, the CFPB is proposing a rule that would, if finalized, ensure that automated home valuations are fair and nondiscriminatory.
Algorithmic appraisals that use so-called automated valuation models can be used as a check on a human appraiser or in place of an appraisal. Unlike an appraisal or broker price opinion, in which a person examines the property and assesses the comparability of other sales, automated valuations rely on mathematical formulas and number-crunching machines to produce an estimate of value.
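As a rough illustration of the "mathematical formula" at the heart of such a model, here is a deliberately simplified toy sketch. Real automated valuation models use far richer data and statistical methods; the function name and every number below are hypothetical, chosen only to show the mechanics of estimating a value from comparable sales.

```python
def avm_estimate(subject_sqft, comparable_sales):
    """Toy AVM: estimate value from the median price per square foot
    of recent comparable sales. Purely illustrative, not a real model."""
    ppsf = sorted(price / sqft for price, sqft in comparable_sales)
    mid = len(ppsf) // 2
    median_ppsf = ppsf[mid] if len(ppsf) % 2 else (ppsf[mid - 1] + ppsf[mid]) / 2
    return subject_sqft * median_ppsf

# Hypothetical recent sales: (sale price, square footage)
comps = [(300_000, 1_500), (330_000, 1_600), (285_000, 1_450)]
print(avm_estimate(1_550, comps))  # → 310000.0
```

The point of the sketch is that no human ever looks at the subject property: the estimate is entirely a function of the data fed in.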
While machines crunching numbers might seem capable of taking human bias out of the equation, they can’t. Based on the data they are fed and the algorithms they use, automated models can embed the very human bias they are meant to correct. And the design and development of the models and algorithms can reflect the biases and blind spots of the developers. Indeed, automated valuation models can make bias harder to eradicate in home valuations because the algorithms used cloak the biased inputs and design in a false mantle of objectivity.
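The mechanism by which biased data passes through an "objective" model can be made concrete with a hypothetical example. Suppose two neighborhoods have physically identical housing stock, but past sales in one reflect historical discrimination; a model trained on those sales simply reproduces the gap. All figures below are invented for illustration.

```python
def median_ppsf(sales):
    """Median price per square foot across a list of (price, sqft) sales."""
    ppsf = sorted(price / sqft for price, sqft in sales)
    mid = len(ppsf) // 2
    return ppsf[mid] if len(ppsf) % 2 else (ppsf[mid - 1] + ppsf[mid]) / 2

# Hypothetical sales in two neighborhoods with identical homes;
# neighborhood B's past sale prices are systematically depressed.
sales_a = [(320_000, 1_600), (310_000, 1_550), (330_000, 1_650)]
sales_b = [(256_000, 1_600), (248_000, 1_550), (264_000, 1_650)]

# The same 1,600 sq ft home gets two different "objective" estimates:
print(1_600 * median_ppsf(sales_a))  # → 320000.0
print(1_600 * median_ppsf(sales_b))  # → 256000.0
```

Nothing in the arithmetic is discriminatory on its face, which is exactly why the bias baked into the inputs can be hard to see from the outside.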
Inaccurate or biased algorithms can lead to serious harm. A home valued too high can lock a homeowner into an unaffordable mortgage and increase the risk of foreclosure. A home valued too low can deprive homeowners of access to their equity and limit the mobility of sellers. Beyond harming individual homeowners, systemic biases in valuations, whether too low or too high, hurt neighborhoods, distort the housing market, and erode the tax base. When it comes to buying or selling a home, we all need and deserve fair and nondiscriminatory home valuations.
The proposed rule would, if finalized, create basic safeguards to mitigate the risks associated with automated valuation models. Covered institutions that employ these models to help make home value decisions would have to take steps to boost confidence in valuation estimates and protect against data manipulation. The proposed rule would also require companies to have policies and processes in place to avoid conflicts of interest, to conduct random sample testing and reviews, and to comply with nondiscrimination laws.
This proposal complements recent work and guidance by the CFPB.