Thank you all for being here this morning. The cost of housing is far too high for families across the country, and renters, in particular, are bearing the brunt.
Today’s renters increasingly must deal with large corporate property managers working for private equity funds, rather than local landlords living in the community. Corporate investor ownership has surged, climbing to more than 45 percent of rental units.
Renters don’t just face high rents; they’re also getting hit with junk fees and other aggressive tactics. Research suggests that corporate investor owners, like private equity, are more likely to evict, even when controlling for other factors.
These entities are also more likely to deploy new technologies, including artificial intelligence and social scoring. AI has been creeping into the rental process and can lead to rent hikes and denial of housing. The social scoring of these algorithms produces what one company calls “high-quality tenants” – an opaque term that may keep out families that could afford the rent every month but may not fit the algorithm’s definition.
Consumer complaints, studies, and other data suggest tenants are not given sufficient opportunity to correct inaccurate information. Yet the law requires companies that use a consumer report, including corporate landlords and their property managers, to let people know when they use that report to make an adverse decision, like denying housing or increasing rent.
Together with the Federal Trade Commission, we have launched an inquiry asking the public to tell us about their experiences with tenant screening. Consumers have told us about instances of mistaken identity, incorrect criminal records, and false eviction information on background checks used for housing and employment.
A single mother and recent widow told us that after initially being selected for a unit, packing up, telling her current landlord she was leaving, and just waiting for a move-in date, the rug was pulled out from under her. Her landlord pulled a report showing two evictions. One purported eviction was in a state where she had never even lived, and the other was an eviction case that had been dismissed. She and her family are at risk of homelessness because of false information and junk data.
The CFPB is charged with administering the Fair Credit Reporting Act, which governs many of the uses of artificial intelligence, social scoring, and algorithms for decisions on housing, employment, and more. Both landlords and property managers, along with the companies providing them with information on tenants, have obligations to tenants under the law.
When these algorithms produce thumbs up or thumbs down decisions, without a second look, without applicant review, and without consideration of what information is true and what is false, the landlord, property manager, or tenant screening company may be breaking the law.
We are on the lookout for inaccurate AI and illegal practices that lead to junk data. For example, relying on name matching alone is illegal because it is especially likely to result in inaccurate information. People with common last names are especially likely to be at risk from so-called “name-only” matching. There is no exemption in the Fair Credit Reporting Act that allows companies to break the law because their AI or other technology doesn’t work.
In particular, as the Federal Housing Finance Agency, the U.S. Department of Housing and Urban Development, and the U.S. Department of Agriculture will make clear this week, landlords and property managers must let tenants know when they use screening reports to deny housing or raise fees or rent. Prospective tenants have a right to know this and to challenge false information.
Regulators cannot tolerate corporate actors using opaque AI fed by inaccurate information to deny or gouge tenants on rental housing.
The CFPB is pleased to be part of this effort. Thank you.