The CFPB recently released a new proposed rule — Protecting Americans from Harmful Data Broker Practices — which would modernize the implementing regulations for the Fair Credit Reporting Act (FCRA) and significantly reshape the financial data economy.

As an unapologetic FCRA nerd, I find this a very exciting development, so I am going to indulge my FCRA nerdery by dissecting the key takeaways from the proposed rule.

And, as is my habit, I will do so by posing a series of questions to myself.

First of all, what is the FCRA?

The Fair Credit Reporting Act, one of the first data privacy laws in the world, was enacted by Congress in 1970. It was the result of a decade-long freakout by Congress over the growth and consolidation of the credit reporting industry in the U.S. and the scary prospects of that industry becoming digitized.

Retail Credit Company, Atlanta, Georgia, 1949.

The FCRA established a framework for governing how Consumer Reporting Agencies (i.e., credit bureaus) collected, shared, and managed consumers’ credit reports. That framework had four core pillars: (1) a bright-line prohibition on using or disseminating consumer reports unless for one of the limited permissible purposes identified by Congress; (2) a requirement that consumer reporting agencies follow reasonable procedures to assure the maximum possible accuracy of consumer reports; (3) a consumer right to dispute inaccurate or incomplete information and have it corrected; and (4) a consumer right to see the information that a consumer reporting agency possesses about the consumer.

The FCRA has been amended a number of times over the last 50 years, but some folks (including, notably, Rohit Chopra) feel that the way the FCRA has been implemented is still insufficient for addressing the concerns posed by the modern data broker industry (especially in the absence of a more general federal data privacy law in the U.S.).

At a high level, what would the CFPB’s proposed rule do?

The rule would significantly widen the scope of the FCRA, sweeping up many data brokers that don’t currently consider themselves to be Consumer Reporting Agencies (CRAs). It would do this by treating as CRAs all companies that provide information about a consumer’s credit history, credit score, debt payments, or income or financial tier, regardless of the purpose for which the information is used or expected to be used.

The rule would further specify that personal identifiers collected by CRAs to prepare a consumer report, such as names, addresses, dates of birth, and Social Security numbers (what industry stakeholders commonly refer to as “credit header data”), would constitute consumer reports by themselves.

The rule would crack down on the sale of consumer reports for any reason that isn’t one of the permissible purposes defined by the FCRA (e.g., credit and insurance risk decisioning, employment screening).

And the rule would require CRAs (or their customers) to provide clear and conspicuous disclosures and to obtain explicit consumer consent before sharing consumers’ data with a third party. It would also set limits on how long that consent persists and give consumers the ability to revoke it at any time.

How would this impact the credit bureaus?

Over the last couple of decades, the credit bureaus have built a bifurcated product set on top of a common dataset. One group of products is FCRA-compliant; these are used for evaluating consumers for credit and other permissible purposes under the FCRA. The other group of products is not FCRA-compliant; even though they are based on the same core dataset, they are sold for non-permissible purpose use cases, such as marketing and advertising.

The CFPB’s proposed rule would severely constrain CRAs’ ability to sell consumer report data for non-permissible purpose use cases. As the bureau notes, this is by design:

The CFPB is concerned that consumer reporting agencies are monetizing consumer report information for use in marketing in ways that the FCRA prohibits. As noted, marketing and advertising generally are not permissible purposes for furnishing or obtaining consumer reports. Nevertheless, as technology has advanced, consumer reporting agencies have begun to employ techniques and business models designed to evade this restriction. The proposed rule would address these developments and would emphasize that the FCRA’s legitimate business need permissible purpose does not authorize consumer reporting agencies to furnish consumer reports to users for solicitation or marketing purposes.  

This would obviously be bad for the credit bureaus, which make quite a bit of money by repackaging their data for marketers. It would also create some significant challenges for large data brokers that currently operate primarily outside the FCRA sphere, like Epsilon and Acxiom.

However, I do think the CFPB has a point about the over-monetization of consumer report information for marketing.

There is a scam I have been hearing about a lot recently called an advance fee scam. It’s a new twist on an old idea. A consumer recently rejected for a loan (say, from Wells Fargo) is contacted by someone claiming to work for another lender. The caller will say something like, “I see you were rejected for a personal loan from Wells Fargo. We can approve you for a loan at those same terms. We just need to collect a small loan application fee to finalize the loan and get you your funds.” The scammer then directs the consumer to purchase a prepaid debit card or gift card to pay the loan application fee. The consumer, of course, never receives the funds and loses the money spent on the fee (along with any other fees the scammer tricks them into paying).

This version of the advance fee scam works so well because the scammer can identify consumers who need liquidity but are unable to get it and because the scammer can establish trust and credibility with the victim by referencing an actual credit application (“I see you were rejected for a personal loan from Wells Fargo”).

The only way that a scammer could get that information would be from a CRA or data broker that has access to credit inquiry data. If CRAs and data brokers are selling this type of information to scammers (or intermediaries selling it to scammers), it’s easy to understand why the CFPB would want to crack down.

What about de-identified data?

Another way that CRAs and data brokers get around FCRA restrictions is through the aggregation and de-identification of consumer data. As it made clear in its rulemaking on open banking, the CFPB is highly suspicious of de-identification as a safeguard for consumer privacy, and the bureau expresses that same suspicion in this proposed rule as well: 

The CFPB is aware that consumer reporting agencies offer and sell a variety of products that include information that has been drawn from consumer reporting databases and that has been aggregated or otherwise purportedly de-identified. Some of these products include information that has been aggregated at a household or neighborhood level (e.g., a ZIP Code or ZIP-plus-four Code segmentation); others may include information aggregated according to specific behavioral characteristics (e.g., consumers who shop at high-end retailers). Given the potential ease with which household and other data can be re-identified, the sale of these types of data raises concerns that sensitive consumer reporting information may be disclosed in circumstances where no FCRA permissible purpose exists, such as for marketing.

Prohibiting de-identified data that would otherwise be considered consumer report data under the FCRA from being used for non-permissible purpose use cases would be a significant blow to CRAs and large data brokers, which have built very sophisticated systems and products for modeling, customer segmentation, and targeting based on de-identified data.

It’s not clear that the CFPB would go that far, however.

In the proposed rule, the bureau outlines three different options for dealing with de-identified data:

  1. Banning the use of de-identification to enable consumer report data to be used for non-permissible purpose use cases.
  2. Banning the use of de-identification if the de-identified data is still linked or linkable to the consumer.
  3. Banning the use of de-identification if the de-identified data is still linked or “reasonably” linkable to the consumer, if the information is used to inform a business decision about a particular consumer, or if the party that receives the information uses it to identify the consumer.

The third option is the closest to the current status quo, and I imagine it’s the option that the data broker industry would strongly prefer. 

Which companies that don’t want to become CRAs might be forced to become CRAs?

There are a bunch! As the CFPB notes:

If [the] proposed [rule] is finalized, a substantial number of additional data brokers operating today likely will qualify as consumer reporting agencies selling consumer reports under the FCRA.

This would be due primarily to the proposed rule’s designation of credit history, credit score, debt payments, and income data as consumer report data, regardless of how it is used, as well as a broadening of the definition of “assembling or evaluating,” which is a key component of the definition of a consumer reporting agency under the FCRA.

One group of impacted companies, which the CFPB calls out by name, is open banking data aggregators.

Historically, data aggregators have taken very different stances on how the FCRA applies to their business. Some, like Finicity, were proactive in embracing the designation. Others, like Plaid, argued that it did not apply because they merely transmitted data rather than assembling or evaluating it. 

Recently, we’ve seen more data aggregators embrace the CRA designation (Plaid, for example) as cash flow underwriting has become increasingly popular and the CFPB has signaled its intentions to update the rules around the FCRA. This was wise, as the CFPB is now signaling that data aggregators would likely be considered CRAs under the proposed rule:

The CFPB understands that data aggregators often engage in such activities. The CFPB understands, for instance, that, when a data aggregator collects information from a consumer’s bank account, the data aggregator may apply its own taxonomy to group or categorize the collected information.      

What the bureau is saying here is that the basic work all data aggregators do to categorize transaction data shared by consumers qualifies as “assembling or evaluating” under the proposed rule, and that if the data is used to make credit decisions or includes debt payments or income (even if those aren’t used for making a credit decision), the data aggregator would be a CRA.

Bottom line — it would be very difficult, under this proposed rule, to operate as a data aggregator without also acting as a CRA.  

(Editor’s Note — Many other, less-obvious-but-very-interesting companies might also get pulled into the FCRA regulatory perimeter by this rule. One group I’m thinking about is the payroll system providers like ADP. These companies assemble data on consumers, including their income, and sell it to third parties such as Equifax. Under the proposed rule, I can’t see how this activity wouldn’t cause these companies to be considered CRAs.)

How disruptive would the consumer disclosure and consent requirements be? 

Short answer — quite disruptive!

When a company wants to obtain a consumer’s permission to access their consumer report (which they generally are required to do, except in limited prescreen use cases authorized in the FCRA), the proposed rule would require that consumers be presented with a disclosure that is “clear, conspicuous, and segregated from other material.”

In other words, no more hiding FCRA disclosure and consent requirements behind a checkbox and within lengthy Ts&Cs that no one reads.

Interestingly, the proposed rule would also prohibit companies from using a single consent to pull a consumer’s reports from multiple CRAs. Instead, consumers would need to authorize each CRA individually. This would be a big change from how the process works today and would create a lot of additional friction for consumers.

Another point of additional friction is the proposed rule’s requirement that companies obtain reauthorization from the consumer for continued use of the consumer report no later than one year after the initial authorization. The impact of this change on services like credit score monitoring (and on companies like Credit Karma that provide them) would be significant.

Finally, the proposed rule would require that consumers be given a method to revoke consent for their report to be furnished that is “as easy to access and operate as the method by which the consumer provided consent for their report to be furnished.” To my knowledge, credit bureaus and the companies they sell consumer reports to don’t offer this type of granular revocation mechanism today. 

(Editor’s Note — Perhaps this could be the excuse the industry needs to finally fix AnnualCreditReport.com?!?)

(Last Editor’s Note … I promise — If all these disclosure and consent requirements remind you of the CFPB’s final rule on open banking, that’s not an accident. The bureau is purposefully trying to move the industry towards a common framework for consumer-permissioned data sharing, regardless of whether that data is assembled and evaluated in advance and in aggregate or individually and in real time.)

How big a deal would the credit header data change be?

This is the one that folks are really up in arms over.

The CFPB’s position is that credit header data (name, date of birth, address, phone number, email, and Social Security number) is consumer report information and cannot be segmented off and sold for non-permissible purpose use cases.

This is disturbing to the industry because credit header data, as a standalone data product, is extraordinarily useful for a variety of beneficial use cases, especially identity verification, which companies are required to perform both inside financial services (CIP and AML) and outside it (e.g., when selling age-restricted products such as alcohol).

Given their broad coverage and the lack of any other type of national identity infrastructure, the credit bureaus are the de facto source for identity verification in the U.S., and credit header data is how they fulfill that role.

I understand the industry’s concerns, but, in this case, I agree with the CFPB that they are (mostly) overstated.

The proposed rule doesn’t restrict the ability of companies to use credit header data for identity verification as long as that identity verification falls under an FCRA permissible purpose.

Obviously, this is the case if the IDV is happening within the context of a credit, insurance, or employment decision. 

Additionally, the FCRA extends permissible purpose to transactions in which the consumers’ written instructions authorize the data to be shared (this covers the age verification to buy alcohol use case and others like it) or where there is a “legitimate business need,” which includes consumer-initiated transactions like opening a checking account (this covers the bank CIP/AML use case).

Subjecting credit header data to FCRA requirements would be expensive for the credit bureaus and would potentially introduce some additional friction for consumers (particularly where written instructions are required), but I don’t think it would be a wholesale disaster for the industry.

Is there any chance that this rule will get finalized as is?

This is the million-dollar question.

The current CFPB is slipping this proposed rule in just under the wire. However, per the Administrative Procedure Act, the rule cannot be finalized until after the comment period closes on March 3rd, 2025. This means that the rule’s finalization will be the responsibility of the Trump administration’s CFPB. That version of the CFPB, if it isn’t deleted entirely, will have very different priorities than the Chopra-era CFPB has had.

I think this political consideration is partially why Director Chopra chose to frame this proposed rule using national security language in his remarks:

Last month, hackers linked to the Chinese government targeted our telecommunications infrastructure – just the latest in a series of attacks on Americans’ personal data. But often, our adversaries don’t need to hack anything. Data brokers – the outfits that collect and sell detailed information about our personal and financial lives – are making this data available to anyone willing to pay. Today, the Consumer Financial Protection Bureau is proposing action to stop data brokers from enabling scammers, stalkers, and spies, undermining our personal safety and America’s national security.

The dangers of unfettered data brokering have become painfully clear in recent months. Last week, an investigation revealed how easily data brokers can track U.S. military personnel stationed in Germany, including their movements around sensitive facilities like nuclear storage sites and intelligence centers. In September, researchers demonstrated how they could purchase location data to track federal law enforcement as they conducted confidential investigations. This summer, we learned that hackers had accessed nearly 3 billion records of Americans’ sensitive data, including Social Security numbers, from a single data broker. These aren’t isolated incidents – they represent a systemic vulnerability in how our personal data is bought and sold.    

It’s a smart strategy, and privacy and data security are issues with genuine bipartisan support. However, I doubt it will be enough to convince the Trump administration and the next Director of the CFPB to finalize the proposed rule as it is currently written.

Alex Johnson