Remarks of CFPB Director Rohit Chopra at White House Roundtable on Protecting Americans from Harmful Data Broker Practices
Thank you to White House National Economic Council Director Lael Brainard and White House Office of Science and Technology Policy Director Arati Prabhakar for convening this roundtable on protecting Americans from harmful data broker practices.
The United States has a long history of recognizing the sanctity of our homes and our lives and protecting them against unwanted intrusions. The Fourth Amendment protects against unreasonable searches and seizures. State Peeping Tom laws prohibit looking into private places, whether in person or through the use of devices. And there are many more examples of how our country has sought to create boundaries.
Today, “artificial intelligence” and other predictive decision-making increasingly rely on ingesting massive amounts of data about our daily lives. This creates financial incentives for even more data surveillance. It also has big implications for critical decisions, like whether we will be interviewed for a job or get approved for a bank account or loan. It’s critical that there is some accountability when it comes to misuse or abuse of our private information and activities.
The Consumer Financial Protection Bureau is pleased to be part of an all-of-government effort to tackle the risks associated with AI. After conducting an inquiry into the practices of data brokers in the surveillance industry, we have decided to launch a rulemaking to ensure that modern-day digital data brokers are not misusing or abusing our sensitive data. During our formal inquiry, the CFPB learned more about the significant harms – from the identification of victims for financial scams to the facilitation of harassment and fraud.
While these firms go by many labels, many of them work to harvest data from multiple sources and then monetize individual data points or profiles about us, sometimes without our knowledge. These data points and profiles might be monetized by sharing them with other companies using AI to make predictions and decisions.
In many ways, these issues mirror debates from over fifty years ago. In 1969, Congress investigated the then-emerging data surveillance industry. The public discovered the alarming growth of an industry that maintained profiles on millions of Americans, mining information on people’s financial status and bill paying records, along with information about people’s habits and lifestyles, with virtually no oversight.
Of course, today’s surveillance firms have modern technology to build even more complex profiles about our searches, our clicks, our payments, and our locations. These detailed dossiers can be exploited by scammers, marketers, and anyone else willing to pay.
While there are many efforts to expand personal data protections at the federal and state level, particularly when it comes to AI, we also have to make sure we’re using the laws already on the books.
In 1970, Congress enacted the Fair Credit Reporting Act. The law covers a broad range of background reports assembled on consumers, even beyond those used for extending loans. The law granted people new rights and protections, including: (1) safeguards to ensure accurate information, (2) the right to dispute errors, (3) the right to access your own information, and (4) restrictions on how others can use your information.
To ensure that modern-day data companies assembling profiles about us are meeting the requirements under the Fair Credit Reporting Act, the CFPB will be developing rules to prevent misuse and abuse by these data brokers.
Two of the proposals under consideration are worth highlighting here:
First, our rules under consideration will define a data broker that sells certain types of consumer data as a “consumer reporting agency” to better reflect today’s market realities. The CFPB is considering a proposal that would generally treat a data broker’s sale of data regarding, for example, a consumer’s payment history, income, and criminal records as a consumer report, because that type of data is typically used for credit, employment, and certain other determinations. This would trigger requirements for ensuring accuracy and handling disputes of inaccurate information, and would prohibit misuse.
A second proposal under consideration will address confusion around whether so-called “credit header data” is a consumer report. Much of the current data broker market runs on personally identifying information taken from traditional credit reports, such as those sold by the big three credit reporting conglomerates – Equifax, Experian, and TransUnion.
This includes key identifiers like name, date of birth, and Social Security number that are contained in consumer reports generated by the credit reporting companies. The CFPB expects to propose clarifying the extent to which credit header data constitutes a consumer report, reducing the ability of credit reporting companies to impermissibly disclose sensitive contact information that can be used to identify people who don’t wish to be contacted, such as domestic violence survivors.
Any updated rules under the Fair Credit Reporting Act can be enforced by the CFPB and state law enforcement across sectors of the economy. The Federal Trade Commission, the Department of Transportation, the Department of Agriculture, and other agencies can enforce these rules for specific sectors under their jurisdiction.
The CFPB’s data broker rulemaking will complement other work occurring across levels of government, especially by the FTC, which is leading so many efforts on privacy and data security.
Next month, the CFPB will publish an outline of proposals and alternatives under consideration for a proposed rule. We’ll soon hear from small businesses, which will help us craft the rule. We are encouraging small businesses looking to participate in the process to contact us. We plan to propose the rule for public comment in 2024.
We look forward to obtaining public input on the proposals under consideration. More importantly, as AI increases the processing of sensitive personal data, we hope this rulemaking will bring much-needed accountability to the dark corners of the data broker market.
Thank you.