Rebutting The FTC: Data Clean Rooms Are ‘Privacy-Preserving’
Unless you work in adtech or are a privacy lawyer, you may have missed the FTC’s latest missive, “Data Clean Rooms: Separating Fact from Fiction”. Rather than separating the two, the FTC actually combines fact with fiction, seemingly in an effort to restate negative positions against the advertising industry.
Here are a few of my hot takes and rebuttals:
1. The FTC incorrectly defines data clean rooms, applying a singular, general reference to a technology comprising a diverse set of software, hardware, and business processes.
The FTC provides only one example in its effort to define data clean rooms - a grocery store advertiser measuring the effectiveness of its newspaper ads - without any reference to the methodology used in this fictitious example. How can the FTC encapsulate the complexity and scope of the technology without more fully describing its definition and use cases?
The FTC’s only other reference defining the scope of data clean room services is a link to an article by The Markup (with a sub-headline about ‘dirt on data brokers’) containing equally vague references to the clean room technologies used.
The FTC also does not ‘separate the fact’ that data clean rooms existed for many years before the advertising use cases now in focus - namely in clinical medical research, credit and financial services, and corporate M&A activity - through dramatically different technologies and processes.
The FTC’s definition of ‘clean room’ should have distinguished the many historical use cases (including business intermediaries) in medical and other applications from its apparent focus on the more recent advertising-focused software applications - namely, those provided by companies such as InfoSum, Habu/LiveRamp, Snowflake, and AppsFlyer. A more helpful survey could have included similar ‘privacy preserving’ applications provided by large media companies such as Google, Meta, Snap, and ByteDance/TikTok, and notable partnerships such as that between Meta and Firefox.
Sadly, the FTC does not indicate the scope of its business analysis, the companies or individuals that participated in its market research, or the specific use cases to which it applies its generalizations. At the very least, it could have cited the Future of Privacy Forum’s recent ‘Data Clean Rooms: A Taxonomy & Technical Primer’.
2. The FTC makes the following misstatements: “By default, most services that provide DCRs are not privacy preserving” and “Additionally, DCR services often default to allow both parties full access to all of the data.”
Setting aside the lack of any citation for what constitutes a ‘default DCR service’ or how the FTC quantifies ‘most services’, the most common reason a clean room is ‘clean’ is that neither contributing party can, by default, reference or extract the other party’s personal or unique user data.
There should be no debate that a party’s use of software or services to prevent another party from receiving its personal or unique data is ‘privacy preserving’ - and security preserving (see #4 below).
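To make that ‘by default’ point concrete, here is a minimal sketch of a clean-room-style match that returns only an aggregate. It is my own illustration, not any vendor’s actual API; the function name and the cohort threshold are invented for the example.

```python
def overlap_count(party_a_ids, party_b_ids, min_cohort=50):
    """Return only the size of the overlap between two parties' audiences.

    Neither party's record-level data crosses the boundary, and small
    cohorts are suppressed so the count cannot single out individuals.
    The min_cohort value of 50 is an illustrative assumption.
    """
    matched = set(party_a_ids) & set(party_b_ids)
    if len(matched) < min_cohort:
        return None  # suppressed: too few matches to report safely
    return len(matched)  # an aggregate count, never the matched identifiers
```

Each party contributes identifiers, but the only output either side sees is a count (or nothing at all) - which is precisely what makes the default ‘clean’.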
The FTC should be championing the use of clean room software and services for the simple reason that, for much of the history of advertising and direct marketing, companies have engaged in the ‘dirty’ practice of sharing directly identifying or commonly re-identifiable personal information, such as hashed emails, while relying exclusively on legal protections such as audit rights to guard against misuse. (To give the FTC its rightful credit, I wholeheartedly agree with its chastening that simple hashing is not anonymization.)
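The hashing point is easy to demonstrate: a hash is deterministic, so anyone holding a list of candidate emails can recover the ‘anonymized’ value by simply hashing the candidates and comparing. A short illustration (all addresses invented):

```python
import hashlib

def hash_email(email: str) -> str:
    # Normalize then hash, as is commonly done when sharing 'hashed emails'
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

shared = hash_email("alice@example.com")  # the value a party shares

# A dictionary attack needs only a candidate list -- there is no key:
candidates = ["bob@example.com", "alice@example.com", "carol@example.com"]
recovered = next((c for c in candidates if hash_email(c) == shared), None)
print(recovered)  # alice@example.com -- a pseudonym, not anonymization
```

Because every party computes the identical digest for the identical input, the hash functions as a shared pseudonym rather than anonymized data.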
3. The FTC makes only one (negative) reference to the use of Privacy Enhancing Technologies (PETs), which data clean rooms use ‘by default’.
The FTC links to its previously cheeky ‘PETs Cemetery’ article about the pitfalls of these technologies, but neglects the significant advances data clean room software and service providers are making with privacy-advancing controls for data providers, which vary by use case.
As I wrote last November, there are significantly more ways data clean rooms can enable PETs for measurement in which the matched data is technically impracticable (if not ‘impossible’) to use for ad retargeting or to reverse engineer for individual profiling.
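As one example of what such a PET can look like, here is a minimal sketch of the Laplace mechanism from differential privacy applied to a matched conversion count. This is my own illustration of the category, not a description of any particular clean room’s methodology:

```python
import random

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Laplace mechanism for a counting query (sensitivity 1).

    The reported figure remains useful for campaign measurement, but the
    calibrated noise makes it statistically useless for confirming any
    single individual's presence -- hence no retargeting or profiling.
    """
    scale = 1.0 / epsilon  # noise grows as the privacy budget shrinks
    # A Laplace sample is the difference of two exponential samples
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

report = noisy_count(1234)  # roughly 1234, give or take a few units
```

The advertiser learns approximately how many matched users converted; no one can work backwards from the output to an individual profile.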
Most notably, the IAB Tech Lab recently introduced its ‘Attribution Data Matching Protocol’ which self-regulates various uses of PETs to protect privacy with advertising measurement. The FTC should not only have referenced this IAB Tech Lab effort (and tailored their post to the IAB’s open comment period), but also commended the IAB Tech Lab for advancing global privacy interests through Privacy Enhancing Technologies.
4. The FTC fundamentally misunderstands how the ‘clean’ part relates to the security of data clean room software and services.
There are two common types of data clean room implementations:
(1) the data clean room provider creates an isolated cloud instance on behalf of the data provider; and
(2) the data provider enables the data clean room access to their existing cloud instance.
In either case, the data clean room software provider does not store or centralize the data provider’s data in its own multi-tenant hosted environment alongside other data providers’ data, as is common practice with the vast majority of Software as a Service (SaaS) applications.
Rather than the standard ‘logical separation of data’ as with SaaS platforms, data clean rooms use ‘physical separation’ and the data clean room software matches the data through a proprietary methodology, with a ‘majority’* of use cases using privacy enhancing technologies or even enabling the data provider to directly control the level of anonymization.
As a result, data clean rooms are actually more secure than the vast majority of common SaaS tools, such as those provided by big enterprises like Salesforce. Nor are they distributed-workforce applications like Slack, where employee two-factor authentication is critical to prevent ‘backdoor’ access; with clean rooms, cloud encryption key management should be maintained by the data provider rather than the clean room provider.
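One way to picture provider-held keys is keyed pseudonymization: when the data provider holds the key, even the clean room operator sees only tokens it cannot reverse or dictionary-attack, unlike the plain hashed emails criticized earlier. A minimal sketch (my own illustration; real deployments keep the key in the provider’s KMS/HSM rather than in process memory):

```python
import hashlib
import hmac
import secrets

# Assumption of this sketch: the key never leaves the data provider's
# control; the clean room operator only ever receives the tokens.
provider_key = secrets.token_bytes(32)

def pseudonymize(user_id: str, key: bytes) -> str:
    """Keyed pseudonym: without the key, hashing candidate IDs is useless."""
    return hmac.new(key, user_id.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("user-123", provider_key)
```

Contrast this with the plain-hash case: an attacker holding a list of candidate IDs but no key cannot reproduce the token, so the ‘physical separation’ is backed by cryptographic separation as well.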
(*Based on my clearly biased position of being a privacy lawyer/consultant who has seen dozens of use cases and implementations.)
Conclusion - data clean rooms are on the right track
I have been advising advertisers, media platforms, agencies, data brokers, and clean room software providers for years on best practices and legal/compliance requirements for implementing data clean rooms, and I can unquestionably state that every participant in this ecosystem is actively engaged in improving user privacy.
The FTC’s report is an unfortunate misinterpretation of technology I’ve personally been waiting for over more than a quarter century as a privacy professional, and I am excited by how much progress has been made with data clean room innovations and adoption in such a short period of time.
I’ll leave you with an apt passage by privacy advocate and Electronic Frontier Foundation Founder John Perry Barlow (of blessed memory):
“Come wash the nighttime clean.
Come grow the scorched ground green.
Blow the horn, and tap the tambourine.
Close the gap of the dark years in between
You and me.”
…FTC