Broken Promises in a Test Tube: 23andMe and the New Data Reckoning
In the age of digital transformation, few resources have become as valuable and as vulnerable as data. From predictive analytics in AI to direct-to-consumer genomics, we are entrusting companies with the most intimate facets of our identities. Yet, the very infrastructures that promise innovation and empowerment often leave us exposed. The recent Chapter 11 bankruptcy filing of 23andMe (a direct-to-consumer genetic testing company that provides personalized reports on ancestry, health, and traits based on a saliva sample) is a sobering reminder that the battleground for the future is not just technological—it is ethical and legal.
For those new to the case, I recommend reading the official company announcement here: https://investors.23andme.com/news-releases/news-release-details/23andme-initiates-voluntary-chapter-11-process-maximize
23andMe was, at one point, the darling of personalized health technology. It promised the everyday individual access to their genetic blueprint, the ability to explore ancestral origins, and insight into potential health predispositions. The model was simple yet revolutionary: spit in a tube, mail it back, and get a detailed report about what makes you, you. Millions subscribed to the promise. And in doing so, they entrusted the company with their most sacred asset: their DNA.
If the gravity of what this company holds is still not clear, imagine your entire family history, your DNA, and that of millions of others concentrated in one place.
But the collapse of the company in early 2025 flipped the script. While the press release accompanying its Chapter 11 filing reassured the public that there would be no immediate changes to how data is managed or protected, the mere possibility of that data—genetic sequences, familial connections, health predispositions—being included in a future asset sale set off a storm. Legal analysts quickly weighed in, pointing to Section 363(b)(1) of the U.S. Bankruptcy Code, which provides that when a debtor's privacy policy prohibits the transfer of personally identifiable information (PII), that data may not be sold unless the sale is consistent with the policy, or a Consumer Privacy Ombudsman (appointed under Section 332) reviews the proposed sale and the court approves it after considering the ombudsman's report.
This is a well-meaning provision. But it is not a safeguard in practice. In fact, 23andMe’s own privacy policy—available publicly—states that in the event of a business transition such as a bankruptcy, acquisition, or sale of assets, user data may be transferred, but only under the condition that the acquiring entity agrees to uphold the existing privacy commitments.
While this may appear reassuring, it subtly shifts the burden of continuity onto a future party's promise rather than onto any binding mechanism enforceable by users themselves. Most users likely missed this nuanced caveat. They were comforted by language like "your privacy comes first" and "you're in control," yet buried within those assurances was a clause that effectively allows the transfer of their most personal data in the context of corporate distress, so long as the acquiring entity claims compliance. It is not just a legal loophole. It is a glaring ethical blind spot that undermines the notion of informed consent. It assumes compliance, good faith, and, most worryingly, that PII is static. But genetic data is not static. It is irrevocable.
You can change your email address or your phone number. You can even change your name. But you cannot change your DNA. It is not just data—it is destiny. And when such data is treated as a tradable asset, bundled into bankruptcy auctions or acquisition negotiations, the implications are existential.
That is why the intervention of Federal Trade Commission Chairman Andrew Ferguson was so critical. In a formal letter to the U.S. Trustee, Ferguson reminded the court that 23andMe had made explicit representations to its users regarding how their data would be protected. These were not vague marketing slogans. These were policy statements, contractual in nature, where the company assured users that their information would not be shared without consent, even in cases of bankruptcy, acquisition, or asset transfer. Ferguson argued that any purchaser must not only comply with applicable laws but be expressly bound by the original privacy agreements made between 23andMe and its users. This, he emphasized, was not just good practice. It was a matter of public trust.
Those interested in reading the actual FTC letter can find it here: https://www.ftc.gov/legal-library/browse/cases-proceedings/staff-letters/chairman-ferguson-letter-regarding-23andme
It would be convenient to treat this as an isolated case, a cautionary tale of a once-promising company undone by financial mismanagement. But to do so would be dangerously shortsighted. The truth is, this case sets a legal and ethical precedent that could ripple across industries. What happens to user data when a healthtech company goes under? When a fintech startup collapses? When an AI lab is acquired by a multinational? Are we prepared for the possibility that sensitive personal data could be seen not as a responsibility, but as a liquidatable asset?
There are some who argue that regulation is sufficient—that the Bankruptcy Code, federal privacy laws, and market pressures will prevent abuse. But I disagree. Regulation, while essential, is not a substitute for design.
We need a new framework that treats certain classes of data—especially biometric and genomic data—as sacrosanct.
This means elevating data privacy to the level of a human right, protected not only by compliance checklists but by foundational principles enshrined in law and upheld in practice. Consent must not be a one-time checkbox. It must be dynamic, enduring, and portable across business cycles, including bankruptcies and mergers. When individuals give companies access to their genetic data, they do so with the understanding that their trust is non-transferable. We must legally reflect that understanding.
I also believe it is time for the industry—especially founders and investors in tech-driven healthcare and AI companies—to rethink what it means to be a data custodian. Ethical foresight must be embedded into business models from day one. Privacy cannot be a bolt-on feature or a post-crisis PR strategy. It must be part of the product-market fit.
Equally, we need a cultural shift. Data literacy should be viewed as a civic imperative. Just as we teach financial literacy and digital hygiene, individuals should understand the implications of sharing their biometric data. They should know how to request deletions, understand what anonymization actually means, and have access to tools that help them monitor their digital footprint.
The 23andMe bankruptcy is a pivotal moment in our collective journey through the information age. It challenges us to look beyond the hype of innovation and ask deeper questions about who we become when the systems we build begin to fail. We can either treat this as an aberration—or we can confront it for what it truly is: a signpost pointing to the need for urgent reform!
The future will not be written by those who collect the most data, but by those who safeguard it with integrity.
Let this be our wake-up call. We must go further than expressing concern—we must act. This moment presents a unique opportunity to redefine what we mean by 'data' and who it truly belongs to. I urge policymakers to establish a new class of data protection for immutable, predictive, and shared data types like genetic information. Biotech and biomedical companies should be legally required to submit data stewardship and ethical exit plans alongside their business filings, particularly when dealing with biological data. This data category must carry with it irrevocable protections, even in scenarios of corporate distress.
Equally, this is a call to legal scholars, civil society, and technologists: let us convene to question whether our current frameworks treat human data with the dignity it deserves. If we accept that our DNA is more than a consumer record—that it is a core representation of who we are and who we could become—then we must build the ethical, legal, and technological systems to reflect that. It’s time we moved from privacy policies to data principles. From transactions to trust. From passive consent to active digital rights.
My 2 cents... Let me hear your thoughts!