How Data Breaches in Private Services Are Eroding Trust in Science
Have you ever paused to think about the ethical trade-offs we make when using private services that require our data? From genetic ancestry kits to grammar correction tools, our personal information is often the price we pay for convenience. But the ripple effects of careless data practices go far beyond individual privacy—they’re stalling scientific research in ways that could shape the future of humanity.
This isn’t just a rant about data breaches. It’s about how we, as a society, are caught in an ethical paradox, one that’s harming science and slowing progress on issues that truly matter.
Two Ethical Camps: The Starting Point
In the Social Context of Business course I’ve been teaching at McGill, we explore two foundational ethical frameworks:
Utilitarianism – A consequentialist approach focused on maximizing societal benefits, asking, “What will result in the greatest good for the greatest number?”
Deontological Ethics – A principle-driven approach that emphasizes duty, rules, and individual rights, asking, “What is the ethical duty or principle in this situation?”
At first glance, these frameworks seem straightforward. But the real challenge lies not in choosing one over the other—it’s in maintaining consistency. As we move between individual, organizational, and societal levels, our ethical perspective often shifts, leading to contradictions and clashes that are difficult to reconcile.
A striking example of this inconsistency emerges in the realm of data security. Research institutions are bound by strict ethical protocols, such as obtaining informed consent and safeguarding participants’ data, to ensure individual rights are fully protected. These measures often prioritize individual well-being over expediency, reflecting a deontological approach.
In contrast, private companies frequently take a utilitarian approach—but one skewed towards their own interests. They prioritize speed, innovation, and profit over comprehensive data protection, often leaving users' sensitive information exposed. When breaches occur, the consequences ripple far beyond individual harm, eroding public trust in data-sharing systems as a whole.
This erosion of trust doesn’t just affect private services; it creates a chilling effect on scientific research. With people increasingly wary of sharing their data, even ethically rigorous studies—designed for societal benefit—struggle to recruit participants, ultimately stalling progress on crucial scientific advancements.
The Rigor of Research Ethics: Why Science Plays by the Rules
Academic and medical researchers operate under strict ethical oversight.
Ethics boards ensure no harm comes to study participants, especially vulnerable groups like pregnant women.
Researchers must detail exactly how they’ll use data and ensure privacy before receiving approval—a process that can take months.
This level of scrutiny reflects the belief that science should never harm individuals, even in the pursuit of societal benefits. But this rigorous approach makes it difficult to collect data, particularly in areas where public participation is already low.
Enter Private Companies: A Data Goldmine with Few Rules
Contrast this with private companies offering services like genetic ancestry reports or grammar correction. Their incentives are clear:
Profit is the priority.
Users willingly hand over sensitive data, often without realizing what they are agreeing to, for services that may not even deliver accurate or meaningful results.
What happens when these companies experience a data breach?
Best-case scenario: They notify users and pay a fine.
Worst-case scenario: Sensitive data, including genetic information, is leaked to the dark web, and no one is held accountable.
A Personal Investigation: What I Learned from Two Companies
Over a relaxed lunch with a brilliant scholar, Monica Sârghie (Popa), the conversation took an unexpected turn. She raised a simple but provocative question: "How secure is the data you’re sharing with the tools you use every day?"
Inspired, I decided to dig deeper. I reached out to two companies I regularly rely on:
Antidote (a grammar and writing tool): My inquiries about their data security policies were met with… silence. Not a single reply.
Grammarly: After a few rounds of emails, they finally admitted that their data policies align only with the minimum legal requirements in each country. In other words, if your local laws allow it, your personal data could be used to train their AI models—without your explicit consent.
As a non-native English speaker and researcher, I often depend on these tools to refine my work. But the dilemma is real: Do I trust them with sensitive data, or do I spend countless hours manually polishing my writing?
This convenience comes at a cost, and the question lingers: How much privacy are we willing to trade for efficiency?
The Bigger Picture: When Private Failures Hurt Public Trust
Here’s where the ethical paradox deepens.
Private companies’ negligence fuels public anxiety. Every new data breach adds to the mistrust people feel about sharing their information.
Science pays the price. Researchers—bound by ethical rules—struggle to recruit participants, especially when the societal benefits of their work are not immediately visible to individuals.
Take medical research on pregnant women. We lack critical data in this area because recruiting participants is already tough. Add in public fears about data security, and the challenge becomes nearly insurmountable.
Who Is Responsible for Fixing This?
The harm caused by these inconsistencies raises tough questions:
Should governments enforce stricter regulations on private companies?
Are private companies ethically obligated to go beyond the bare minimum?
Can universities and researchers rebuild public trust in data sharing?
While governments and academia move slowly, private companies are pushing out products at lightning speed, often without fully considering the societal consequences. But should speed come at the expense of trust and progress?
Every careless data breach sends a ripple through society, creating barriers for scientific research and, ultimately, societal progress.
So, here’s my question for you: Who bears the greatest responsibility for these harms—private companies, governments, or society at large? And how can we prevent this from continuing?
Let’s start the conversation. Share your thoughts in the comments! 👇