
What apps do with your health data when you're not looking


By Jonathan Joseph, Board member

May 3, 2024 | 7 min read

You shouldn’t be so quick to trust your pregnancy app or others with your health data. The Ethical Tech Project’s Jonathan Joseph explains.


The story of how our health data became weaponized starts in 2022, when the Supreme Court overturned Roe v Wade in the landmark Dobbs decision.

What followed didn’t just impact reproductive rights. The decision intensified public scrutiny on how personal data – especially sensitive information – is managed and manipulated. Specifically, the case highlighted how location and search data could be exploited to track visits to reproductive health services, such as Planned Parenthood. People began to worry about the extent of data collection and how that data might be used.

They were right to worry. For example, despite Google’s commitments to enhance the privacy of health-related data by removing location entries deemed “personal”, a report published just months after the Dobbs decision revealed that the tech giant still retained detailed records of searches and directions to abortion clinics for weeks.

There’s one key takeaway here: even with increased privacy assurances, some businesses or interests will continue to collect and retain your data – which can be used to infer sensitive details about your health – for nefarious purposes, or just to feed the advertising beast.

The Dobbs case and the ensuing concerns about health privacy really put into the spotlight how location and health data can be used against us. It underlines the double-edged nature of our digital footprint, where convenience can very quickly, and unexpectedly, morph into vulnerability.

Let’s look at some examples.

Period tracking and fertility apps

In 2023, the fertility app Premom misled users by secretly sharing their sensitive personal information with advertisers. Similarly, Flo, a popular period-tracking app, was caught selling data about women's pregnancies.

These apps, ostensibly designed to support women during a transformative life phase, doubled as troves of marketing data. Moms are a lucrative demographic for marketers because they often make the purchasing decisions for the home. But expectant moms are especially vulnerable: marketers know that during pregnancy, women often switch brands, even ones they’ve been loyal to for years.

Premom and Flo were flagged by regulators, but don’t be lulled into a false sense of security just because your chosen cycle or pregnancy tracking app hasn’t faced similar scrutiny.

Regulators usually focus on priority areas where they see rampant infringement, or act to make a statement in an industry rife with questionable practices. So the absence of an enforcement action is no guarantee of good behavior.

With pregnancy apps, the playful element can be disarming. These engaging apps might feature fun comparisons of a baby’s size to various fruits or manage your pregnancy shopping list. But behind this charming front, something sinister could be happening. These apps harvest extensive data – your due date, ultrasound images, location, the baby’s sex, and potential names – and share it with platforms and third-party advertisers. The sheer volume of data being collected is mind-blowing.

Don’t let the charming exterior of these quirky, cutesy apps deceive you. Apps marketed as supportive health tools could be covers for aggressive data exploitation.

A wake-up call in mental health

The breach of trust in digital mental health services is equally alarming.

Companies like BetterHelp and Cerebral have incurred hefty fines from the FTC for their irresponsible disclosure and use of health data. Despite their assurances of confidentiality, these companies have been caught sharing sensitive user information with advertisers, deeply betraying the trust placed in them.

Imagine this: you’ve scheduled therapy sessions online, naturally believing they’re confidential, but behind the scenes your pursuit of mental health support – along with private information about your struggles – is being tracked and exploited by advertisers.

The FTC revealed last year that BetterHelp shared the information of over 7 million consumers with platforms like Facebook and Snapchat for advertising purposes, among other serious data privacy transgressions. This disturbing practice represents a predatory approach to vulnerable individuals seeking help.

Just this month, the FTC required Cerebral, another mental health telehealth firm, to pay a $7m fine for carelessly handling patient data and actively sharing it with third parties for advertising without clear and proper disclosure.

You might hope these companies would learn from their competitors’ mistakes, and see a chance to gain a competitive edge by prioritizing user data privacy. BetterHelp was raked over the coals by regulators and suffered extensive reputational damage. Did those consequences deter Cerebral from following in BetterHelp’s footsteps?


Unfortunately, Cerebral made the same mistakes. While promising “safe, secure, and discreet” services to users, Cerebral instead buried details in its privacy policies about sharing sensitive data with third parties for advertising.

As patients or individuals seeking medical advice, we naturally expect a veil of privacy. We trust that our interactions with medical professionals – whether in-person or online – are safeguarded, that our intimate health data is protected as if behind a fortress.

These brands have exploited that trust.

Toward transparency and trust

In the state of Washington, the My Health, My Data Act strengthens protections for health data and offers a private right of action for citizens. At the federal level, the proposed American Privacy Rights Act, which generally provides “opt-out” controls for targeted advertising, adopts stricter “opt-in” consent rules for sensitive data such as biometrics.

The FTC and state AGs are working with the tools they have available to help close the gaps in protection for our health data. The FTC, in particular, regulates mostly through the deceptive conduct provisions of the FTC Act. A federal privacy law with specific protections for health-related data would help tremendously.

As people become more aware of how their health and related data are used, and as regulators gain more power to hold businesses accountable, there is hope for an ethical internet that respects our data dignity.

What we do – or don’t do – right now will define how much we can trust digital health services moving forward. Let’s make sure they earn that trust.

This piece ran as part of The Drum's Health and Pharma Focus.
