You're not doing well. Maybe you haven't been for a while. You've thought about seeing a therapist, but the barriers feel enormous: finding one, waiting months for an appointment, filing insurance claims that become permanent records, sitting in a waiting room where someone might recognize you. The stigma. The paper trail.
Then you find BetterHelp. It's on your phone. It's affordable. It's private. No insurance. No waiting room. No paper trail. Just you, your phone, and a licensed therapist who can't tell anyone you're there.
You tell them things you've never said out loud. About the anxiety that wakes you at 3 AM. About the depression you've been managing alone for years. About the trauma. The diagnosis. The medication you're ashamed of needing. This is what therapy is supposed to look like: finally accessible, finally safe.
Except BetterHelp was selling everything you told them. Not to researchers. Not in anonymized aggregate. To Facebook. To Snapchat. To Criteo. So that advertisers could target more people like you.
Two million people trusted them with their darkest moments. Those moments became inventory.
The Promise That Made It Irresistible
BetterHelp launched in 2013 with a pitch that was genuinely revolutionary: therapy for people who couldn't access traditional therapy. The therapist shortage in America is real. Wait times of three to six months are common in major cities. In rural areas, a therapist might not exist within a hundred miles. Insurance coverage for mental health is notoriously patchy. Sliding-scale practices have long waitlists. For millions of people, traditional therapy simply wasn't an option.
BetterHelp offered a door that was always open. A network of over 30,000 licensed therapists. Sessions via text, video, phone, or live chat. Plans running $240 to $360 per month (real money, but still less than cash-pay therapy in most cities). And critically: no insurance required. No claims filed with your employer's health plan. No psychiatric codes permanently embedded in your medical record. No paper trail linking your name to a mental health diagnosis.
That last part mattered enormously. The reason millions of people avoid therapy isn't just cost; it's fear. Fear that your employer will find out. Fear that your insurer will raise your rates or deny future coverage. Fear that seeking help will be used against you somehow. BetterHelp's model directly addressed that fear. We're private. We're discreet. We're just you and your therapist.
By 2023, they had approximately two million active users. The most downloaded mental health app on earth. A billion-dollar company, acquired by Teladoc Health, generating hundreds of millions annually. They had built something people desperately wanted: accessible mental healthcare that felt safe.
That safety was an illusion they were actively monetizing.
What They Actually Did
Here is what BetterHelp was actually doing, in ways almost no one reading its promises would have guessed: using your data for advertising.
Not generic demographic data. Not "users who downloaded a health app." Individual-level data, tied to your actual identity. Your email address. Your IP address. The fact that you had specifically sought mental health treatment. The fact that you'd previously been in therapy. Whether you had a specific diagnosis.
This information was shared with Facebook, Snapchat, Pinterest, and Criteo, major advertising platforms whose entire business model is building psychographic profiles of individuals and selling advertisers access to them. BetterHelp used these platforms for a practice called "lookalike audience targeting": you upload a list of your existing customers, the platform finds millions of people who share their characteristics, and you advertise to all of them.
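For readers who want the mechanics made concrete, here is a minimal sketch of how a customer list typically becomes an ad-platform audience. The hashing step reflects the general shape of "custom audience" uploads, in which the platform matches hashed email addresses against its own accounts and then expands the matched seed into a lookalike audience. The email addresses, variable names, and the toy "platform side" below are hypothetical illustrations, not BetterHelp's or any platform's actual code.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Trim and lowercase the address, then SHA-256 it: the usual
    matching key expected by custom-audience uploads."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical customer list an advertiser exports for targeting.
customer_emails = ["person@example.com", "someone.else@example.org"]
uploaded_hashes = {normalize_and_hash(e) for e in customer_emails}

# On the platform's side, each account holder's email is hashed the same
# way and compared against the upload. Matches form the "seed" audience;
# lookalike expansion then finds millions of similar profiles to target.
platform_accounts = {
    "person@example.com": "account_123",
    "unrelated@example.net": "account_456",
}
seed_audience = [
    account_id
    for email, account_id in platform_accounts.items()
    if normalize_and_hash(email) in uploaded_hashes
]
print(seed_audience)  # ['account_123']
```

The point of the sketch is how little it takes: an email address alone is enough to tie "sought mental health treatment" to a platform account, and to every behavioral signal already attached to that account.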
The FTC's complaint detailed the specifics with clinical precision. BetterHelp shared the email addresses of users who had identified as having previously been in therapy, directly enabling Facebook to match those individuals to their Facebook accounts and their broader behavioral profiles. They sent Snapchat 5.5 million data points. Five and a half million individual records, each representing a person who had trusted a mental health platform with some of the most sensitive information a person can share.
Let that number sit for a moment: 5.5 million data points to Snapchat alone.
None of these users consented to this. Many of them had specifically chosen BetterHelp because they wanted to avoid the kind of institutional data trail that comes with traditional healthcare.
The Lie Written Into the Policy
Here is the most damning part of this story. BetterHelp's own privacy policy, during the period they were sharing this data, contained explicit language that directly contradicted what they were doing.
Their policy stated: "We will never use your information for advertising purposes."
Not "we limit how we use your data." Not "we may share aggregated insights." A flat, unambiguous promise: never. The word does real work in that sentence. It is the kind of language companies include specifically to reassure users who have privacy concerns, users who are, perhaps, choosing an app precisely because they want to avoid having their information used against them.
BetterHelp made that promise. Then they sent 5.5 million records to Snapchat.
This wasn't a gray area. It wasn't a legal technicality or a terms-of-service clause buried in paragraph forty-seven. It was a specific written promise they made to their users, and they violated it at industrial scale while the promise was still live on their website.
The FTC Stepped In (and Then What?)
In March 2023, the Federal Trade Commission filed a complaint against BetterHelp and announced a $7.8 million settlement. The FTC billed it as a landmark: one of its first privacy actions against a digital mental health platform, and the first in which the agency moved to return money to consumers whose health data had been mishandled. The order required BetterHelp to stop sharing data with third parties for advertising, to obtain explicit consent for any future data sharing, and to implement a comprehensive privacy program.
The $7.8 million would be used to provide partial refunds to affected consumers.
It was, by every measure of how the FTC operates, a serious response. The agency used the tools it had, moved faster than it often does, and made a public statement that this behavior was illegal and would be punished.
Here is the problem: BetterHelp generated revenues of approximately $780 million in 2022 alone. The fine represented roughly one percent of a single year's revenue. If you ran a store and the penalty for a practice that drove your sales was one percent of a year's revenue, you would not redesign the store to eliminate it. You would pay the fine and keep the practice.
BetterHelp paid. They did not admit wrongdoing. They are still operating. They are still the most downloaded mental health app on earth. The business model that built a billion-dollar company remained fundamentally intact.
And the data? The 5.5 million records sent to Snapchat? The Facebook custom audiences built from mental health data? That information doesn't disappear when a company pays a fine. It lives in advertising platforms' databases, contributing to profile scores and targeting parameters, indefinitely. You cannot unshare what has been shared. You can cancel your BetterHelp account, but you cannot reach into Snapchat's servers and remove the fact that you once sought therapy.
The HIPAA Loophole That Covers All of Them
Many people assumed HIPAA, the federal health privacy law, protected them from exactly this kind of thing. It doesn't. And not because of some obscure exception. It's because HIPAA was designed around a specific model of healthcare that no longer describes how most people interact with mental health services.
HIPAA applies to "covered entities": hospitals, insurance companies, doctors' offices, pharmacies, and the businesses that directly support them. BetterHelp is a technology company. It markets itself as a technology platform that connects users with therapists. The therapists on BetterHelp may themselves be HIPAA-covered entities, but the platform that hosts the connection, stores your intake forms, processes your payments, and controls your account? Not covered.
This isn't specific to BetterHelp. Talkspace, another major online therapy platform, operates the same way. Calm and Headspace, which offer guided meditation and mental wellness content, are not HIPAA-covered. Woebot, the AI therapy chatbot, is not HIPAA-covered. Spring Health, Modern Health, Ginger: the entire consumer mental health app ecosystem exists outside HIPAA's jurisdiction.
HIPAA was passed in 1996. The iPhone was released in 2007. The law was written to govern paper records, fax machines, and billing systems, not apps that run on devices we carry everywhere and that know more about our emotional states than our own families do. Congress has not meaningfully updated it to reflect this reality, which means that the most intimate digital data most people will ever generate (the conversations they have about their mental health, their diagnoses, their crises) sits outside the strongest privacy protections the federal government has to offer.
Why You Can Cancel But Not Unshare
There's a specific cruelty in how data violations like this one play out in practice, one the $7.8 million fine doesn't address.
When you delete your BetterHelp account, BetterHelp may delete their copy of your records. When the FTC order takes effect, BetterHelp stops sharing new data. But the data already shared? It's in Facebook's advertising infrastructure, where it was used to train targeting algorithms. It's in Snapchat's database, contributing to the profile that Snapchat built for everyone who matched your demographic. It's in Criteo's network, informing ad-serving decisions across thousands of websites.
Data, once shared, is effectively permanent in a way that money is not. You can pay back a fine. You cannot un-leak information about two million people's mental health histories. Every user who signed up for BetterHelp before the FTC action is permanently in a slightly different position than they were before they downloaded the app. Their mental health history is somewhere they didn't put it, associated with identifiers that connect it to everything else they do online.
This is what the FTC's enforcement tools cannot fix. They can punish what happened and try to prevent future violations. They cannot reach back in time and remove data from systems that have already ingested it.
The Dark Irony at the Heart of This
Think for a moment about who specifically chose BetterHelp over traditional therapy because they wanted more privacy. These are people who had reasons to be worried about a mental health paper trail. People in jobs with security clearances. People in custody disputes. People in conservative communities where mental health stigma runs deep. People who are simply intensely private and value their psychological autonomy.
The desire for privacy was not incidental to their choice of BetterHelp. It was the reason for the choice.
And exactly this made you valuable to advertisers: the signal that you cared about discretion, that you had something private to protect, that you were willing to pay for a platform specifically because it promised not to create a record. You were not just a therapy user. You were a demonstrated mental health consumer who had revealed, by the act of signing up, your specific sensitivities and vulnerabilities. You were the ideal target for certain advertisers. Your privacy concern was the product.
You didn't just trust them. You trusted them because they promised to be trustworthy. That promise, to the people who needed it most, was precisely what they commodified.
What You Can Actually Do
Check your FTC refund eligibility. If you were a paying BetterHelp user between August 1, 2017 and December 31, 2020, you may be eligible for a partial refund from the $7.8 million settlement fund. The FTC has a claims process at ftc.gov/betterhelp. The amounts are modest, but the eligibility is real.
If you're currently using mental health apps, ask the right questions before you continue. Look for explicit HIPAA Business Associate Agreements. Look for apps that do not use advertising-based revenue models. Look for apps that have undergone third-party privacy audits. Look for apps where the business model depends on you staying well, not on selling your data.
Red flags to watch for in any mental health app: advertising on the platform, free tiers supported by "personalization," vague language about "sharing with partners," no explicit HIPAA compliance claims, and, crucially, any language about "improving our services" that doesn't define exactly what data is used for that purpose. "Improving our services" is the data-sharing industry's favorite euphemism.
Alternatives with stronger protections: In-person or traditional telehealth therapists who bill through insurance are HIPAA-covered, and even direct-pay therapists are bound by state licensing and confidentiality rules that commercial apps are not. Some therapy apps have explicit HIPAA compliance and business associate agreements available on request; ask before signing up. Community mental health centers, university training clinics, and non-profit counseling services typically operate under stronger confidentiality obligations than commercial apps.
Be especially cautious with AI therapy tools. Apps like Woebot, Wysa, and similar AI-driven mental health chatbots are not covered by HIPAA, often have less regulatory scrutiny than human therapist platforms, and collect detailed conversational data about your emotional state over time. The regulatory framework for AI therapy is essentially nonexistent.
The need for accessible mental healthcare is real. BetterHelp identified a genuine problem and built something millions of people needed. That makes what they did worse, not better. They earned trust specifically from the people most vulnerable to having that trust violated, and then they violated it at scale.
The FTC fine sent a message. It was the right message. But at less than one week's revenue for a billion-dollar company, it was not a message that changed the fundamental economics of what BetterHelp did.
You were never the customer. You were the inventory. And unlike a product pulled from a shelf, the inventory has already been distributed, and thereâs no recall coming.
The next time an app promises you privacy as its selling point, ask yourself: what exactly is their business model? Because if the answer is "advertising," then privacy isn't what they're selling. Privacy is what they're selling you.


