The EU Just Put Snapchat on Trial: What the DSA Investigation Means for Your Kids
On March 26, 2026, the European Commission opened a formal investigation into Snap Inc. under the Digital Services Act, and the conduct it suspects is deeply alarming. Regulators believe adults are actively masquerading as teenagers on Snapchat to recruit children into sexual exploitation and criminal activity. Half of all 10-year-olds in Denmark are on the platform. A third of France’s 11-year-olds use it regularly. And Snapchat’s own age-verification system? Regulators say it’s essentially an honor system that bad actors can bypass in seconds.
This is not a minor regulatory complaint. It is the opening shot of what could become the most consequential child safety case in European tech law history.
What the European Commission Is Actually Alleging
The Commission’s investigation, announced in Brussels on March 26, 2026, falls under the Digital Services Act (DSA) — the EU’s sweeping framework that holds large online platforms accountable for the harms their systems enable. Snapchat, with 94.7 million monthly active users in the EU, qualifies as a Very Large Online Platform (VLOP) under the DSA and is subject to its strictest obligations.
The Commission says it has reason to believe Snapchat has breached multiple DSA obligations. Specifically:
1. Adults Impersonating Minors for Grooming and Exploitation
The single most disturbing allegation is that Snap’s platform design actively enables adult predators to pretend to be teenagers. The DSA requires platforms to conduct thorough risk assessments of their systems and implement mitigation measures. The Commission believes Snap’s mitigations are insufficient, and that the platform’s architecture makes predatory behavior easy.
Investigators allege that:
- Adults can create accounts claiming to be minors without any meaningful verification
- There is no mechanism for users to report suspected underage accounts — meaning if you encounter an adult pretending to be a child (or a child who shouldn’t have an account), you have no direct way to flag it
- The platform also fails to provide clear guidance on how to report illegal content, leaving victims without obvious recourse
2. The “Find Friends” Feature — A Direct Line to Children
One of the Commission’s most pointed concerns involves Snapchat’s “Find Friends” and account recommendation features. According to regulators, these features actively recommend child and teen accounts to other users — including adults who have no legitimate connection to them.
This is not a theoretical risk. Grooming almost always begins with an initial point of contact, and recommendation algorithms that surface children’s accounts to unknown adults create exactly the opportunity predators need. The feature, investigators suggest, may directly contradict Snapchat’s own stated child safety policies.
3. The Age Verification System Is Broken
Snapchat requires users to be at least 13 to create an account. But the Commission examined how that requirement is enforced and found it relies almost entirely on self-declaration — users simply typing in a date of birth. There is no technical verification, no document check, no cross-reference with any external data.
The result: half of Denmark’s 10-year-olds are using a platform they’re not supposed to be on. EU regulators also questioned whether the platform adequately differentiates between users under 13, users aged 13-17, and adults over 18 for the purpose of delivering age-appropriate experiences. Snapchat offers an “age-appropriate experience” for users who declare themselves under 17, but if users can simply lie about their age, the safeguard is essentially useless.
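To see why regulators call this an honor system, consider a minimal sketch of what a self-declaration gate amounts to in code. This is a hypothetical illustration, not Snap’s actual implementation; the function names and the flow are invented. The only “check” is whether the birthdate the user chooses to type yields an age of at least 13:

```python
from datetime import date

MINIMUM_AGE = 13  # Snapchat's stated minimum age

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Compute age in whole years from a user-supplied birthdate."""
    years = today.year - birthdate.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def may_register(claimed_birthdate: date) -> bool:
    """Self-declaration gate: trusts whatever birthdate the user types.
    Nothing cross-checks the claim against documents or external data."""
    return age_from_birthdate(claimed_birthdate, date.today()) >= MINIMUM_AGE

# A 10-year-old who simply backdates the birthdate sails through:
print(may_register(date(1990, 1, 1)))  # True
```

A gate like this rejects only honest children. Anyone willing to type a different year passes, which is exactly the bypass-in-seconds problem the Commission describes.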
4. Drug and Illegal Product Information
Beyond sexual exploitation, the Commission also identified concerns that Snapchat is being used as a channel for information about illegal drugs, alcohol, and vapes — age-restricted products that minors should not be able to access. The investigation will examine whether Snap’s content moderation systems adequately detect and remove this material before it reaches young users.
The Legal Framework: What the DSA Demands
To understand why this investigation is significant, it’s worth understanding what the Digital Services Act actually requires.
The DSA came into force for Very Large Online Platforms in August 2023. It imposes a strict risk management framework: platforms must identify, analyze, and mitigate systemic risks their services create — including risks to children, to fundamental rights, and to civic discourse.
For child safety specifically, the DSA requires:
- Robust age assurance mechanisms that go beyond self-declaration
- Minor-specific privacy defaults that limit who can contact children
- Accessible reporting tools for illegal content and suspected abuse
- Annual risk assessments reviewed by independent auditors
- Cooperation with regulators including document production and interviews
The Commission’s investigation is based in part on its review of Snapchat’s risk assessments from 2023 to 2025 — three years of documentation — as well as an information request sent to Snap in October 2025. What those documents revealed apparently gave Brussels enough concern to escalate to a formal investigation.
What Fines Does Snap Face?
Under the DSA, fines for violations can reach 6% of a company’s total annual worldwide turnover. For Snap Inc., which reported approximately $5.4 billion in revenue in 2025, that means potential fines of up to $324 million for a single violation.
But the financial penalty may not be the most consequential outcome. The DSA also empowers the Commission to:
- Order temporary suspension of features (including the “Find Friends” algorithm)
- Mandate specific technical changes to age verification systems
- Require independent audits of Snap’s child safety measures
- In extreme cases of systemic non-compliance, temporarily ban access to the service for EU users
This last option, an outright platform ban, is a nuclear option the EU has never exercised, but one regulators have not ruled out. The EU is also weighing whether to follow Australia’s lead in banning social media for users under 16. An adverse finding in this investigation would add significant momentum to that debate.
Context: A Platform Under Siege from All Directions
The Snapchat investigation did not happen in isolation. The Commission announced the DSA action on the same day it also opened proceedings against four major pornographic websites for failing to prevent minors from accessing adult content — a coordinated signal that Brussels is escalating its enforcement posture across the entire digital landscape.
More broadly, the timing is no coincidence:
- March 25, 2026: A Los Angeles jury found Meta and YouTube liable for creating addictive products that harmed a young user — the first such verdict in the United States, and one that sent shockwaves through Silicon Valley
- 2025: Both Snap and TikTok settled lawsuits alleging their platforms caused addiction and harm to minors
- 2025: The European Parliament passed a resolution supporting a social media ban for under-16s, following Australia’s controversial but widely supported move in late 2025
- 2025: The UK’s Online Safety Act is actively reshaping how platforms handle child safety in Britain, creating additional regulatory pressure on US companies operating in Europe
Snap is not alone in facing scrutiny. TikTok, Meta, YouTube, and Discord have all faced DSA-related proceedings or investigations in the past 18 months. But the specificity of the Snapchat investigation — the explicit allegation that adults are impersonating children on the platform — makes this case particularly stark.
What Snapchat Says
Snap’s official response has been measured and cooperative in tone. A company spokesperson said:
“The safety and wellbeing of all Snapchatters is a top priority, and our teams have worked for years to raise the bar on safety. As online risks evolve, we continuously review, strengthen, and invest in these safeguards.”
The company also noted it has acted “proactively and transparently” to meet DSA requirements and pledged full cooperation with the Commission’s investigation.
It’s worth noting that Snap has taken some child safety steps in recent years. In 2023, the platform introduced features to limit interaction between teenagers and strangers, including increasing the number of mutual friends required before strangers can search for or find teen accounts. It also launched “Family Center,” a parental supervision tool that allows parents to see who their children are messaging (though not the contents of messages).
Regulators, however, appear to believe these measures are insufficient given the scale of the risk — and the data showing millions of underage children actively using the platform regardless of its stated age limits.
The Age Verification Problem Is Industry-Wide
It would be unfair to single out Snapchat without acknowledging that effective age verification on social media platforms remains an unsolved technical and policy problem across the entire industry.
Self-declaration — asking users to type in a birthdate — is nearly universal because it is easy to implement and requires no collection of sensitive identity documents. The alternatives are genuinely difficult:
Document verification (uploading a passport or ID) is privacy-invasive, creates data security risks, and excludes young people in households where parents control documents.
Biometric age estimation (using AI to guess age from a selfie) is imprecise, raises significant racial bias concerns, and is itself a form of biometric data collection that many privacy laws restrict.
Device-level age verification (parents confirming through a trusted device) has shown promise in limited implementations but requires ecosystem-wide adoption that doesn’t yet exist.
Credit card or financial verification (requiring a payment method linked to an adult) effectively bans low-income users who lack cards and creates financial data collection risks.
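To make the device-level and digital-identity direction concrete, here is an equally simplified sketch of an attestation-based check: the platform never sees a birthdate at all, only a signed age-bracket claim it can verify. Everything here is hypothetical: the issuer, the shared key, and the claim format are invented, a real deployment would use public-key credentials rather than a shared secret, and this is far simpler than anything a production digital-identity framework would specify:

```python
import hashlib
import hmac
import json

# Demo-only shared secret; a real scheme would use public-key credentials
# issued by a trusted party (e.g., a digital-identity wallet).
ISSUER_KEY = b"shared-secret-with-trusted-issuer"

def sign_attestation(claims: dict) -> str:
    """Issuer side: sign an age-bracket claim (no birthdate included)."""
    payload = json.dumps(claims, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()

def verify_age_bracket(claims: dict, signature: str) -> str | None:
    """Platform side: accept the claim only if the signature checks out."""
    payload = json.dumps(claims, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return None  # forged or tampered attestation
    return claims.get("age_bracket")  # e.g. "under_13", "13_17", "18_plus"

claims = {"age_bracket": "13_17"}
sig = sign_attestation(claims)
print(verify_age_bracket(claims, sig))                      # 13_17
print(verify_age_bracket({"age_bracket": "18_plus"}, sig))  # None: claim altered
```

The appeal of this design is that the hard problem, establishing a user’s age, moves to a trusted issuer, while the platform’s job shrinks to verifying a signature. The unsolved part is the one noted above: no such issuer ecosystem is broadly deployed yet.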
The UK’s Age Appropriate Design Code and the EU’s emerging eIDAS 2.0 digital identity framework may eventually provide workable technical solutions, but those systems are years away from broad deployment. In the meantime, Snap and other platforms are caught between regulatory demands for robust verification and a technical landscape that makes truly robust verification genuinely hard.
This doesn’t excuse Snapchat’s failures — it contextualizes them.
What Parents and Guardians Should Do Right Now
Regardless of how the investigation concludes, the data speaks for itself: millions of children under 13 are using Snapchat today, and millions more aged 13-17 are using it with minimal safety guardrails. Here’s what you can do now:
Check if your child has an account
Open the Snapchat app on their device or search for their phone number or email at accounts.snapchat.com. Minors often create accounts with parents’ email addresses or throwaway emails.
Enable Family Center
Snapchat’s Family Center feature (Settings > Family Center) allows a linked parent account to see who a teen is communicating with, see who has sent friend requests, and report concerning content on the teen’s behalf. Crucially, parents cannot read message content — but they can see the contact list.
Review Friend Lists and Privacy Settings
Make sure your child’s account is set to Friends Only for everything: who can contact them, who can see their Story, and who can find them via search. The “Quick Add” and “Find Friends” features — the ones regulators are scrutinizing — can be disabled in settings.
Talk about impersonation specifically
Have an explicit conversation about the fact that people online are not always who they claim to be. The Commission’s investigation underscores that this is not just a theoretical concern: regulators believe adults are actively pretending to be teenagers on this platform. Your child needs to know this.
Report and block immediately
If your child reports anything uncomfortable, use the in-app report function (press and hold on a message or profile), then contact the National Center for Missing & Exploited Children (NCMEC) at CyberTipline.org or, in the UK, the Internet Watch Foundation at iwf.org.uk.
What Comes Next
The Commission has now sent interview invitations and information requests to Snap as part of the formal investigation process. There is no fixed timeline, but DSA investigations typically take between 12 and 24 months to conclude.
During the investigation, the Commission can issue interim measures if it believes there is a risk of serious harm — meaning it could order Snapchat to make specific changes before any final ruling. Given the child safety context, interim measures are a realistic possibility.
For the broader digital safety landscape, this case sets a critical precedent. The DSA has teeth. The EU has demonstrated willingness to use them against the largest platforms in the world. And the specific allegation — that a platform’s design enables predators to impersonate children — is exactly the kind of systemic risk the law was written to address.
The Snapchat ghost, it turns out, isn’t just a mascot. It reflects how accountability has worked on this platform for a decade: there, but invisible.
The European Commission intends to change that.
Sources: European Commission press release IP/26/723 (March 26, 2026), The Guardian, Euronews, Dataconomy, US News & World Report. All statistical figures attributed to official EC communications.