Signal has the best encryption in the messaging business. Its protocol is so trusted that WhatsApp, Google Messages, and even Facebook Messenger have adopted it. Cryptographers praise it. Security researchers recommend it. Governments fear it.
And yet, on March 20, 2026, the FBI and CISA issued a joint public service announcement confirming that Russian intelligence-linked operatives had compromised thousands of Signal accounts worldwide — without breaking a single line of its encryption code.
The attackers didn’t crack Signal. They used it exactly as designed. They just made sure they were the ones on the authorized device list.
This is the uncomfortable truth that every privacy-conscious user needs to understand: your tools can be perfect and you can still lose everything. Here’s what happened, how it worked, and what you need to do about it right now.
The Campaign: A Global Operation Against High-Value Targets
The FBI’s public service announcement — the first to directly attribute these campaigns to Russian intelligence services — describes a sustained, coordinated operation targeting what the bureau calls “individuals of high intelligence value.”
That phrase covers a lot of ground: current and former US government officials, active military personnel, political figures, diplomats, journalists, and activists. Essentially anyone whose private conversations would be valuable to a foreign intelligence service conducting surveillance, influence operations, or disinformation campaigns.
“This global campaign has resulted in unauthorized access to thousands of individual commercial messaging application (CMA) accounts,” the joint advisory states.
The campaign wasn’t new — Dutch intelligence agencies had flagged similar activity in early March 2026, German security authorities had warned about state-linked phishing targeting European journalists in February, and France’s Cyber Crisis Coordination Center (C4) issued its own alert on March 20. But the FBI’s advisory was the first to definitively point at Russian intelligence as the source.
Earlier reporting from Google’s Threat Intelligence Group, published in February 2025, identified multiple Russia-aligned threat actors targeting Signal — including APT44, also known as Sandworm or Seashell Blizzard. APT44 is a unit within Russia’s GRU (military intelligence), specifically the Main Centre for Special Technologies (GTsST). This group has been behind some of the most destructive cyberattacks in history, including NotPetya and attacks on Ukrainian infrastructure.
The March 2026 campaign represents an escalation: a broader, more systematic operation that has moved beyond Ukraine-focused targeting to encompass anyone the Russian state views as strategically useful.
How It Worked: The Linked Device Exploit
Signal’s encryption is end-to-end and rock-solid. No one — not Signal, not governments, not Russian intelligence — can decrypt messages in transit. So the attackers didn’t bother trying.
Instead, they exploited a completely legitimate Signal feature: Linked Devices.
Signal lets you use your account on multiple devices simultaneously — your phone and a desktop app, for example. To link a new device, you scan a QR code in the Signal app. It’s a useful feature. It’s also, as it turns out, a weapon.
Here’s how the attack played out in practice:
Method 1: Malicious QR Code Phishing
Attackers sent targets carefully crafted phishing messages impersonating Signal support, trusted contacts, or organizations the target would recognize. These messages claimed the target needed to take urgent action — verify their account, accept a group invitation, or complete a security check.
The message included a QR code. But instead of being a legitimate Signal group invite or verification code, the QR code was a device linking code that would silently add the attacker’s device as an authorized linked device on the target’s Signal account.
The moment the target scanned that code, the attacker’s device gained full access to their Signal account — including the ability to read all incoming and outgoing messages in real time. The target would see no warning, no notification, no indication anything had changed. Their conversations would continue appearing encrypted and private, while being simultaneously mirrored to an attacker halfway around the world.
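The danger is that a device-linking QR code looks like any other QR code. Based on publicly observed URI shapes (not official Signal documentation), linking codes decode to a `sgnl://linkdevice?...` URI, while group invites and contact links decode to ordinary `https://signal.group/...` and `https://signal.me/...` URLs. A hedged sketch of how the decoded payloads differ, and why that distinction matters:

```python
from urllib.parse import urlparse

def classify_signal_qr(payload: str) -> str:
    """Classify a decoded Signal-related QR payload.

    Illustrative only; assumes these publicly observed URI shapes:
      - device-linking codes:  sgnl://linkdevice?uuid=...&pub_key=...
      - group invite links:    https://signal.group/#...
      - contact links:         https://signal.me/#p/...
    """
    u = urlparse(payload)
    if u.scheme == "sgnl" and u.netloc == "linkdevice":
        # This is the dangerous case: scanning it authorizes a new
        # device to receive all future messages on the account.
        return "device link: hostile unless you initiated linking yourself"
    if u.scheme == "https" and u.netloc == "signal.group":
        return "group invite"
    if u.scheme == "https" and u.netloc == "signal.me":
        return "contact link"
    return "unknown: do not scan"
```

The point is not that users should parse QR codes by hand, but that a "group invite" in a phishing message can carry a linking payload instead, and nothing about the printed square reveals which it is.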
Method 2: Verification Code Theft and Account Takeover
In other cases, attackers went for complete account takeover rather than silent access. Posing as Signal support or trusted parties, they convinced targets to share their one-time verification codes or PINs — often framed as a “security verification” or “account recovery” step.
With a verification code, the attacker could register the target’s phone number on a new device, effectively stealing the account entirely and locking the original user out.
Battlefield Exploitation
Google’s threat intelligence reporting documented a particularly alarming use case: GRU-linked forces using malicious QR codes on captured devices on the front lines of the Ukraine conflict. When Ukrainian soldiers’ phones were seized, Russian forces would link those accounts to attacker-controlled infrastructure, gaining access to the battlefield communications of Ukrainian military personnel.
It’s a stark illustration of how these techniques translate from espionage to active warfare.
Why This Attack Was So Effective
The attack's effectiveness came from several converging factors:
Trust in the brand. Signal’s reputation for security is so strong that users let their guard down. “If I’m on Signal, I’m safe” — that belief became a vulnerability. People who would scrutinize a suspicious email were more likely to comply with a “Signal security” request without hesitation.
Legitimate feature abuse. The linked devices feature isn’t a bug. It works exactly as intended. There’s no malware to detect, no exploit to patch, no anomaly to flag. A successful QR code link looks identical to a legitimate one.
Targeted, high-quality attacks. These weren’t mass spam campaigns. Attackers identified specific high-value targets, researched their contacts and organizational affiliations, and crafted messages that appeared plausible and urgent to that specific person. A journalist might receive a message appearing to come from a source. A government official might receive something that looked like an official inter-agency communication.
No technical barrier. The attack requires no technical sophistication from the attacker beyond social engineering and access to Signal’s linked-devices functionality. Which means it scales.
Who Was Targeted — and Why You Should Care Even If You’re Not “High Value”
The official advisories focus on high-profile targets: government officials, military, journalists. But the implications extend much further.
First, the definition of “high value” is broader than it sounds. Activists, researchers, lawyers handling sensitive cases, healthcare workers in politically sensitive contexts, whistleblowers, dissidents from countries with authoritarian governments — all of these people have legitimate reasons to use Signal and all could be targets of a sophisticated state actor.
Second, you may be a vector. If a contact of a high-value target is compromised, the attacker gains access to conversations with that target. Your device doesn’t need to be the primary objective; it just needs to be a bridge.
Third — and this is the uncomfortable part — this campaign almost certainly won’t be the last. The success of this technique against thousands of users means it will be replicated, refined, and deployed again. By Russian intelligence, yes. But also by other state actors, organized crime groups, and sophisticated individual attackers who recognize a working playbook when they see one.
Signal was so concerned about this attack vector that it updated its app to add additional visual warnings and friction to the device-linking process specifically to counter QR code phishing. That update is live — but it only helps people on updated versions of the app who are paying attention to the warnings.
What Every Signal User Must Do Right Now
This is the part that actually matters. Here are concrete, actionable steps you should take immediately:
1. Audit Your Linked Devices — Right Now
Open Signal → Settings → Linked Devices. You should see every device currently connected to your account. If you see anything you don’t recognize, remove it immediately. Do not wait.
If you’ve never checked this list before, check it today. If an attacker ever successfully linked a device, they have been quietly reading your messages without you ever knowing.
2. Enable Registration Lock
Signal’s Registration Lock feature ties your account to a PIN, preventing anyone from re-registering your phone number on a new device without it.
Go to Signal → Settings → Account → Registration Lock → Enable it. Choose a strong PIN (not a birthday, not a simple pattern). Store it somewhere you won’t lose it — this is important, because if you forget it and need to recover your account, you’ll face a delay.
This single step makes account takeover via stolen verification codes dramatically harder.
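"Not a birthday, not a simple pattern" can be made concrete. The following is an illustrative heuristic check, not Signal's own PIN policy, that rejects the most common weak choices: repeated digits, ascending or descending runs, and year-prefixed date formats.

```python
import re

def is_weak_pin(pin: str) -> bool:
    """Heuristic weak-PIN check (illustrative, not Signal's policy)."""
    if not pin.isdigit() or len(pin) < 6:
        return True                              # too short or non-numeric
    if len(set(pin)) == 1:
        return True                              # 000000, 111111, ...
    digits = [int(c) for c in pin]
    steps = {b - a for a, b in zip(digits, digits[1:])}
    if steps <= {1} or steps <= {-1}:
        return True                              # 123456, 654321, ...
    if re.fullmatch(r"(19|20)\d{6}", pin):
        return True                              # date-like, e.g. 19840522
    return False
```

A PIN that passes a check like this and is stored in a password manager costs nothing in convenience but removes the guesses an attacker will try first.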
3. Update Signal Now
Make sure you’re running the latest version of Signal on all your devices. The linked-device warning UI improvements are in recent versions. Updates also patch security vulnerabilities that might be used alongside social engineering.
4. Never Scan a QR Code You Didn’t Explicitly Request
This is the hardest habit to build but the most important. If someone sends you a QR code and says “scan this for Signal” — stop. Ask yourself: did I request this? Is there any legitimate reason this person would be sending me a Signal QR code?
The legitimate use case for scanning a Signal QR code is when you are deliberately adding a new device to your own account by going to Settings → Linked Devices → Link New Device. If a QR code appears in a message, an email, or a website without you initiating that process, treat it as hostile.
5. Be Skeptical of “Signal Support” Contacts
Signal does not proactively contact users. There is no Signal support team that will message you on Signal asking you to verify your account, scan a code, or share your PIN. Any message claiming to be from Signal support is a social engineering attempt. Report it and ignore it.
6. Enable a Screen Lock
If your device is ever lost, stolen, or seized, a screen lock prevents immediate physical access. Signal also has its own app lock (Settings → Privacy → Screen Lock) that requires biometric or PIN authentication to open the app, separate from your device lock.
7. Use a Strong PIN for Signal (Not Tied to Your Phone PIN)
If you enable Registration Lock, make sure the PIN is unique to Signal and not the same as your phone’s unlock PIN or other commonly used codes. Layers of different authentication reduce the blast radius of any single compromise.
The Bigger Lesson: Operational Security Is Not Optional
This campaign is a case study in a truth that security professionals have been saying for years: technology cannot save you from yourself.
Signal’s encryption is among the most robust ever designed for a consumer product. Its protocol has been audited repeatedly by independent researchers. It stores minimal metadata. It’s designed from the ground up to protect user privacy.
None of that matters if you scan a malicious QR code.
This is the fundamental gap between technical security and operational security (OpSec). Technical security refers to the properties of the tools you use — encryption strength, vulnerability patching, secure coding practices. Operational security refers to how you use those tools — the habits, awareness, and behaviors that determine whether those technical properties actually protect you.
You can have Fort Knox in your pocket and still hand a stranger the key if they ask nicely enough.
The Russian campaign against Signal didn’t break encryption. It broke operational discipline — and it did so at scale, against thousands of people who believed that using Signal was sufficient protection.
It isn’t. Signal is a necessary component of secure communications. It is not a complete solution.
True privacy protection requires:
- Awareness of social engineering and the specific techniques being used against your category of user
- Habits around verification — not trusting unexpected requests, verifying through known-good channels
- Regular security hygiene — auditing linked devices, keeping software updated, reviewing account settings
- Threat modeling — understanding who might target you, why, and through what vectors
None of these are complicated. All of them require conscious attention that most people don’t apply to their messaging apps.
What Signal Has Done — and What Comes Next
To Signal’s credit, the company responded quickly when Google researchers disclosed the linked-device attack vector in early 2025. Signal updated its app to add more prominent warnings when a device-linking QR code is scanned, making it harder to be tricked without noticing.
But app updates are not retroactive, and they don’t reach every user immediately. The thousands of accounts compromised in this campaign included people who were deceived before the UI warnings shipped, and warnings only help users who actually read them.
Going forward, we can expect Russian intelligence (and other state actors who now see this playbook in action) to adapt. They’ll try new social engineering lures, target new categories of users, and look for the next feature in the next app that can be abused in the same way.
This is an arms race, and users are the front line.
Conclusion: Privacy Requires Participation
For years, the privacy community has focused on getting people to use better tools. Signal instead of SMS. End-to-end encryption instead of plaintext. That work matters and has made the world meaningfully more private.
But the Russian Signal campaign is a watershed moment. It demonstrates, at documented scale, that the tools are no longer the weak link. We are.
Russian intelligence wasn’t defeated by encryption. It went around it. And it will keep going around it until users understand that privacy is a practice, not an app.
The good news is that the practice isn’t hard. Audit your linked devices. Enable registration lock. Never scan a Signal QR code you didn’t request. Update your apps. Be skeptical of “support” contacts. These steps take minutes and close the door on the exact attack vector that compromised thousands of accounts.
The encryption is fine. Now it’s your turn to hold up your end of the deal.
Sources: FBI/CISA Joint Public Service Announcement (March 20, 2026); France C4 Alert (March 20, 2026); Dutch AIVD advisory (March 2026); Google Threat Intelligence Group — “Signals of Trouble” (February 2025); BleepingComputer; Help Net Security.


