If you use Signal on Windows, you might notice something new: you can’t take screenshots of your chats anymore. That’s not a bug — it’s a privacy feature.
Signal has just rolled out an update that blocks screen captures by default on its Windows desktop app. The move is a direct response to a controversial new Windows 11 feature called Recall — and it’s raising big questions about AI, privacy, and the future of digital security.
What Is Windows Recall, and Why Is It Controversial?
Imagine your PC taking a snapshot of your screen every few seconds — quietly and automatically.
That’s what Recall does. It’s an AI-powered feature on Copilot+ PCs running Windows 11 that builds a searchable timeline of everything you’ve looked at: websites, emails, even private apps like Signal. While it might sound helpful for finding that file you forgot the name of, it also means your sensitive conversations could be stored on disk without your knowledge.
Microsoft currently offers no dedicated API for apps to exclude their content from Recall’s snapshots. That’s what pushed Signal to act.
How Signal’s New Screen Security Works
Here’s what Signal’s update does under the hood:
| What It Does | Why It Matters |
| --- | --- |
| Blocks screenshots by default | Prevents Recall or manual captures from revealing chat content |
| Shows a black screen if captured | Similar to how Netflix prevents screen recording of movies |
| Can be toggled off manually | Users still have control if they need to take screenshots |
Think of it as DRM for privacy — only instead of protecting content from piracy, it’s protecting your conversations from being logged by your operating system.
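Signal hasn’t published its implementation details in this article, but Signal Desktop is an Electron app, and Electron already ships exactly this kind of switch: `BrowserWindow.setContentProtection()`, which on Windows maps to the Win32 call `SetWindowDisplayAffinity` with the `WDA_EXCLUDEFROMCAPTURE` flag. Here’s a minimal sketch of how an app could wire a "screen security" setting to it (the window setup and `screenSecurity` parameter are illustrative, not Signal’s actual code):

```typescript
import { app, BrowserWindow } from "electron";

// Illustrative sketch: create a window whose contents are hidden from
// screen capture when the user's "screen security" setting is on.
function createWindow(screenSecurity: boolean): BrowserWindow {
  const win = new BrowserWindow({ width: 800, height: 600 });

  // true  → capture tools (Recall, screenshots, screen sharing) get black
  //         pixels instead of the window's contents.
  // false → the window is captured normally, e.g. after the user flips
  //         the toggle off in settings.
  win.setContentProtection(screenSecurity);
  return win;
}

app.whenReady().then(() => createWindow(true));
```

The "black screen" behavior in the table above falls out of this one flag: the operating system excludes the protected window from every capture path, so Recall’s snapshots record it as a blank rectangle rather than your chat.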
Want Screenshots Back? You Still Have That Option
Signal knows not everyone will want this restriction. If you rely on screenshots for accessibility, troubleshooting, or documentation, here’s how to turn the feature off:
- Open the Signal desktop app.
- Go to Settings > Privacy.
- Toggle off Screen Security to allow screenshots again.
This way, Signal puts privacy first — but leaves the final choice up to you.
A Wake-Up Call for App Developers
What Signal’s doing here isn’t just about one feature or one app. It’s part of a larger shift: as operating systems start logging more data to support AI features, apps will need to fight back to keep user information private.
We may start seeing more end-to-end encrypted tools adopt similar protections — especially if OS-level AI tools can access things users assumed were off-limits.