The Signal Creator Just Launched an AI That Can't Read Your Chats—Even If You Pay It To
Inside Confer: How Signal's encryption model works for AI (and what it costs)
Remember that moment of hesitation before you paste something personal into ChatGPT?
Maybe it’s a health question you’re too embarrassed to ask your doctor. A work dilemma that could get awkward if leaked. Or just the kind of half-formed thought you’d normally keep in a journal—except now you’re typing it into what’s essentially a very smart API endpoint connected to someone else’s server farm.
That pause? It’s your privacy instinct kicking in. And according to Moxie Marlinspike—the engineer who built Signal and made encrypted messaging mainstream—we’ve been ignoring it for way too long.
His solution is Confer, a new AI chatbot that treats your conversations like Signal treats your texts: encrypted end-to-end, with keys that never leave your device.
Why Today’s AI Is a Privacy Nightmare
Let’s not sugarcoat this: when you use ChatGPT, Gemini, or Claude, you’re handing over your thoughts in plaintext to companies that store every word you type (even “deleted” chats aren’t really gone). They can be forced by courts to preserve and hand over your logs, use human contractors to review conversations, and may train future models on your inputs.
A court order in May 2025 required OpenAI to preserve all ChatGPT user logs, including deleted chats. OpenAI CEO Sam Altman has admitted that even therapy sessions on the platform may not stay private.
Marlinspike puts it bluntly in his launch post: using today’s AI is like “confessing to a data lake”—one that never forgets, can be searched by your employer’s legal team, and might someday be monetized in ways we haven’t even imagined yet.
How Confer Actually Works
Confer flips the entire model.
Your prompts are encrypted on your device before they’re sent anywhere. The AI processes them inside something called a Trusted Execution Environment (TEE)—basically a secure hardware vault that even Confer’s own engineers can’t peek into. Responses get encrypted on the way back out.
Instead of passwords, Confer uses passkeys like Face ID, Touch ID, or your device unlock PIN to generate encryption keys. These keys stay on your phone or laptop. Period.
The clever part? Confer’s entire codebase is open source and verifiable. Anyone can clone the repository, rebuild it, and confirm the measurements match what’s actually running on the servers. There’s even a public transparency log so Confer can’t quietly swap in a different version for specific users.
It’s cryptographic proof, not just a promise in a privacy policy.
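The verification idea can be sketched in a few lines. This is a simplified model of the concept, not Confer's implementation: real systems use remote attestation and Merkle-tree logs, while here a "measurement" is just a SHA-256 digest and the log is a plain hash chain; the class and function names are invented for illustration.

```python
import hashlib

def measure(artifact: bytes) -> str:
    # A "measurement" here is simply the digest of a built artifact.
    return hashlib.sha256(artifact).hexdigest()

class TransparencyLog:
    # Append-only hash chain: each head commits to every prior entry,
    # so a quietly swapped-in version would change the published head.
    def __init__(self):
        self.entries = []
        self.head = hashlib.sha256(b"genesis").hexdigest()

    def append(self, measurement: str) -> str:
        self.head = hashlib.sha256((self.head + measurement).encode()).hexdigest()
        self.entries.append(measurement)
        return self.head

    def replay(self) -> str:
        # Anyone can recompute the head from the public entries.
        h = hashlib.sha256(b"genesis").hexdigest()
        for m in self.entries:
            h = hashlib.sha256((h + m).encode()).hexdigest()
        return h

# A verifier rebuilds the open-source code locally and compares its
# measurement against what the server claims to be running.
local_build = measure(b"rebuilt-from-source")
server_claim = measure(b"rebuilt-from-source")
log = TransparencyLog()
log.append(server_claim)
```

If the server ran different code, its measurement would not match the locally rebuilt one, and any attempt to serve a special version to one user would leave (or conspicuously omit) an entry in the public chain.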
The Reality Check: Bare-Bones and Expensive
Here’s where things get complicated.
Confer offers a free “Guest” tier that’s brutally limited: 20 messages per day, 5 active chats, and a basic AI model. That might work for occasional questions, but if you’re planning to use it as your daily AI assistant, you’ll hit the ceiling fast.
Want unlimited access? That'll be $34.99 per month, roughly 75 percent more than ChatGPT Plus ($20) or Claude Pro ($20), and in the same ballpark as Microsoft's Copilot add-on ($30).
For that premium, you get unlimited messages and chats, advanced AI models, personalization features, and the same end-to-end encryption (which is the whole point). But you don’t get image upload or generation, project organization, the polish and ecosystem of established alternatives, or years of refinement and bug fixes.
Confer has web search integration and outputs clean markdown, which is great. The AI itself appears capable for a brand-new service. But the feature set is minimal, and at $35/month, you’re paying a significant privacy premium for a product that’s still finding its footing.
Who Should Actually Use This?
Confer makes sense if you work with genuinely sensitive information (therapists, lawyers, doctors, journalists), handle proprietary code or business strategy where a single leak could be catastrophic, operate in high-risk regions where AI surveillance isn't paranoia but reality, or have the budget to prioritize privacy over features and refinement.
It does not make sense if you need the best AI performance available, want features like image generation or document analysis or team collaboration, expect the free tier to be usable for regular work, or are price-sensitive (you can get two mainstream AI subscriptions for roughly the same cost).
The Bigger Question Confer Raises
We’ve normalized something strange: handing our most private thoughts to companies optimized for data extraction.
We treat AI like a therapist, a strategist, a coding partner—but unlike an actual therapist, ChatGPT stores transcripts of every session and shares them with whoever runs the server. As Marlinspike warns, when advertising comes to AI assistants, they’ll be armed with total knowledge of your context, your concerns, your hesitations.
Confer proves we don’t have to accept that trade-off. Strong encryption and powerful AI can coexist—the tech exists, it’s just that most AI companies have chosen not to build it this way.
Is $35/month too much for that peace of mind? That depends entirely on what you’re protecting.
If you’re using AI for casual questions and creative projects, probably yes. But if you’re a therapist taking session notes, a lawyer strategizing case approaches, or anyone handling information that could genuinely harm you or others if exposed, the premium might be worth it.
The real test isn’t whether Confer becomes the next ChatGPT. It’s whether the idea catches on—whether enough people demand privacy-by-design that bigger players feel pressure to adopt it.
Signal didn’t become ubiquitous overnight. It took years for encrypted messaging to go mainstream. Confer might follow a similar path. Or it might carve out a niche as the privacy option for professionals who can’t afford data leaks.
Try It Yourself
Want to test Confer? The free tier gives you 20 messages a day to see if the interface and AI quality work for you.
If you decide to upgrade to a paid membership, use referral code PR1V4T3 during signup—you’ll get your first month free, and it helps support independent privacy-focused journalism like this. (Full disclosure: I also get a free month, which lets me continue testing and writing about privacy tools without worrying about subscription costs stacking up.)
Just know that after the free month, you’re looking at $35/month. You’re not buying polish or a massive feature set—you’re buying privacy infrastructure that actually works.
Question for You
Have you checked what you've actually shared with AI assistants this week? And if someone else read those conversations tomorrow, would you be comfortable with that?