Signal co-founder Moxie Marlinspike’s new AI assistant Confer enters the chat with a very specific promise: it doesn’t want to know you.
That alone makes it worth a closer look.
This tool positions itself as a privacy-conscious alternative to mainstream AI assistants—and unlike most launches, the value isn’t in flashier outputs or bigger models. It’s in the architecture.

So let’s talk features.
What It Does (and Intentionally Doesn’t)
1. No conversation storage
Chats aren’t saved, logged, or reviewed later. Once the session ends, it’s gone. That means no long-term memory, no personal history building quietly in the background.
For users who want an AI to help but not remember, this is a meaningful distinction.
2. No training on your data
Your prompts aren’t used to improve future models. This is a clear departure from most AI tools, where usage feeds optimization.
The tradeoff? Less personalization over time. The benefit? Clearer boundaries.
3. Encrypted, limited-visibility infrastructure
The system is designed so even the host can’t easily access user conversations. This mirrors Signal’s philosophy: if you can’t see the data, you can’t misuse it.
This isn’t a “trust us” feature—it’s a structural choice.
4. Simpler by design
You won’t find persistent memory, emotional mirroring, or companion-style engagement here. The assistant does the task, answers the question, and steps back.
It feels more like a tool than a presence—and for some users, that’s the point.
Who This Is (and Isn’t) For
This isn’t the AI you use to build a long-term creative collaborator or a deeply personalized workflow assistant. If you want continuity, memory, or emotional resonance, this may feel sparse.
But if you want:
One-off thinking support
Research help without a data trail
A calmer, lower-stakes interaction
This approach makes sense.
The Glitchy Genius Take
What stands out here isn’t that this AI is “better.” It’s that it’s clearer.
Clear about limits. Clear about tradeoffs. Clear about what it refuses to do.
In a moment when AI tools are racing to become more personal, more embedded, more intimate, this one quietly asks a different question: What if helpful didn’t require proximity?
Not every system needs to know you. Some just need to work—and then let you go.
That restraint is the feature. What would “helpful” look like if privacy came first?
Source: Brandom, R. (2026, January 18). Moxie Marlinspike has a privacy-conscious alternative to ChatGPT. TechCrunch. https://techcrunch.com/2026/01/18/moxie-marlinspike-has-a-privacy-conscious-alternative-to-chatgpt/

Want to go deeper? Aligned, Not Automated is for anyone who wants to live and work with intention in systems designed to optimize, track, and influence us. If this piece resonated, the book expands the conversation.