Moxie’s Confer: End-to-End Encrypted AI Assistant

Key Takeaways:

  • Moxie Marlinspike launched Confer, an open-source AI assistant designed with end-to-end encryption so only account holders can read prompts and responses.
  • Confer uses passkeys, client-side keys, and a Trusted Execution Environment (TEE) with remote attestation and transparency logs to prove the server runs auditable code.
  • Conversations sync across devices in encrypted form; macOS, iOS, and Android have native support, while Windows and Linux require workarounds.
  • Alternatives such as Proton’s Lumo and Venice offer privacy-focused LLMs, but major platforms still lack true E2EE protections and face subpoena/human-review carve-outs.

What is Confer and why it matters

Confer is a privacy-first AI assistant built by Moxie Marlinspike, the engineer behind Signal. It aims to make interacting with large language models (LLMs) as private and user-controlled as modern secure messaging.

Unlike mainstream LLM services, Confer’s design prevents the operator — or anyone with server access — from reading user prompts, model outputs, or stored conversations. The project is open source so third parties can inspect the entire stack.

How Confer protects your data

Confer combines device-resident passkeys with client-side encryption to produce keys that never leave a user’s hardware. Those keys decrypt conversation data locally, so stored chats remain unreadable on Confer’s servers.

A central technical guardrail is a Trusted Execution Environment (TEE) on Confer’s servers. TEEs keep code and data encrypted in memory and isolated from the host operating system, and Confer uses remote attestation to cryptographically prove which software is running inside the enclave.

That attestation — plus signed releases published to a transparency log — lets users verify the public proxy and image code are running unmodified.
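At its core, the client-side check reduces to comparing the enclave’s reported code measurement against measurements published to the transparency log. A simplified sketch of that final comparison (the log schema and field names here are illustrative assumptions, not Confer’s actual protocol):

```typescript
// Simplified sketch: the enclave reports a hash ("measurement") of the code it
// is running; the client accepts the server only if that measurement matches an
// entry published to the transparency log. This log format is a hypothetical
// illustration, not Confer's real schema.

interface LogEntry {
  release: string;      // human-readable release tag
  measurement: string;  // hex-encoded hash of the enclave image
}

function verifyAttestation(reportedMeasurement: string, log: LogEntry[]): LogEntry | null {
  const normalized = reportedMeasurement.trim().toLowerCase();
  for (const entry of log) {
    if (entry.measurement.toLowerCase() === normalized) {
      return entry; // measurement matches a publicly logged release
    }
  }
  return null; // unknown code inside the enclave: refuse to connect
}
```

In practice the attestation report is itself signed by the CPU vendor and that signature must be verified first; this sketch covers only the last step, matching the verified measurement against the public log.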

Passkeys and an example

Confer leverages passkeys (FIDO2/WebAuthn) to generate robust, service-specific keypairs whose private portion is stored in protected device hardware. Browser code that derives an internal key from a passkey via the WebAuthn PRF extension looks like this:

// credId and salt are assumed to have been obtained earlier (e.g. during
// registration); the PRF extension turns a passkey assertion into secret
// key material without the private key ever leaving device hardware.
const assertion = await navigator.credentials.get({
  mediation: "optional",
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    allowCredentials: [{ id: credId, type: "public-key" }],
    userVerification: "required",
    // WebAuthn PRF extension: evaluate a pseudo-random function in the authenticator
    extensions: { prf: { eval: { first: new Uint8Array(salt) } } }
  }
}) as PublicKeyCredential;

// The PRF output becomes the root of the client-side encryption keys.
const { prf } = assertion.getClientExtensionResults();
const rawKey = new Uint8Array(prf.results.first);

Availability and usability

Confer offers native clients for recent macOS, iOS, and Android releases. Windows requires a third-party authenticator, and Linux lacks official support, though community bridges exist. The design focuses on a simple login flow and encrypted device sync that never exposes plaintext to operators.

Where Confer sits in the market

Privacy-first alternatives include Proton’s Lumo, which uses Proton’s established encryption model, and Venice, which stores data only locally. Major providers (OpenAI, Google) still include legal and human-review carve-outs; courts can compel logs and platforms may permit human reviewers for safety.

Bottom line

Confer shows a practical path to private, auditable AI interactions by combining passkeys, TEEs, remote attestation, and transparent publishing. Until large LLM platforms adopt similar guarantees, these smaller projects will set the standard for confidential AI use.
