Coming soon

EngramMCP

Your AI's memory. Local, semantic, multi-source.

In design, not yet shipped, no release date yet. Join the waitlist below to be among the first.

The problem

AI agents have no memory between sessions. You re-explain the same context every time: who you are, what you're working on, yesterday's decisions, that important Notion doc, last week's Slack thread, the YouTube video you watched.

Your notes are scattered across Notion, Drive, and voice memos; AI conversations vanish at session end; YouTube links get forgotten. Obsidian only indexes markdown you type by hand. Mem.ai and Notion AI keep your memory in their cloud. No tool talks natively to agents.

What EngramMCP is

A local MCP server that gives any tool-capable agent access to a unified semantic memory living on your machine. One install. Connect your sources once. The agent calls search_notes, search_conversations, search_drive, search_notion, search_youtube, and remember_exchange; Engram does the rest.
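As a sketch of that tool surface: the two tools below mirror the names above, but the in-memory store and naive keyword matching are stand-ins for illustration only, not Engram's implementation (which is vector-based).

```python
# Toy sketch of Engram's agent-facing tools. remember_exchange stores a
# conversation turn; search_conversations does a naive keyword match.
# Engram itself would back these with LanceDB vectors, not a Python list.
from dataclasses import dataclass, field

@dataclass
class MemoryStore:
    exchanges: list = field(default_factory=list)

    def remember_exchange(self, user: str, assistant: str, source: str = "agent") -> None:
        """Capture one user/assistant turn so future sessions can recall it."""
        self.exchanges.append({"user": user, "assistant": assistant, "source": source})

    def search_conversations(self, query: str, k: int = 3) -> list:
        """Rank stored turns by how many query terms they contain."""
        terms = query.lower().split()
        scored = [
            (sum(t in (e["user"] + " " + e["assistant"]).lower() for t in terms), e)
            for e in self.exchanges
        ]
        return [e for score, e in sorted(scored, key=lambda s: -s[0]) if score > 0][:k]

store = MemoryStore()
store.remember_exchange("What DB did we pick?", "We settled on LanceDB for vectors.")
hits = store.search_conversations("vector DB choice")
print(hits[0]["assistant"])  # → We settled on LanceDB for vectors.
```

The point of the design is that the agent never manages storage itself: it just calls named tools, and memory survives the session.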

What it ingests

  • Voice notes from the mobile app, transcribed and indexed.
  • Agent conversations from Claude Code, Cursor, or custom bots, captured via remember_exchange.
  • Google Drive docs shared to the app are ingested, then the cloud copy is deleted.
  • Notion pages watched and re-indexed on change.
  • YouTube videos: paste a URL, the transcript is indexed.
  • Text notes from mobile or the local dashboard.

Everything is stored locally: LanceDB for vectors, Ollama embeddings (nomic-embed-text), a SQLite database underneath. Data touches the cloud only in transit: encrypted, then deleted once your machine pulls it.

Privacy promise

  • Local by default. Vectors, files, and transcripts all live on your PC.
  • Cloud is transit only. Items pass through during ingestion, then are deleted.
  • Encrypted snapshots. Multi-PC sync uses a client-encrypted blob. Server can't read it.
  • You hold the key. It's derived from your passphrase.
  • Local embeddings by default. No third-party embedding API sees content unless you opt in.
  • Open-source MCP core. Cloud transit and mobile are source-available but proprietary.
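The passphrase-derived key can be illustrated with a standard KDF. This is a generic PBKDF2 sketch of the idea, not necessarily the exact KDF Engram will ship (scrypt or Argon2 are equally plausible choices):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 32-byte symmetric key from a user passphrase via PBKDF2-HMAC-SHA256.
    The sync server only ever sees the encrypted blob, never this key."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)  # stored alongside the snapshot; salts are not secret
key = derive_key("correct horse battery staple", salt)

# Same passphrase + same salt always yields the same key, so any of the
# user's PCs can decrypt the synced snapshot — and the server cannot.
assert len(key) == 32
assert key == derive_key("correct horse battery staple", salt)
```

Losing the passphrase means losing the snapshot: that is the trade-off a "user holds the key" design makes by construction.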

Tiers

Free, $0

Full local MCP, local dashboard, Ollama embeddings, agent capture, manual ingestion, custom memory types, all search_* tools.

For the power user who wants local memory for agents and no cloud at all.

Pro, $9/month or $90/year

Everything in Free, plus mobile app, cloud transit (voice, Drive, Notion shares), auto-connectors (Notion watch, Drive watch, YouTube auto-ingest), encrypted cloud save for multi-PC sync, online dashboard, email support.

Premium embeddings, $0.20 per 1M tokens

Voyage AI or OpenAI embeddings instead of Ollama, for higher semantic precision. Pass-through cost plus a small margin.
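As a back-of-envelope check on the metered price (the vault size and the ~1.3 tokens-per-word ratio are made-up illustrative numbers, not Engram figures):

```python
PRICE_PER_M = 0.20  # dollars per 1M tokens, before margin

def embed_cost(tokens: int) -> float:
    """Cost in dollars to embed the given number of tokens at the metered rate."""
    return tokens / 1_000_000 * PRICE_PER_M

# Re-embedding a hypothetical 50k-word vault (~65k tokens) costs about a cent:
print(f"${embed_cost(65_000):.4f}")  # → $0.0130
```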

Vs the alternatives

|                               | Obsidian             | Notion AI | Mem.ai  | Engram            |
|-------------------------------|----------------------|-----------|---------|-------------------|
| Local-first storage           | ✓ markdown on disk   | ✗         | ✗       | ✓                 |
| Vector search                 | via plugin           | —         | —       | ✓ native          |
| Local embeddings              | via plugin           | ✗         | ✗       | ✓ native          |
| MCP native                    | via community plugin | —         | —       | ✓ designed for it |
| Continuous multi-source watch | one-shot import      | partial   | partial | ✓ native          |
| Agent conversation capture    | custom plugin        | —         | —       | ✓ first-class     |
| E2E encrypted sync            | paid add-on          | —         | —       | ✓ Pro             |

What Obsidian does better

Markdown editing, graph view, daily notes, canvas, themes, hotkeys, thousands of plugins, mature mobile editor. Engram doesn't compete on any of that. If you already have a happy plugin stack, Engram's pitch is the unified install with continuous multi-source watch baked in.

Waitlist

We're not collecting emails through this site yet. For now, the cleanest path is a DM on @LeRaviole_ or the Raviole Labs Discord. Tell us what you'd want it to remember and we'll add you to the early-access list.