Built a local-first MCP memory layer because my agent kept forgetting me. Feedback wanted.
Posted by Skynetter@reddit | LocalLLaMA
Every morning my Claude Code (or Gemini CLI) session came back a stranger. CLAUDE.md helped with static rules but couldn't retrieve, resolve contradictions, or sync across devices. Mem0/Letta/Zep all wanted my data in their cloud.
So I built m3-memory: an MCP server, 100% local, SQLite + FTS5 + local embeddings.
What makes it different from the other MCP memory servers out there:
- Hybrid retrieval - FTS5 + vector + MMR. Pure vector was scoring exact keyword matches at ~65% for me. Hybrid improves that dramatically.
- Contradiction resolution - a new fact automatically supersedes the old one, with bitemporal valid_from/valid_to. Ask "what did this project use on March 1st?" and get the right answer.
- GDPR primitives - gdpr_export, gdpr_forget. Articles 17 and 20 actually implemented.
- Windows-first tested (Linux & macOS too): CRLF, paths, port conflicts all handled.
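To make the hybrid-retrieval idea concrete, here's a minimal sketch of blending SQLite FTS5 keyword scores with vector similarity and re-ranking with MMR. This is my own toy illustration, not m3-memory's code: the bag-of-words "embeddings" stand in for a real local embedding model, and the `alpha`/`mmr_lambda` weights are made-up defaults.

```python
# Hybrid retrieval sketch: FTS5 (BM25) + vector similarity, then MMR.
# Toy bag-of-words vectors replace a real embedding model.
import math
import sqlite3

docs = [
    "project uses postgres for storage",
    "switched the project to sqlite in march",
    "frontend is react with vite",
]

con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE mem USING fts5(body)")
con.executemany("INSERT INTO mem(body) VALUES (?)", [(d,) for d in docs])

def fts_scores(query):
    # bm25() is lower-is-better (and negative), so negate it.
    rows = con.execute(
        "SELECT rowid, bm25(mem) FROM mem WHERE mem MATCH ?", (query,)
    ).fetchall()
    return {rowid - 1: -score for rowid, score in rows}

def embed(text):
    # Toy "embedding": word-count dictionary.
    v = {}
    for w in text.lower().split():
        v[w] = v.get(w, 0) + 1
    return v

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(x * x for x in a.values()))
    nb = math.sqrt(sum(x * x for x in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_search(query, alpha=0.5, k=2, mmr_lambda=0.7):
    qv = embed(query)
    kw = fts_scores(query)
    max_kw = max(kw.values(), default=1.0) or 1.0
    # Blend normalized keyword score with vector similarity.
    scores = {
        i: alpha * (kw.get(i, 0.0) / max_kw)
        + (1 - alpha) * cosine(qv, embed(d))
        for i, d in enumerate(docs)
    }
    # MMR: greedily pick results that are relevant but not redundant.
    selected = []
    while len(selected) < min(k, len(docs)):
        best = max(
            (i for i in scores if i not in selected),
            key=lambda i: mmr_lambda * scores[i]
            - (1 - mmr_lambda)
            * max(
                (cosine(embed(docs[i]), embed(docs[j])) for j in selected),
                default=0.0,
            ),
        )
        selected.append(best)
    return [docs[i] for i in selected]

print(hybrid_search("sqlite project"))
```

The point of the blend: an exact keyword hit ("sqlite") ranks first even when the embedding similarity alone would be ambiguous, which is exactly where pure-vector search was failing for me.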
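And here's the shape of the bitemporal supersede logic, again as a standalone sketch rather than m3-memory's actual schema: each fact carries valid_from/valid_to, a new fact closes out the old one instead of deleting it, and point-in-time queries filter on that interval.

```python
# Bitemporal contradiction resolution sketch: a new fact supersedes
# the old one by closing its valid_to; history stays queryable.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE facts (
        subject TEXT, value TEXT,
        valid_from TEXT, valid_to TEXT  -- NULL valid_to = currently true
    )
""")

def assert_fact(subject, value, when):
    # Close any open fact for this subject, then insert the new one.
    con.execute(
        "UPDATE facts SET valid_to = ? WHERE subject = ? AND valid_to IS NULL",
        (when, subject),
    )
    con.execute(
        "INSERT INTO facts VALUES (?, ?, ?, NULL)", (subject, value, when)
    )

def fact_at(subject, when):
    row = con.execute(
        """SELECT value FROM facts
           WHERE subject = ? AND valid_from <= ?
             AND (valid_to IS NULL OR valid_to > ?)""",
        (subject, when, when),
    ).fetchone()
    return row[0] if row else None

assert_fact("project.db", "postgres", "2025-01-10")
assert_fact("project.db", "sqlite", "2025-03-15")  # supersedes postgres

print(fact_at("project.db", "2025-03-01"))  # -> postgres (true on March 1st)
print(fact_at("project.db", "2025-04-01"))  # -> sqlite (after the switch)
```

Nothing is ever deleted, so "what was true on March 1st?" and "what is true now?" are both one indexed range query.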
25 MCP tools, drop-in for Claude Code, Gemini CLI, Openclaw, Cursor, Aider, or anything else that speaks MCP.
Install: `pip install m3-memory`
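For wiring it into a client, a standard `mcpServers` entry should do it. Caveat: the `python -m m3_memory` invocation below is my guess at the entry point, not something I've confirmed against the repo; check the README for the real command.

```json
{
  "mcpServers": {
    "m3-memory": {
      "command": "python",
      "args": ["-m", "m3_memory"]
    }
  }
}
```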
Looking for:
- If you've used Mem0 or Letta, how does retrieval quality compare on your workload? I'd rather hear "worse at X" than silence.
- Bug reports on weird setups.
Issues + Discussions open: https://github.com/skynetcmd/m3-memory