Has anyone found a Python library that handles LLM conversation storage + summarization (not memory systems)?
Posted by sarvesh4396@reddit | LocalLLaMA | 3 comments
What I need:
- store messages in a DB (queryable, structured)
- maintain rolling summaries of conversations
- help assemble context for LLM calls
What I don’t need:
- full agent frameworks (Letta, LangChain agents, etc.)
- “memory” systems that extract facts/preferences and do semantic retrieval
I’ve looked at Mem0, but it feels more like a memory layer (fact extraction + retrieval) than simple storage + summarization.
The closest thing I've found is MemexLLM, but it still feels either early-stage or not exactly focused on this use case.
Is there something that actually does just this cleanly, or is everyone rolling their own?
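For anyone rolling their own, the three requirements above (queryable message storage, a rolling summary, context assembly) fit in a small sketch. This is a hypothetical illustration, not any library's API: it uses stdlib `sqlite3` as a stand-in for a real DB, and the `summarize` callable is a stub where an LLM call would go.

```python
import sqlite3
from typing import Callable, Optional


class ConversationStore:
    """Sketch: structured message storage + rolling summary + context assembly."""

    def __init__(self, path: str = ":memory:",
                 summarize: Optional[Callable[[str, str], str]] = None):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "id INTEGER PRIMARY KEY, conv TEXT, role TEXT, content TEXT)"
        )
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS summaries (conv TEXT PRIMARY KEY, summary TEXT)"
        )
        # Stub summarizer: folds new text into the old summary.
        # In practice this would be an LLM call.
        self.summarize = summarize or (lambda old, new: (old + " " + new).strip())

    def add(self, conv: str, role: str, content: str) -> None:
        """Store a message and update the conversation's rolling summary."""
        self.db.execute(
            "INSERT INTO messages (conv, role, content) VALUES (?, ?, ?)",
            (conv, role, content),
        )
        row = self.db.execute(
            "SELECT summary FROM summaries WHERE conv = ?", (conv,)
        ).fetchone()
        old = row[0] if row else ""
        self.db.execute(
            "INSERT INTO summaries (conv, summary) VALUES (?, ?) "
            "ON CONFLICT(conv) DO UPDATE SET summary = excluded.summary",
            (conv, self.summarize(old, f"{role}: {content}")),
        )
        self.db.commit()

    def context(self, conv: str, last_n: int = 4) -> list:
        """Assemble LLM context: rolling summary as a system message + last N turns."""
        row = self.db.execute(
            "SELECT summary FROM summaries WHERE conv = ?", (conv,)
        ).fetchone()
        recent = self.db.execute(
            "SELECT role, content FROM messages WHERE conv = ? ORDER BY id DESC LIMIT ?",
            (conv, last_n),
        ).fetchall()[::-1]  # oldest-first
        msgs = [{"role": "system", "content": f"Summary so far: {row[0]}"}] if row else []
        msgs += [{"role": r, "content": c} for r, c in recent]
        return msgs
```

Usage: `store.add("c1", "user", "hi")` then `store.context("c1")` returns an OpenAI-style message list you can pass straight to a chat API, with the summary prepended and only the last few raw turns included.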
EffectiveCeilingFan@reddit
Yes. My favorite DB is PostgreSQL. If you need vector storage at some point, just add pgvector.
In general, there’s no reason to use anything other than Postgres unless Postgres is actively failing you. YAGNI is strong advice here.
Joozio@reddit
For my use case, file-based memory with layered markdown won out over DB-backed solutions.
sarvesh4396@reddit (OP)
I'm building for a large user base, so I'd need something scalable and maintainable.