Show HN: A local-first memory store for LLM agents (SQLite)
15 comments · December 14, 2025
zffr
Yeah, it's strange that the project does not mention using Redis, or even SQLite with a vector DB extension.
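For context, a minimal sketch of the "SQLite with a vector DB extension" route, using the sqlite-vec extension (my assumption; the comment doesn't name one) and placeholder 4-dimensional vectors standing in for real embeddings:

```python
import sqlite3

import sqlite_vec
from sqlite_vec import serialize_float32

# Everything lives in one local file, same as the Show HN project.
db = sqlite3.connect("memory.db")
db.enable_load_extension(True)
sqlite_vec.load(db)  # load the sqlite-vec extension into this connection
db.enable_load_extension(False)

# A virtual table holds the embeddings; the dimension is fixed at creation time.
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS memories USING vec0(embedding float[4])")

# Store one memory with a placeholder embedding.
db.execute(
    "INSERT INTO memories(rowid, embedding) VALUES (?, ?)",
    (1, serialize_float32([0.1, 0.2, 0.3, 0.4])),
)
db.commit()

# K-nearest-neighbour lookup against a query embedding.
rows = db.execute(
    "SELECT rowid, distance FROM memories WHERE embedding MATCH ? ORDER BY distance LIMIT 3",
    (serialize_float32([0.1, 0.2, 0.3, 0.4]),),
).fetchall()
print(rows)
```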
CharlesW
How would you compare and contrast this to Steve Yegge's Beads (https://github.com/steveyegge/beads/), or to ordinary file-based memory following vendors' guidelines (https://code.claude.com/docs/en/memory)?
catketch
Not the OP, but Beads is trying to solve a different problem, namely task organization/prioritization/coordination.
This looks more like a straight agent knowledge base, to be used with or instead of the .md files you might have in the repo describing the codebase. To use a bad analogy: Confluence vs. Jira.
A4ET8a8uTh0_v2
Part of this weekend is allotted for a local inference build. It genuinely looks interesting. This is kind of what I hoped the local LLM scene would become: everything is modular and you just swap in the pieces you want or think would work well together.
koakuma-chan
This does not look interesting. This is AI slop.
A4ET8a8uTh0_v2
OK. Why does it not look interesting? It does seem to solve a problem. Have you actually looked into what it takes to build your own equivalent of Ollama? It gets into fascinating trade-offs real fast.
koakuma-chan
Because this is the output of "Hey Cursor, write a memory store for AI agents." This is by no means an equivalent of Ollama. I don't know where you got that from.
Check this out: https://github.com/CaviraOSS/OpenMemory/blob/17eb803c33db88a...
davidarenas
It would be awesome if this could be part of AgentFS, which also runs on SQLite.
You would be able to easily offer agents that have all of a tenant's data and agent state in a single file, which can be synced to S3.
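A rough sketch of that single-file-per-tenant flow, assuming plain sqlite3 plus boto3 and hypothetical file, bucket, and key names (AgentFS itself isn't shown, and its actual API may differ):

```python
import sqlite3

import boto3

# Hypothetical per-tenant database file and snapshot path.
tenant_db = "tenant-42.db"
snapshot = "tenant-42.snapshot.db"

# Take a consistent copy of the live database (VACUUM INTO, SQLite 3.27+),
# so an in-progress write doesn't end up half-copied into the upload.
conn = sqlite3.connect(tenant_db)
conn.execute("VACUUM INTO ?", (snapshot,))
conn.close()

# Push the snapshot to S3; an agent on another host can pull it back down
# and open it as the full memory/state for that tenant.
boto3.client("s3").upload_file(snapshot, "my-agent-state-bucket", "tenants/42/state.db")
```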
bilekas
This looks interesting, and I'll try it out to see what it can do. I like the idea of using temporal values as a significant weight, but one thing isn't really clear to me.
> Traditional Vector DBs require extensive setup, cloud dependencies, and vendor lock-in:
Is this really true? What's wrong with running your own local Redis vector DB? They have an open-source version that's separate from their hosted offering:
> https://redis.io/docs/latest/operate/oss_and_stack/
Am I missing something?
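For comparison, a minimal sketch of the local Redis option being described, assuming redis-py against a local Redis Stack / Redis 8 instance (vector search needs the query engine module) and hypothetical index and key names:

```python
import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="localhost", port=6379)

# Vector index over hashes with the "mem:" prefix (names are made up here).
r.ft("mem_idx").create_index(
    fields=[
        TextField("text"),
        VectorField("embedding", "FLAT", {"TYPE": "FLOAT32", "DIM": 4, "DISTANCE_METRIC": "COSINE"}),
    ],
    definition=IndexDefinition(prefix=["mem:"], index_type=IndexType.HASH),
)

# Store one memory: raw text plus its embedding as packed float32 bytes.
vec = np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32)
r.hset("mem:1", mapping={"text": "user prefers dark mode", "embedding": vec.tobytes()})

# KNN query: the 3 memories nearest to the query vector.
q = (
    Query("*=>[KNN 3 @embedding $vec AS score]")
    .sort_by("score")
    .return_fields("text", "score")
    .dialect(2)
)
res = r.ft("mem_idx").search(q, query_params={"vec": vec.tobytes()})
for doc in res.docs:
    print(doc.text, doc.score)
```

The obvious difference from the in-process SQLite approach is that Redis runs as a separate server process.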