Been using LLM Wiki Compiler since its early days, and it's getting better!
Posted by knlgeth@reddit | LocalLLaMA | 1 comment
So I’ve been using LLM Wiki Compiler since it first launched, inspired by Andrej Karpathy’s LLM knowledge base idea. The early version was promising but rough. This 0.02.0 update makes it feel way more usable.
Key upgrades:
- Paragraph-level citations: every paragraph links to its source, so you can actually verify outputs.
- `llmwiki lint`: finds broken links, orphaned pages, and inconsistencies as your wiki grows.
- Obsidian integration: works with existing PKM workflows, no need to switch tools.
- Multi-provider support: not locked to one model, so it's easier to switch based on cost or setup.
- Semantic search: finds content by meaning, not just keywords.
- MCP server support: agents can read and update the wiki directly.
Overall:
Still the same Karpathy-style LLM wiki idea, just much more solid now. Feels less like an experiment and more like real infra. If you have recommendations for other tools with the same core loop and features, let me know and I'll happily test them out as well!
riddlemewhat2@reddit
The Obsidian + LLM combo is smart because Obsidian's graph view becomes a visual interface for the AI's compiled knowledge graph. That's genuinely clever UX.