Validating a visual orchestration tool for local LLMs (concept feedback wanted)

Posted by HarjjotSinghh@reddit | LocalLLaMA | View on Reddit | 6 comments

Hey r/LocalLLaMA,

Before I build this, I want to know if it's actually useful.

The Problem (for me): Running multiple local models in parallel workflows is annoying:

- Writing Python scripts for every workflow
- Managing async execution
- Debugging when things break
- No visual representation of what's happening

What I'm considering building:

Visual orchestration canvas (think Node-RED but for LLMs):

Features (planned):

- Drag-and-drop blocks for Ollama models
- Parallel execution (run multiple models simultaneously)
- Real-time debugging console
- Export to Python (no lock-in)
- Local-first (API keys never leave the machine)

Example workflow: Question → 3 local models in parallel:

- Llama 3.2: initial answer
- Mistral: fact-check
- Mixtral: expand + sources

All running locally. Target: <10 seconds.
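For the skeptical: the fan-out above is already easy to hand-roll with `asyncio.gather`, which is part of what I want feedback on. A minimal sketch of the pattern, with the model calls stubbed out as placeholder coroutines (a real version would POST to Ollama's local `/api/generate` endpoint instead):

```python
import asyncio

async def call_model(model: str, prompt: str) -> str:
    # Placeholder for a real local-model call (e.g. Ollama's /api/generate).
    await asyncio.sleep(0.01)  # simulate inference latency
    return f"[{model}] answer to: {prompt}"

async def fan_out(question: str) -> dict[str, str]:
    # Run all three roles concurrently; wall time ~= the slowest model.
    tasks = {
        "llama3.2": call_model("llama3.2", question),
        "mistral": call_model("mistral", f"Fact-check: {question}"),
        "mixtral": call_model("mixtral", f"Expand with sources: {question}"),
    }
    results = await asyncio.gather(*tasks.values())
    return dict(zip(tasks.keys(), results))

if __name__ == "__main__":
    print(asyncio.run(fan_out("What is RAG?")))
```

The canvas would basically generate and visualize code like this, so "Export to Python" means you can always eject to a plain script.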

Tech stack (if I build it):

- Next.js + React Flow (canvas)
- Express.js/Hono backend
- WebSockets + SSE (real-time updates)
- LangChain (orchestration layer)
- Custom Ollama, LM Studio, and vLLM integrations
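For context on the Ollama side of the stack: the connector would mostly wrap Ollama's local HTTP API. A tiny sketch of the request body the backend would build for Ollama's `/api/generate` endpoint (default port 11434; no server needed to run this snippet):

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str, stream: bool = False) -> bytes:
    """Serialize a request body for Ollama's /api/generate endpoint."""
    body = {"model": model, "prompt": prompt, "stream": stream}
    return json.dumps(body).encode("utf-8")

# The backend would POST this payload to OLLAMA_URL and relay the
# response (or streamed chunks) to the canvas over WebSockets/SSE.
payload = build_generate_request("llama3.2", "Why is the sky blue?")
```

Nothing here ever leaves the machine, which is the whole point of the local-first design.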

Why I'm NOT building yet:

Don't want to spend 3 months on something nobody wants.

The validation experiment:

- If 500 people sign up → I'll build it
- If not, I'll save myself 3 months

Current status: 24/500 signups

Questions for local LLM users:

  1. Is visual orchestration useful or overkill?
  2. What local-model workflows would you build?
  3. Missing features for local deployment?
  4. Would you PAY $15/month for this? Or should it be open-source?

What I need from r/LocalLLaMA:

Brutal technical feedback:

- Is this solving a real problem?
- What integrations matter most?
- Performance concerns with Ollama?
- Should I open-source the Ollama connector?

Mockups: Link in comments - concept only, no product yet.

The ask:

If this sounds useful, sign up (helps me validate). If this sounds dumb, roast it (saves me 3 months).

Thanks for the feedback!