Want your LLM to use the internet? Here's an MCP server for that.
Posted by Total-Resort-3120@reddit | LocalLLaMA | 6 comments
The showcased examples were made using Gemma 3 27b.
Any LLM with tool calling support should work.
Check the README for setup instructions: https://github.com/BigStationW/Local-MCP-server
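For anyone curious what a server like this looks like under the hood, here is a minimal, hypothetical sketch using the official MCP Python SDK's FastMCP helper. This is not the code from the linked repo; the server name and the `fetch_page` tool are illustrative assumptions, so check the README for the project's actual tools.

```python
# Minimal sketch of an MCP web-access tool (assumed names, not the repo's code).
# Requires the official MCP Python SDK: pip install "mcp[cli]"
import urllib.request

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("web-access")  # hypothetical server name


@mcp.tool()
def fetch_page(url: str) -> str:
    """Download a web page and return its raw text, truncated."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    # Truncate so the tool result fits comfortably in the model's context.
    return body[:8000]


if __name__ == "__main__":
    # Serves over stdio by default, so an MCP-capable client can spawn it.
    mcp.run()
```

An MCP-capable client launches a script like this as a subprocess over stdio and exposes `fetch_page` to the model as a callable tool; any LLM frontend with tool-calling support can then route web requests through it.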
Upstairs-Review8405@reddit
Thank you so much for this project. This was my first time trying to use tools with a local language model, and I ran into many unfamiliar errors during installation. Fortunately, Gemini helped me fix them. The project performs very well with Gemma 3. Thank you very much.
Total-Resort-3120@reddit (OP)
You're welcome o/ Btw, what were those errors exactly? I could use them to make a more solid installation script.
Upstairs-Review8405@reddit
Haha, actually, I'm a complete novice and have absolutely no idea where to begin. I wish you all the best with your project!
FatheredPuma81@reddit
I no longer need a cloud LLM to do quick web research.
ambient_temp_xeno@reddit
Seems cool. Why does it need to go in ComfyUI, though?
Total-Resort-3120@reddit (OP)
Nice catch, my b. I copy-pasted the wrong line during the process; it's fixed now.