Another way to use a local LLM: have an MCP server that talks to a QEMU VM. What do you think?
Posted by leonardosalvatore@reddit | LocalLLaMA | 11 comments
I think it's nice to contain the MCP tools inside a QEMU environment where the LLM can do whatever it wants ... here it is running GDB on an LVGL program.
https://github.com/leonardosalvatore/llama.cpp.debugger
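The linked repo isn't quoted here, so as a hedged sketch only: one way such a tool could drive GDB non-interactively is batch mode, where GDB runs a list of `-ex` commands and exits, returning a single text blob the LLM can read. The binary path, breakpoint, and helper name below are all made up for illustration.

```python
import shlex

def build_gdb_batch_cmd(binary, breakpoint, post_break_cmds=("bt", "info locals")):
    """Build an argv list for a one-shot GDB session.

    GDB's --batch flag executes the -ex commands in order and exits,
    which suits a tool call that must return one text result.
    """
    argv = ["gdb", "--batch", "-ex", f"break {breakpoint}", "-ex", "run"]
    for cmd in post_break_cmds:
        argv += ["-ex", cmd]
    argv.append(binary)
    return argv

# Hypothetical LVGL demo binary and breakpoint:
print(shlex.join(build_gdb_batch_cmd("./lvgl_demo", "lv_timer_handler")))
```

Inside the VM the tool would then run this argv and hand stdout back to the model.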
CalligrapherFar7833@reddit
You don't need an MCP server to instruct your LLM to use virsh, remotely or locally.
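For context, the no-MCP approach really is just shelling out to the virsh CLI; a minimal sketch (the domain name `debug-vm` and the `dry_run` helper are made up here, not part of virsh):

```python
import subprocess

def virsh(*args, dry_run=True):
    """Run a virsh subcommand, e.g. virsh("list", "--all").

    dry_run=True only returns the argv, so the sketch is testable
    without libvirt installed; dry_run=False actually executes it.
    """
    argv = ["virsh", *args]
    if dry_run:
        return argv
    return subprocess.run(argv, capture_output=True, text=True, check=True).stdout

# An LLM told "use virsh" can emit plain commands like these:
print(virsh("list", "--all"))      # list every defined domain
print(virsh("start", "debug-vm"))  # boot the hypothetical guest
```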
leonardosalvatore@reddit (OP)
Yes, I do if I need to call gRPC or MQTT methods that are not part of this open-source project.
leonardosalvatore@reddit (OP)
Also, I found it useful that each tool's "description" gives context to the LLM.
But I'm still learning =]
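Concretely: in an MCP `tools/list` response, each tool carries a `name`, a `description`, and a JSON-Schema `inputSchema`, and that description is what lands in the model's context. The gRPC-calling tool below is invented for illustration; only the field names come from the MCP spec.

```python
# Shape of one entry in an MCP tools/list result. Field names follow the
# MCP spec; the tool itself and its parameters are hypothetical.
call_grpc_tool = {
    "name": "call_grpc_method",
    "description": (
        "Invoke a gRPC method on the service running inside the QEMU guest. "
        "Use this for custom APIs that have no CLI equivalent."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "service": {"type": "string", "description": "Fully qualified service name"},
            "method": {"type": "string", "description": "Method to invoke"},
            "payload": {"type": "object", "description": "Request message as JSON"},
        },
        "required": ["service", "method"],
    },
}

# The description is the part the LLM "reads" when deciding which tool to call.
print(call_grpc_tool["description"])
```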
nonerequired_@reddit
The LLM probably knows what each parameter does. Even if it doesn't, it can look at the help output and man pages.
leonardosalvatore@reddit (OP)
Yes. For standard tools you are right. Not for a custom API.
CalligrapherFar7833@reddit
Again, virsh is not a custom API. You don't need MCP for it.
leonardosalvatore@reddit (OP)
I know what it's for, but it doesn't work when you need to call a custom API.
CalligrapherFar7833@reddit
I don't need to call a custom API for QEMU, which is what your post is about.
leonardosalvatore@reddit (OP)
Exactly, you are right: you don't need it to control a Linux OS.
But you do for gRPC and MQTT, which are not part of this post but are part of the project I'm working on.
I already mentioned that in my first reply.
CalligrapherFar7833@reddit
No need to call any API; you can just use the virsh CLI, and any LLM can figure it out without any MCP.
wasnt_in_the_hot_tub@reddit
I agree. I find MCP unnecessary for more than half the uses I encounter.