is there a way for a local model to independently seek advice from larger one online (claude or gemini)
Posted by bonesoftheancients@reddit | LocalLLaMA | View on Reddit | 4 comments
I was wondering if there is any model that is built to ask for help when it is stuck, specifically for coding
RedParaglider@reddit
I read something from Google a while back about how they expected these systems to eventually work.
EndlessZone123@reddit
Given proper tool calls and instructions, any model can do this.
Specify how many times it's allowed to fail to produce the desired result. Once that limit is hit, load up a CLI for a bigger model to get help.
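A minimal sketch of that "fail N times, then escalate" loop. The functions `local_attempt` and `ask_bigger_model` are hypothetical placeholders; a real setup would call the local model and shell out to something like the Claude or Gemini CLI.

```python
MAX_LOCAL_FAILURES = 3  # assumed threshold, tune to taste

def local_attempt(task):
    # Placeholder: a real version would call the local model and
    # validate the output (e.g. run tests on generated code).
    # Returning None here simulates a failed attempt.
    return None

def ask_bigger_model(task):
    # Placeholder: a real version would shell out to a CLI like
    # `claude -p "<task>"` or hit a hosted API.
    return f"escalated: {task}"

def solve(task):
    for _ in range(MAX_LOCAL_FAILURES):
        result = local_attempt(task)
        if result is not None:
            return result
    # Local attempts exhausted; hand the task to the larger model.
    return ask_bigger_model(task)

print(solve("fix the failing unit test"))
```

The key design choice is that escalation is deterministic (a simple counter) rather than left to the small model's judgment, so it can't loop forever on a problem it can't solve.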
Equivalent_Job_2257@reddit
Actually a good question, dunno why it was downvoted. Another variant: with limited VRAM, when should you load the big model that thinks and writes slower? I think you just need to give the model a tool that encapsulates the API call, describe when to use it, and that's it.
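Such a tool could look like the following sketch, written in the OpenAI-style function-calling schema. The name `ask_expert` and the wording of the description are illustrative assumptions, not from any specific product; the point is that the "when to use it" guidance lives in the tool description itself.

```python
import json

# Hypothetical tool definition the local model would be given.
ask_expert_tool = {
    "type": "function",
    "function": {
        "name": "ask_expert",
        "description": (
            "Ask a larger cloud model for help. Use this ONLY after "
            "you have tried and failed to solve the problem yourself."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "question": {
                    "type": "string",
                    "description": (
                        "A self-contained description of the problem, "
                        "including relevant code and error messages."
                    ),
                }
            },
            "required": ["question"],
        },
    },
}

print(json.dumps(ask_expert_tool, indent=2))
```

When the local model emits a call to this tool, your harness intercepts it, forwards `question` to the big model's API, and returns the answer as the tool result.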
Badger-Purple@reddit
Yes, create an MCP tool to do this, or use tools like Context7 for access to additional knowledge, etc.