Openclaw context limit exceeded

Posted by Certain_Pen_1982@reddit | LocalLLaMA | View on Reddit | 13 comments

I’m trying to run glm 4.7 flash with llama.cpp on openclaw, but I can’t get past an issue where, whenever I ask it a question, it responds by telling me my context limit was exceeded. I’ve tried changing the limit in the JSON config and in the command I use to launch llama-server, but I always get the same error, and I can’t find any documentation on it. Any help/advice is appreciated.
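For reference, this is roughly how I’m launching llama-server (model path and port are placeholders, and I’m assuming `-c` / `--ctx-size` is the right flag for raising the context window):

```shell
# Launch llama-server with an explicit context size.
# -c / --ctx-size sets the KV-cache context length in tokens;
# the model path and port below are placeholders.
llama-server \
  -m ./models/model.gguf \
  -c 32768 \
  --port 8080
```

My understanding is that if the client (openclaw in my case) is configured with a larger context than the server was started with, requests can still fail, so the two settings presumably need to agree.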