Anthropic's Claude remote uses GLM-4.7
Posted by bobbiesbottleservice@reddit | LocalLLaMA | View on Reddit | 45 comments
I just noticed this after a bug wasn't getting fixed. If you start a Claude Code remote environment, the default model (hidden on mobile) is GLM-4.7. I assumed Anthropic only used their own models for everything, so it was interesting to me that they are actually serving open-weights models (which, I assume, they have accused of distilling from their own proprietary models).
Ell2509@reddit
Are you sure you didn't point CC at it, somehow?
If so, either you just blew the lid off some global-scale plagiarism, or you found a clue about current or future commercial collaboration between the two labs.
Likeatr3b@reddit
Can you do this yet? I’ve predicted that local AI will eat their lunch. But haven’t seen them fold and allow local models to use their products.
ThoreaulyLost@reddit
That's a very, very, very optimistic prediction. While I personally would love to see that, there are too many political and financial forces in play that will prevent that from happening at scale. Just look at the chip and drive shortages.
The reason corporate AI will win is because old money is entrenched in it. They don't like not getting their investment back. Right now we're in the "let's pretend there's competition" phase (think 1980s computers, or early car manufacturers) when in reality it's a pretty incestuous little network of "leaders". Given 20 years it will be a consolidation of monopolistic partnerships, who will probably divide the field like kings after a battle (lower AIs in cars or vacuums, medium AIs in civilian products, top AIs in government surveillance, traffic control, airports, etc.).
My long-term bet is on whoever does the next AI merger, which is why, if OP is right, I'm off to buy some stocks.
Ell2509@reddit
Sadly, you are right.
Anyone who is unsure or hasn't heard of it: Google "techno-feudalism".
yinepu6@reddit
or google "google techno feudalism"
imonlysmarterthanyou@reddit
You have been able to for a long while.
bobbiesbottleservice@reddit (OP)
I was using clother with Claude Code in another terminal (not in a remote session), but I never used GLM-4.7, and the other local models I was using didn't show up either.
jatjatjat@reddit
That lower case g not lining up with the other model names is sus.
AnotherSoftEng@reddit
Don’t be ridiculous, even Anthropic’s senpai model does this!
john0201@reddit
It’s not a fixed width font. No idea if this is real or not but the font looks real.
_supert_@reddit
Why has this got 68 upvotes? With no evidence?
mp3m4k3r@reddit
And what does it have to do with Local other than you could potentially run GLM locally?
my_name_isnt_clever@reddit
"Closed cloud provider using open weights model" is absolutely relevant to this sub...if it were true.
mp3m4k3r@reddit
Good call, on both parts!
Enough_Big4191@reddit
Yeah, wouldn't read too much into that yet; these setups often mix models behind the scenes depending on task, latency, or cost. Also worth double-checking that it's not just a label or fallback behavior. I've seen cases where the exposed name doesn't fully match what's actually running.
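As a minimal sketch of that double-check: an Anthropic-style `/v1/messages` response echoes back a `model` field naming what actually served the request, which is more trustworthy than a UI label. The JSON body below is a made-up sample standing in for a real response.

```shell
# Made-up sample response body; in practice this would be the JSON returned
# by the /v1/messages endpoint for a real request.
resp='{"id": "msg_123", "model": "glm-4.7", "role": "assistant", "content": []}'

# Pull out the "model" field, i.e. which model the backend says it used.
served=$(printf '%s' "$resp" | python3 -c 'import json, sys; print(json.load(sys.stdin)["model"])')
echo "$served"
```

If the name printed here disagrees with what the picker shows, you're looking at labeling or fallback behavior rather than the model you selected.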
hidden2u@reddit
Great Large Mythos 4.7
bobbiesbottleservice@reddit (OP)
The user deleted the one post where I replied with the screenshot, here it is again for visibility
Recoil42@reddit
How do you know it's GLM 4.7?
bobbiesbottleservice@reddit (OP)
It was listed in the dropdown for model selection on the browser on the desktop
Recoil42@reddit
Take a screenshot...?
AnotherSoftEng@reddit
Darn it! My screenshot keys literally just broke as you said that.
Here I’ll take a picture from my phone instead— shoot, I just dropped my phone into a vat of acid
Limp_Classroom_2645@reddit
Show
Howdareme9@reddit
He doesn’t
Ell2509@reddit
I saw a photo they posted. Looks like it is definitely GLM in the list. But why, we don't know.
BraveBrush8890@reddit
If you're going to make a claim, you need evidence; otherwise it's just words you typed.
bobbiesbottleservice@reddit (OP)
CognitiveSourceress@reddit
Every time someone posts a picture of their screen it looks like they just went off roading with it or something 😕
LinkSea8324@reddit
He took shitposting to another level
TakuyaTeng@reddit
How does anyone even function like that?
jesus_fucking_marry@reddit
Dario talks shit about Chinese LLMs (and open-source LLMs in general), saying they can be used for harmful stuff, etc., and his company is offering one of those LLMs in its own product. Very hypocritical.
anomaly256@reddit
I'm starting to think this Dario guy might not be scrupulous. Unscrupulous. Devoid of scruples.
Ell2509@reddit
Satisfying expression!
Howdareme9@reddit
There’s no way you believe this
jesus_fucking_marry@reddit
Lol, I don’t believe what he says. I am just pointing out the blatant hypocrisy.
FranticBronchitis@reddit
Does their definition of "harmful" even match that of AI safety research or is it just "stuff we'd rather our model not talk about to prevent lawsuits"?
HenkPoley@reddit
And what app is this GUI menu in?
Hydroskeletal@reddit
Notably missing is the Opus 4.7 1M, which... I dunno, I don't see what you see. I think you've got something out of whack, and Anthropic is not serving a superseded open-weights model.
miklosp@reddit
Remote is just Remote control of your own session? If you wired up Claude with a different model, I imagine it will work with remote control too?
ZealousidealBadger47@reddit
Save cost.
Alrightly@reddit
Is this real? 😳
g_rich@reddit
It's not; OP is likely connecting through an LLM gateway that has additional models available, or has an additional model configured in their local configuration.
LinkSea8324@reddit
mf just edited his config file and took a screenshot, same shit with claude code
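For what it's worth, a local config edit like the one described above is trivial. As I understand Claude Code's environment overrides, something along these lines (the gateway URL is a placeholder) would route requests through a third-party endpoint and surface a non-Anthropic model name in the client:

```shell
# Hypothetical local override: point the Claude Code client at a
# third-party gateway and pin a non-Anthropic model. The URL below is a
# placeholder, not a real service.
export ANTHROPIC_BASE_URL="https://llm-gateway.example.com"
export ANTHROPIC_MODEL="glm-4.7"
```

With overrides like these in a shell profile or settings file, a screenshot of the model picker proves nothing about what Anthropic itself is serving.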
kzoltan@reddit
Interesting… are they “redirecting” to third party hosting or they host their own GLM instance? That might say something about their own models..
Arcosim@reddit
It would be extremely ironic if they self-host it considering how rabidly anti-China Amodei is.
puppymaster123@reddit
This is just not true, sigh. I'm starting to wonder about all the other negative reports, if this is the level of technical rigor we're dealing with.