The updated Minimax m2.7 license still doesn't allow building a product with it. But before the next riot starts, Ryan Lee has already confirmed that they are still working on the license, and that the sale of products built with m2.7 is permitted.
Posted by zenmagnets@reddit | LocalLLaMA | 24 comments
someone383726@reddit
Good! I was just about to publish my iOS task tracker app, which I built using 2.7 running locally, to the App Store!
One-Replacement-37@reddit
What do you expect of a FREE MODEL?
Stop complaining on Reddit
Buy tokens from commercial models - they transfer all rights of the output to you.
Or spend a few million to make your own model.
Velocita84@reddit
This is LOCAL llama
colin_colout@reddit
local doesn't always mean a free and open license unfortunately.
/r/selfhosted is filled with posts about software with limited licenses. Sometimes the source isn't even available... Plex is one of the most popular self-hosted apps.
I commend your passion about FOSS, but gatekeeping doesn't really help.
Maybe it's time for an /r/openlicensellms sub. ... tho I'd argue "is it FOSS if you can't build from source yourself?" (like training/eval data plus automation scripts), or is it just shareware?
Velocita84@reddit
I don't have anything against buying tokens from providers in and of itself. It's just that, in a local-focused sub, I think it's pretty stupid to tell someone "oh, you don't like the license? Just spend money on cloud providers!" when instead you can simply recommend a different open-weights model with a more permissive license.
Recoil42@reddit
Yes, and if the LOCAL options don't meet your needs, you have the option of buying tokens from commercial models. If that doesn't please you, tough titties. The universe doesn't owe you free models, you need to make the choices that are right for you with the options on the table.
emprahsFury@reddit
honestly, a free model should be free. If it's not free, they shouldn't call it free.
One-Replacement-37@reddit
Did you have to pay? No? So it is free.
__JockY__@reddit
Sounds fair enough to me: I build internally using the offline weights and I can recommend the API to some customers and local weights to others, as fit for purpose.
Can’t blame MiniMaxAI for wanting to protect against pure race-to-the-bottom hosting services that can run on cheaper margins! I guess MiniMax and other Chinese OG model-dropping companies are getting disrupted by their costs of training plus hosting vs their competition who can “just” focus on hosting the latest free model from MiniMaxAI et al. and avoid that burden entirely.
suicidaleggroll@reddit
Legalese is tricky, I probably wouldn't be able to draft a license that properly controlled this kind of thing either. Looks like all they care about is stopping 3rd party inference providers from providing API access to MiniMax for cheaper than MiniMax themselves can serve it, cutting into their customer base. That shouldn't affect 99.9% of the people here.
BriguePalhaco@reddit
There's an MIT version in the commit history 👀
LienniTa@reddit
hey, I have to say that their API service is not good at all. It says stuff about heavy load instead of working.
Finanzamt_Endgegner@reddit
The license is irrelevant in that case, since AI-generated content doesn't have copyright. They literally have nothing against you if you build something with the model's output.
eli_pizza@reddit
That is incorrect. The restriction is on the license for the model, which most people agree is copyrightable.
It could say you promise not to stand on one foot and that would be legally enforceable (though maybe not practically)
teleprint-me@reddit
I've posted this before.
Generally speaking, the outputs of the model are non-copyrightable. Precedent has already been set in multiple cases.
eli_pizza@reddit
Once again: that doesn’t matter!
These are the terms you agree to in order to legally have a copy of the model. If you violate the terms, the model output still isn't copyrighted, but you have infringed the license of the model itself.
emprahsFury@reddit
the license and the copyright are two different things. You can break a license without breaking a copyright
Recoil42@reddit
Did you... did you read the tweet?
mr_il@reddit
Well, yes, but commercial inference providers see this as a legal risk and don't offer the model. Not that it matters much in the LocalLLaMA context, but it's not irrelevant.
kataryna91@reddit
Those are unrelated concepts. The outputs not being copyrightable doesn't give inference providers a license to host and serve the models.
silenceimpaired@reddit
Has this ever been tested in court? All the armchair lawyers on Reddit claim this, but I have yet to see any company act like it's true.
One-Replacement-37@reddit
What do you expect of a $free$ model?
ilintar@reddit
I'm not surprised at companies not wanting competition at their core service (offering the LLM via API). I'll take it if it stays open weights thanks to that.
ambient_temp_xeno@reddit
Oh well if we can use it for writing actual code without restrictions, happy days.
People wanting to resell it as some grifty 'online product' are SOL.