Open weights plus an OSI-approved license is generally what people mean when they call a model open source. That said, DeepSeek R1 is MIT licensed, and this one is Apache 2.0. Technically that makes DeepSeek less restrictive, but who knows.
Comment on: MiniMax M1 model claims Chinese LLM crown from DeepSeek - plus it's true open-source
LWD@lemm.ee 1 day ago
What exactly makes this more “open source” than DeepSeek? The linked page doesn’t make that particularly clear.
DeepSeek doesn’t release their training data (though they release a hell of a lot of other stuff), and I think that’s about as “open” as these companies can get before they risk running afoul of copyright issues. Since you can’t rebuild the model from scratch, it’s not really open source; it’s just freeware. But that’s true for both models, as far as I can tell.
fmstrat@lemmy.nowsci.com 1 day ago
NGnius@lemmy.ca 1 day ago
Yup, this is open weights, just like DeepSeek. “Open source” should mean their source data is also openly available, but we all know companies won’t do that as long as they keep violating copyright to train these things.
LWD@lemm.ee 1 day ago
I figured as much. Even this line…
… is right above a chart that calls it “open-weight”.
I dislike the conflation of terms that the OSI has helped legitimize. Up until LLMs, nobody called binary blobs “open source” just because they were compiled using open-source tooling. That would be ridiculous.