Definitions are tricky, especially for terms the general public broadly considers virtuous or positive (cf. “organic”). I tend to deny that something is open source unless you can recreate any binaries/output AND it is presented in the “preferred form for modification” (i.e. the way the GPLv3 defines the “source form”).
A disassembled/decompiled binary might nominally be in some programming language (suitable input to a compiler for that language), but that doesn’t actually make it the source code for that binary, because it is not in the form that the entity best positioned to modify the binary (normally the original author) would prefer to make modifications in.
vrighter@discuss.tchncs.de 3 weeks ago
I view it as: the source code of the model is the training data. The code supplied is a bespoke compiler for it, which emits a binary blob (the weights). A compiler is written in code too, just like any other program. So what they released is the equivalent of the compiler’s source code, plus the binary blob it output when fed the training data (the source code), which they did NOT release.
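To make the analogy concrete, here is a minimal sketch (hypothetical, not any real model’s training code): a tiny “compiler” that consumes training data and emits weights. The `train` function and the toy dataset are illustrative assumptions; the point is that the returned numbers are an opaque artifact of the data, not a substitute for it.

```python
def train(training_data, epochs=200, lr=0.05):
    """The 'compiler': fit y = w*x + b to the data by per-sample gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in training_data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    # The 'binary blob': opaque numbers derived from the data,
    # from which the data itself cannot be recovered.
    return (w, b)

# The 'source code' in this analogy: the training data (here, points on y = 2x + 1).
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
weights = train(data)
```

Releasing `train` (the compiler) and `weights` (the blob) while withholding `data` is, in the commenter’s framing, like shipping a compiler plus a compiled binary and calling the binary open source.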
pishadoot@sh.itjust.works 3 weeks ago
This is probably the best explanation I’ve seen so far and really helped me actually understand what it means when we talk about “weights” for LLMs.