Comment on AI's Future Hangs in the Balance With California Law
WalnutLum@lemmy.ml 4 months ago
In regards to the open source models, while it makes sense that if a developer takes the model and does a significant portion of the fine tuning, they should be liable for the result of that…
This kind of goes against the model that open source has operated on for a long time: providing source doesn't carry liability, so providing a fine-tuned model shouldn't either.
Th4tGuyII@fedia.io 4 months ago
I didn't mean in terms of providing. I meant that if someone provided a base model, and someone else took that, built on top of it, then used it for a harmful purpose - of course the person who modified it should be liable, not the base provider.
It's like if someone took a version of Linux, modified it, then used that modified version for a similar purpose - you wouldn't go after the person who made the unmodified version.
WalnutLum@lemmy.ml 4 months ago
You wouldn’t necessarily punish the person that modified Linux either; you’d punish the person that uses it for a nefarious purpose.
The important distinction is the intent to deceive, not that the code/model was modified so it could be used for nefarious purposes.