This is why people who are gung ho about AI policing need to slow their roll.
What they don’t realize is that if they got their way, it would be exactly what the big AI companies have wanted and been begging for all along.
They want AI to stay centralized and impossible to enter as a field.
This is why they actually want to lose the copyright battles eventually: only they will have the funds to afford to make usable AI in the future (this of course refers to the types of AI that require training material of that variety).
What that means is there would be no competitive open source, self-hostable options, and we’d all be stuck funneling our information through the servers of 3 US companies or 2 Chinese companies while paying out the ass to do so.
What we actually want is sanity, where it’s the end product that is evaluated against copyright.
For a company selling AI services, you could argue that the model is the service itself, maybe, but then what of an open source model? Is it delivering a service?
I think it should stay as it is: if you make something that violates copyright, then you get challenged, not your tools.
Tehdastehdas@lemmy.world 5 months ago
Democratisation of powerful tools won’t work because they’re easier to use for destruction than for the opposite. Every psychopath could be designing superbugs and inventing future weapons.
www.youtube.com/watch?v=86k8N4YsA7c
survirtual@lemmy.world 5 months ago
Irrelevant.
AI is here. Either people have access to it and we trust things will balance out, or we become slaves to the people who own it and can use it without restriction.
The premise that it is easier to use for destruction is also an assumption. Nature could have evolved to destroy everything and prevent advanced life, yet we are here.
The solution to problems doesn’t always need to be a tighter grip and more control. Believe it or not, that tends to backfire catastrophically, worse than if we had allowed the possibility of the thing we fear.