I see what you mean, but is there any evidence that the models are biased in a way that affirms the owners' world view (if I understood you correctly)? I couldn't find any.
I'm as sceptical of the capitalist fuckwits as you seem to be, but their power seems to me to be more political/capitalist through the idea of AGI than through the models themselves. Something that simple sequestration could solve. But that's on the government and the voters.
I'm not sure about the point you are trying to make with CLIP. It's not a topic I'm familiar with, but it also seems to be more a problem of the usage and the people than the technology itself: naïve usage by people who want to follow trends/the cheapest option/just something that works in any capacity.
For me, the issue lies first in the overhyped marketing, which is par for the course for basically anything, unfortunately, as well as the fact that suddenly copyright infringement is fine if you make enough money off of it and lick powerful boots. If it was completely open for everyone, it would be a completely different story IMO.
Also, I do not think that the models were created with the goal of pushing a certain narrative. They lucked into being popular, completely unexpectedly, and only then did the vultures start seeing the opportunity. So we will see how it evolves in that regard, but I don't think this is what we're seeing currently.
lime@feddit.nu 1 week ago
so, this is an interesting point. we know they are biased because we’ve done fairness reviews, and we know that that bias is in line with the bias of silicon valley as a whole. whether that means the bias is a) intentional, b) coincidentally aligned or c) completely random is impossible to tell.
as long as the product is sold as it is today, i believe it reinforces that power.
i don’t really understand what you mean by sequestration here. like, limit who is allowed to use it? i feel like that power lies with the individual user, even though regulation definitely can help.
agreed, which is why i as an “abolish copyright law” person am so annoyed to find myself siding with the industry in the cases ongoing against the ai companies. then again, we have “open weight” models that can still be used for the same thing, because the main problem was never copyright itself but the system it exists within.
the purpose of a system is what it does. some people with a certain ideology made a thing capable of “expressing itself”, and by virtue of the thing being made by those people it expresses itself in a similar way. whether it is intentional or not doesn’t really factor into it, because as long as the people selling it do not see a problem with it, it will continue to express itself in that fashion. this connects back to my first point: we know the models have built-in bias, and whether the bias was put there deliberately or is in there as a consequence of ingesting biased data (for which the same rule holds) doesn’t matter. it’s bias all the way down, and not intentionally working against that bias means the status quo will be reinforced.