When a machine moderates content, it evaluates text and images as data, using an algorithm that has been trained on existing data sets. The process for selecting that training data has come under fire, as the resulting models have been shown to reflect racial, gender, and other biases.
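Roughly, that pipeline looks like the sketch below: a classifier fit to existing labeled examples, then applied to new posts with no human in the loop. This is a minimal illustration assuming a scikit-learn-style setup; the example texts, labels, and model choice are invented, not any platform's actual system.

```python
# Minimal sketch of an automated moderator: a classifier trained on
# pre-labeled examples. Data and labels are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Whatever biases exist in how these examples were collected and labeled
# are learned by the model and then applied at scale.
texts = [
    "buy cheap meds now",
    "great photo, love it",
    "click this link for prizes",
    "see you at the gallery opening",
]
labels = [1, 0, 1, 0]  # 1 = flagged, 0 = allowed

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# New content gets scored the same way, automatically.
print(model.predict(["check out my new print shop"]))
```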
The problem isn’t AI itself; it’s that companies are willing to deploy it and then fire any customer support or human you could ever talk to. They let their automod ruin people’s lives and accounts, then barricade themselves, impossible to reach.
aard@kyu.de 1 year ago
It’s been shown over and over again for the last two decades that you can’t build a reliable archive on platforms outside of your control. Stuff like Instagram can be useful for trying to draw traffic to your own platform - but you should always treat anything you post there as throwaway content.
mtcerio@lemmy.world 1 year ago
I came here to say exactly this. IG and all the others are private companies with their own terms and conditions, which one agrees to and which they can change at any time. Moderation of content is part of that. Deal with it, or don’t use them at all.