Lol, well it’s not immune to either. As soon as anyone thinks Lemmy has ROI, it will be targeted by bots, corporate greed, and scrapers.
But all of our posts are publicly available on the Internet and, in my opinion, should be fair game for web crawlers, archivists, or whoever else wants to use them. That’s the free and open Internet.
What’s shitty is when companies like reddit decide it’s “their” data.
misk@sopuli.xyz 9 months ago
I don’t think Lemmy is well prepared to handle bots or more sophisticated spam; for now we’re just too small to target. I usually browse by new and see spam staying up for hours, even in the biggest communities.
Thekingoflorda@lemmy.world 9 months ago
Just chiming in here: there are some problems with federation at the moment. I’m an admin on LW, and we generally remove spam pretty quickly, but the removals currently don’t federate quickly. We are working on temporary fixes until the Lemmy devs fix it properly.
UndercoverUlrikHD@programming.dev 9 months ago
Be diligent with reporting, and consider switching instances if your admins aren’t very active.
Nighed@sffa.community 9 months ago
The reports go to the community mods rather than your instance admins though, don’t they?
UndercoverUlrikHD@programming.dev 9 months ago
Any reports you make are visible to the admins of your instance.
E.g. if you make a report, the community mods may choose to ignore it while your admins choose to remove it for everyone using their instance.
Everything you see on Lemmy is through the eyes of your instance; people on other instances may see different things. E.g. some instances censor certain slurs, but that doesn’t affect users outside that instance. (De)federation also dictates which comments you will see on a post.
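To make the instance-scoped part concrete, here’s a rough sketch of the idea (hypothetical Python, not Lemmy’s actual code; Lemmy itself is written in Rust, and its real slur filter is an admin-configured regex):

```python
import re

# Hypothetical sketch: each instance renders federated content through
# its own filter, so the same comment can look different per instance.
# Names here are illustrative, not Lemmy's actual API.

class Instance:
    def __init__(self, name: str, slur_filter_regex: str | None = None):
        self.name = name
        self.filter = (
            re.compile(slur_filter_regex, re.IGNORECASE)
            if slur_filter_regex else None
        )

    def render(self, comment: str) -> str:
        # Filtering happens on this instance only; other instances
        # receive the original text via federation.
        if self.filter:
            return self.filter.sub("*removed*", comment)
        return comment

strict = Instance("strict.example", r"badword1|badword2")
lax = Instance("lax.example")

comment = "some text containing badword1"
print(strict.render(comment))  # -> "some text containing *removed*"
print(lax.render(comment))     # -> original text, unchanged
```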
jeena@jemmy.jeena.net 9 months ago
Sure, spam is bad, but I can just ignore it. Last week, though, there was an attack with CSAM that showed up while casually browsing new, and that made me not want to open Lemmy anymore.
I think that is what needs to be fixed before we can tackle spam.
misk@sopuli.xyz 9 months ago
Whatever is done to fight spam should be useful in fighting CSAM too. The latest “AI” boom could prove lucky for non-commercial social networks, since content recognition is exactly the kind of thing machine learning is good at. Obviously it adds a significant cost, so users pitching in to cover running costs will have to become more common.
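For a sense of what the cheap half of content recognition looks like, here’s a minimal sketch of perceptual-hash matching against known images (using the real Pillow and imagehash libraries; the hash list and threshold below are made up for illustration). Matching known-bad hashes is cheap; it’s the ML classifiers for previously unseen content that need the expensive compute:

```python
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known-bad images.
# Real deployments match against curated hash databases (e.g. PhotoDNA-style
# lists maintained by child-safety organizations).
KNOWN_BAD_HASHES = {
    imagehash.hex_to_hash("fa5c1f0e3d2b4a69"),
}
MAX_DISTANCE = 5  # Hamming distance tolerance for near-duplicates

def is_known_bad(path: str) -> bool:
    h = imagehash.phash(Image.open(path))
    # ImageHash subtraction yields the Hamming distance between hashes.
    return any(h - known <= MAX_DISTANCE for known in KNOWN_BAD_HASHES)

if is_known_bad("upload.jpg"):
    print("reject upload")
```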
UndercoverUlrikHD@programming.dev 9 months ago
Admins are actively looking into solutions. Nobody wants that stuff stored on their server, and there’s a bunch of legal stuff you must do when it happens.
One of the problems is the cost of the compute power needed to scan pictures for CSAM before upload, which makes it not viable for many instances. Lemmy.world is moving towards only allowing images hosted on whitelisted sites, I think.
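A whitelist check like that is basically free compared to image scanning; roughly something like this (a hypothetical sketch, the host list is made up):

```python
from urllib.parse import urlparse

# Hypothetical allowlist; lemmy.world's actual list may differ.
ALLOWED_IMAGE_HOSTS = {"imgur.com", "catbox.moe"}

def image_url_allowed(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Accept exact matches and subdomains of allowed hosts.
    return any(
        host == h or host.endswith("." + h)
        for h in ALLOWED_IMAGE_HOSTS
    )

print(image_url_allowed("https://i.imgur.com/abc.png"))  # True
print(image_url_allowed("https://evil.example/x.png"))   # False
```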