Comment on Digg Shut Down
comador@lemmy.world 12 hours ago
TL;DR:
AI bots and AI agents destroyed it.
Seriously a real problem, and as a sysadmin for various websites, I loathe them daily.
If Cisco, F5, etc. could invent a way to block these bots at the firewall and load balancer level, they'd make billions.
frongt@lemmy.zip 12 hours ago
You can do that! You just have to block known cloud service providers, known scraper ASNs, and, while this isn't at the firewall level, a captcha or other challenge like Cloudflare or Anubis.
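A minimal sketch of the cloud-provider blocking idea, checking a client IP against datacenter CIDR ranges. The ranges below are hypothetical placeholders; in practice you'd pull real prefixes from provider-published IP lists or an ASN-to-prefix database:

```python
import ipaddress

# Hypothetical sample of cloud/datacenter CIDR ranges -- NOT a real blocklist.
# Real deployments refresh these from provider feeds or ASN lookups.
BLOCKED_RANGES = [
    ipaddress.ip_network("3.0.0.0/8"),
    ipaddress.ip_network("34.64.0.0/10"),
    ipaddress.ip_network("13.64.0.0/11"),
]

def is_datacenter_ip(client_ip: str) -> bool:
    """Return True if the client IP falls inside a known cloud/scraper range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_RANGES)
```

Residential traffic passes, while requests from the listed ranges can be dropped or sent to a challenge page. The same membership check is what firewall-level blocklists and nginx geo maps do, just faster.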
silverneedle@lemmy.ca 9 hours ago
lmao
Skavau@piefed.social 12 hours ago
Indeed, but zero mod tools other than “delete post” 2 months in was genuinely laughable. To be frank, it should’ve launched with proper moderation: delete posts, ban users, sticky posts, filters for post-types etc. This is standard stuff that users shouldn’t even have to haggle for.
If they gave community moderators proper tools to help them here and put up walls - they could’ve mitigated a lot of this.
otter@lemmy.ca 12 hours ago
From what I remember, they were going to “use AI” to handle moderation. It felt like a grift from the beginning
Skavau@piefed.social 12 hours ago
A Reddit-styled site where AI handles community moderation decisions isn't Reddit. Communities aren't communities; they're just hashtags.
daychilde@lemmy.world 9 hours ago
They certainly didn’t have enough coders for the project. It needed a hell of a lot more features more quickly.
It also didn’t take off with users. Maybe because of the features, maybe just standard network effects, hard to say.
I believe bots were part of the failure, but I don't think they were the whole reason at all. I think they're just the part of the failure the site chose to focus on.
It was not a successful site.
Skavau@piefed.social 9 hours ago
I think you can reasonably blame the lack of features here, honestly. I'm not saying they would have challenged Reddit with them, but they'd have been much more active. Community moderators almost certainly lost interest when they realised they had no real control over their communities, and the longer they went without tools, the more drifted away - leaving abandoned communities where AI, bots, and trolls moved in, compounding it even further.
They also, on day 1 of their community launch, allowed day 1 old accounts to make communities. Even if each account could only moderate 2 communities, that wasn’t smart at all.