Because AI bots ignore robots.txt (especially when you don’t explicitly name their user agent and instead use a * wildcard), more and more people are implementing exactly that, and I wouldn’t be surprised if that is what triggered the need to implement robots.txt support for FediDB.
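For illustration (not from the thread), a robots.txt that names crawlers explicitly instead of relying only on the * wildcard might look like the following; the two named tokens are published crawler user agents, but a real deployment should check each crawler’s own documentation for the current names:

```
# Name specific crawlers instead of relying only on the wildcard.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# The wildcard still covers everything else.
User-agent: *
Disallow: /
```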
Comment on FediDB has stopped crawling until they get robots.txt support
Semi_Hemi_Demigod@lemmy.world 8 months ago
Robots.txt is a lot like email in that it was built for a far simpler time.
It would be better if the server could detect bots and send them down a rabbit hole rather than trusting randos to abide by the rules. (A rough sketch of that idea follows this thread.)
- poVoq@slrpnk.net 8 months ago
- jagged_circle@feddit.nl 8 months ago
  It is not possible to detect bots. Attempting to do so will invariably lead to false positives, denying the most at-risk & marginalized folks access to your content.
  Just implement a cache and forget about it (a rough sketch follows this comment chain). If read-only content is causing you too much load, you’re doing something terribly wrong.
  - bhamlin@lemmy.world 8 months ago
    While I agree with you, the quantity of robots has greatly increased of late. While still not as numerous as users, they hit every link and wreck your caches by not focusing on hotspots like humans do.
    - jagged_circle@feddit.nl 8 months ago
      You need a bigger cache. If you don’t have enough RAM, host it on a CDN.
      - bhamlin@lemmy.world 8 months ago
        Sure thing! Help me pay for it?
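A minimal sketch of that caching suggestion, using only the Python standard library; the `render()` stand-in, the five-minute TTL, and port 8080 are invented for illustration. Pages are rendered once, kept in memory, and served with Cache-Control headers so a CDN or reverse proxy in front can absorb repeat hits:

```python
# Sketch of "just cache read-only content": render each page once,
# keep it in memory, and let downstream caches serve the repeats.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

TTL = 300  # seconds a cached page stays fresh
_cache: dict[str, tuple[float, bytes]] = {}  # path -> (rendered_at, body)

def render(path: str) -> bytes:
    # Stand-in for whatever expensive work normally builds the page.
    return f"<html><body>content for {path}</body></html>".encode()

class CachedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = _cache.get(self.path)
        if entry is None or time.time() - entry[0] > TTL:
            entry = (time.time(), render(self.path))
            _cache[self.path] = entry
        body = entry[1]
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        # Lets a CDN or reverse proxy in front serve repeat hits
        # (including crawler traffic) without touching this server.
        self.send_header("Cache-Control", f"public, max-age={TTL}")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), CachedHandler).serve_forever()
```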
 
 
- Fredthefishlord@lemmy.blahaj.zone 8 months ago
  False positives? Meh, who cares … that’s what appeals are for.
  - jagged_circle@feddit.nl 8 months ago
    Every time I tried to appeal, I either got no response or met someone who had no idea what I was talking about.
    Occasionally a bank fixes the problem for a week. Then it’s back again.
 
 
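As a rough sketch of the rabbit-hole idea mentioned above (tools like Nepenthes, named just below, do this for real), here is a stdlib-only Python server; the /maze/ path scheme, link count, and delay are all invented for illustration. Every request is answered, slowly, with a page of generated links that lead only deeper into the maze:

```python
# Rabbit-hole sketch: a crawler that ignores the rules wastes its
# time here instead of hammering the real site.
import hashlib
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hash the current path so the maze looks stable and
        # deterministic to the crawler but never terminates.
        seed = hashlib.sha256(self.path.encode()).hexdigest()
        links = "".join(
            f'<a href="/maze/{seed[i:i + 8]}/">{seed[i:i + 8]}</a><br>'
            for i in range(0, 40, 8)  # five links per page
        )
        body = f"<html><body>{links}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        time.sleep(2)  # drip-feed the response to tie the bot up
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), MazeHandler).serve_forever()
```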
swizzlestick@lemmy.zip 8 months ago
Already possible: Nepenthes.
Semi_Hemi_Demigod@lemmy.world 8 months ago
I’m sold
Skepticpunk@lemmy.world 8 months ago
Ooh, nice.
merthyr1831@lemmy.ml 8 months ago
that website feels like uncovering a piece of ancient alien weaponry