Comment on Codeberg: an army of AI crawlers is slowing us down badly; the AI crawlers have learned how to solve the Anubis challenges.

umbraroze@slrpnk.net 10 hours ago

The problem isn’t that the data is already public.

The problem is that the AI crawlers want to check on it every 5 minutes, even if you tell all crawlers that the file is only updated daily, or that it hasn't changed in a month.

AI crawlers don't care about robots.txt or the other helpful hints about what's worth crawling and when it's a good time to crawl again.
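For context, the hints being ignored look something like this: a (non-standard, advisory-only) `Crawl-delay` directive in robots.txt, and a `<changefreq>`/`<lastmod>` entry in a sitemap. The paths and values below are hypothetical examples, not Codeberg's actual configuration:

```
# robots.txt — Crawl-delay is non-standard and purely advisory;
# well-behaved crawlers honor it, the crawlers in question do not.
User-agent: *
Crawl-delay: 60

# sitemap.xml — changefreq/lastmod are hints that a page only
# changes daily and hasn't changed in a month:
# <url>
#   <loc>https://example.org/some-file</loc>
#   <lastmod>2024-05-01</lastmod>
#   <changefreq>daily</changefreq>
# </url>
```

None of these mechanisms are enforceable; they rely entirely on crawlers choosing to cooperate, which is exactly the complaint here.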
