Search engines have been crawling the web relatively gently for decades now. But the crawlers from AI companies basically DDoS hosts in comparison, sending huge numbers of requests in short intervals. They crawl dynamic links that are expensive to render compared to a static page, ignore robots.txt entirely, or even use it to discover unlinked pages.
Servers have finite resources, especially self-hosted sites, while AI companies have disproportionately more at their disposal, easily grinding other systems to a halt by overwhelming them with requests.
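For contrast, this is what a well-behaved crawler is supposed to do: parse robots.txt and honor its rules before fetching anything. A minimal sketch using Python's standard-library urllib.robotparser (the rules and the bot name are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt a site might serve.
rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /search
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A polite crawler asks before every fetch and respects the delay.
print(rp.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "https://example.com/blog/post"))     # True
print(rp.crawl_delay("MyBot"))                                    # 10
```

Note the irony: those Disallow lines are exactly what a misbehaving crawler can harvest to find pages the site never linked publicly.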
drspod@lemmy.ml 12 hours ago
What’s best for the website owners is to have people actually visit and interact with their website. Blocking AI tools is consistent with that.
panda_abyss@lemmy.ca 11 hours ago
For a lot of AI search I actually end up reading the pages, so I don’t know how much this stops that
AstralPath@lemmy.ca 11 hours ago
You’re the outlier, I promise. People are literally forfeiting their brains in favor of an LLM transplant these days.
pennomi@lemmy.world 11 hours ago
On the flip side, most websites are so ad-ridden these days that a reader mode or other summary tool is almost required for normal browsing. Not saying that AI is the right move, but I can understand not wanting to visit the actual page any more.