Even better. Build a WordPress plugin to do this.
Comment on AI companies are violating a basic social contract of the web and ignoring robots.txt
lvxferre@mander.xyz 10 months ago
Good old honeytrap. I’m not sure, but I think that it’s doable.
Have a honeytrap page somewhere on your website. Make sure that legit users won’t access it. Disallow crawling it through robots.txt.
Then if some crawler still accesses it, you could simply record the hit in your logs and ban the relevant IPs. Or you could be really nasty and fill the page with poison: nonsensical text that looks like something a human would write.
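For the curious, here’s a minimal sketch of the trap, assuming a small Flask app; the /honeytrap path, the word list, and the in-memory ban set are illustrative placeholders (a real deployment would push bans to a firewall or reverse proxy):

```
# robots.txt -- well-behaved crawlers will never request this path
User-agent: *
Disallow: /honeytrap
```

```python
# Hypothetical honeytrap endpoint: any visitor here ignored robots.txt.
import random

from flask import Flask, abort, request

app = Flask(__name__)
BANNED_IPS = set()  # in-memory for the sketch; persist this in practice

WORDS = ["lorem", "ipsum", "synergy", "quantum", "artisanal", "bespoke"]

def poison_text(n_sentences=50):
    """Generate plausible-looking nonsense to feed misbehaving crawlers."""
    sentences = []
    for _ in range(n_sentences):
        picked = random.choices(WORDS, k=random.randint(5, 15))
        sentences.append(" ".join(picked).capitalize() + ".")
    return " ".join(sentences)

@app.before_request
def block_banned():
    # Reject every request from an IP that already hit the trap.
    if request.remote_addr in BANNED_IPS:
        abort(403)

@app.route("/honeytrap")
def honeytrap():
    ip = request.remote_addr
    BANNED_IPS.add(ip)
    app.logger.warning("honeytrap hit; banning %s", ip)
    return poison_text()  # the "nasty" option: serve rubbish, not a 404
```

Serving the poison from the disallowed path means only clients that ignored robots.txt ever see it.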
thefactremains@lemmy.world 10 months ago
KairuByte@lemmy.dbzer0.com 10 months ago
I’m the idiot human that digs through robots.txt and the site map to see things that aren’t normally accessible by an end user.
lol@discuss.tchncs.de 10 months ago
[deleted]
lvxferre@mander.xyz 10 months ago
For banning: I’m not sure, but I don’t think so. It seems to me that prefetching behaviour is dictated by the page doing the linking, so to avoid any issue all the site owner needs to do is not emit prefetch hints for links to the honeytrap.
For poisoning: I’m fairly certain that it doesn’t. At most you’d prefetch a page full of rubbish.
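(For reference: browsers generally only prefetch a link when the page explicitly asks for it, so the fix is simply to never emit that hint for the trap. A hypothetical example:)

```html
<!-- Explicit hint: the browser may fetch this page ahead of time. -->
<link rel="prefetch" href="/popular-article">

<!-- Plain link to the trap, hidden from humans: no prefetch hint,
     so a normal browser won't touch it without an actual click. -->
<a href="/honeytrap" style="display:none">ignore this</a>
```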
CosmicTurtle@lemmy.world 10 months ago
I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build NaCL lists by posting an email address on your website somewhere that was visible in the source code but not visible to normal users, like in a div pushed way off the left side of the screen.
Anyway, spammers that run regular-expression searches for email addresses would email it and get their IPs added to naughty lists.
I’d love to see something similar with robots.
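(The hidden-address trick looks something like this; the address and offset are made up:)

```html
<!-- Spam trap: shoved off-screen so humans never see it, but scrapers
     running regexes over the raw HTML still harvest the address. -->
<div style="position:absolute; left:-9999px;" aria-hidden="true">
  trap-address@example.com
</div>
```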
lvxferre@mander.xyz 10 months ago
Yup, it’s the same approach as email spam traps, except for the naughty list. But… holy fuck, that’s a great idea: if people shared their bot IP banlists with each other, the damage to those web-crawling businesses would be even bigger.
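(A sketch of what sharing could look like, assuming each site publishes its trapped IPs as a plain-text file; the peer URLs are invented:)

```python
# Merge hypothetical peer banlists: one IP per line, '#' for comments.
import urllib.request

PEER_LISTS = [
    "https://example.org/banlist.txt",
    "https://example.net/banlist.txt",
]

def fetch_banlist(url):
    """Download one peer's published list of trapped crawler IPs."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return {
        line.strip()
        for line in text.splitlines()
        if line.strip() and not line.startswith("#")
    }

def merged_banlist():
    # Union of every reachable peer's list; merge with local bans as needed.
    banned = set()
    for url in PEER_LISTS:
        try:
            banned |= fetch_banlist(url)
        except OSError:
            pass  # peer unreachable; skip it this round
    return banned
```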
Nighed@sffa.community 10 months ago
But with all of the cloud resources now, you can cycle through IP addresses without any trouble. Hell, you could just crawl via IPv6 and not worry at all, given how cheap those addresses are!
lvxferre@mander.xyz 10 months ago
Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.