Comment on AI companies are violating a basic social contract of the web and ignoring robots.txt
CosmicTurtle@lemmy.world 9 months ago
I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build NaCL lists by posting an email address on your website somewhere that was visible in the source code but not to normal users, like in a div positioned far off the left side of the screen.
Anyway, spammers that do regular expression searches for email addresses would email it and get their IPs added to naughty lists.
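The trap described above can be sketched in a few lines. This is a hypothetical illustration, not any specific trap service: the address, domain, and regex are made up, but they show why a regex harvester scanning raw HTML falls for bait that a human reader never sees.

```python
import re

# A page with a "spam trap": the address is present in the markup but
# hidden from human visitors (here, positioned far off-screen with CSS).
# The address and domain are invented for illustration.
page_html = """
<p>Contact us via the form below.</p>
<div style="position:absolute; left:-9999px;">
  trap@example.com
</div>
"""

# A naive harvester just regexes the raw HTML, so it finds the bait
# that a rendered browser view never shows.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
harvested = EMAIL_RE.findall(page_html)
print(harvested)  # ['trap@example.com']

# Any mail later sent to that address proves the sender scraped the
# page, so its source IP can be added to a naughty list.
```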
I’d love to see something similar with robots.
lvxferre@mander.xyz 9 months ago
Yup, it’s the same approach as email spam traps, minus the naughty list. But… holy fuck, that’s a great idea: if people shared their bot IP banlists with each other, the damage to those web crawling businesses would be even bigger.
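The shared-banlist idea boils down to taking the union of everyone's trap hits. A minimal sketch, assuming sites publish their lists somewhere peers can fetch them (the IPs below are reserved documentation addresses, and the function name is hypothetical):

```python
# Each site records the source IPs that hit its trap, publishes them,
# and merges in lists fetched from peers.
local_hits = {"203.0.113.7", "198.51.100.23"}   # IPs caught by our own trap
peer_list = {"203.0.113.7", "192.0.2.99"}       # published by another site

shared_banlist = local_hits | peer_list         # union of everyone's hits

def is_banned(ip: str) -> bool:
    """Deny access to any crawler seen misbehaving anywhere."""
    return ip in shared_banlist

print(is_banned("192.0.2.99"))  # True: banned via a peer's report
print(is_banned("192.0.2.1"))   # False: never seen misbehaving
```

A crawler caught on one site is then locked out of every participating site, which is what would make the ban actually hurt.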
Nighed@sffa.community 9 months ago
But with all of the cloud resources now, you can switch through IP addresses without any trouble. Hell, you could just browse over IPv6 and not even worry, given how cheap those addresses are!
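The scale of the IPv6 problem is easy to show with the standard library: a single routed /64, which a cloud tenant commonly gets, already holds 2^64 addresses, so per-address bans are hopeless. A common mitigation (not mentioned in the thread, added here as an assumption) is to ban the covering prefix instead:

```python
import ipaddress

# One /64 holds 2**64 addresses, so a crawler can use a fresh address
# per request. 2001:db8::/64 is the reserved documentation prefix.
block = ipaddress.ip_network("2001:db8::/64")
print(block.num_addresses)  # 18446744073709551616

# Mitigation: collapse a misbehaving address to its /64 and ban the
# whole prefix rather than the single address.
seen = ipaddress.ip_address("2001:db8::1f2e")
prefix = ipaddress.ip_network(f"{seen}/64", strict=False)
print(seen in prefix)  # True: one ban entry covers the whole block
```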
lvxferre@mander.xyz 9 months ago
Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.
Nighed@sffa.community 9 months ago
That’s when Google’s browser DRM thing starts sounding like a good idea 😭