Comment on What steps can be taken to prevent AI training and scraping of my public facing website?

toothbrush@lemmy.blahaj.zone 1 week ago

Best practice right now is Anubis (a proof-of-work challenge proxy you put in front of your site). If you want to go a bit further and fight back against bots that ignore robots.txt, you can set up an infinite web of garbage pages linking to more garbage in a hidden part of your website; a rough sketch of that idea is below. Be aware that keeping them busy will cost you bandwidth.
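
A minimal sketch of that kind of garbage maze, assuming a small Python/Flask app — the /maze/ path, word list, and link count are all made up for illustration, not anything the comment specifies. Every page deterministically generates filler text plus links deeper into the maze, so a crawler that ignores robots.txt never runs out of URLs:

```python
# Hypothetical link-maze sketch: endless pages of filler linking to more pages.
import hashlib
import random

from flask import Flask, Response

app = Flask(__name__)

# Illustrative word list for generating filler text and link slugs.
WORDS = ["garbage", "widget", "lorem", "archive", "node", "blob", "item", "page"]


def seeded_rng(path: str) -> random.Random:
    """Derive a deterministic RNG from the request path so each URL is stable."""
    digest = hashlib.sha256(path.encode()).digest()
    return random.Random(int.from_bytes(digest[:8], "big"))


@app.route("/maze/")
@app.route("/maze/<path:subpath>")
def maze(subpath: str = "") -> Response:
    rng = seeded_rng(subpath)
    # Filler paragraph plus ten links that go one level deeper into the maze.
    filler = " ".join(rng.choices(WORDS, k=200))
    links = "".join(
        f'<a href="/maze/{subpath}{rng.choice(WORDS)}-{rng.randint(0, 9999)}/">more</a> '
        for _ in range(10)
    )
    html = f"<html><body><p>{filler}</p>{links}</body></html>"
    return Response(html, mimetype="text/html")


if __name__ == "__main__":
    app.run()
```

You would link to /maze/ from somewhere humans won't see or click (and disallow it in robots.txt), so only bots that violate robots.txt wander in. Each page is cheap to generate, but the bandwidth cost mentioned above still applies.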
