Comment on What steps can be taken to prevent AI training and scraping of my public facing website?
Dekkia@this.doesnotcut.it 2 days ago
The idea behind Anubis is that a browser needs to deliver a proof-of-work before it can access a website.
If you’re doing it one-off with Puppeteer, your “browser” will happily do just that.
But if you’re scraping millions of websites, short challenges like this add up quickly and you’ll end up wasting lots of compute on them. As long as scrapers decide those websites aren’t worth the extra compute, Anubis works.
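Anubis’s real challenge format differs in its details, but the general hashcash-style idea can be sketched like this: the server hands out a random challenge string, the client burns CPU searching for a nonce whose hash meets a difficulty target, and the server verifies the answer with a single cheap hash. The function names and difficulty value here are illustrative, not Anubis’s actual API.

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that SHA-256(challenge + nonce)
    starts with `difficulty` hex zeros. This is the expensive part."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: checking a submitted solution costs one hash."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# Each extra hex digit of difficulty multiplies the expected client work
# by 16, while verification stays constant-time for the server. That
# asymmetry is why per-page challenges add up for a bulk scraper but
# stay negligible for a single human visitor.
nonce = solve_challenge("example-challenge", 4)
assert verify("example-challenge", nonce, 4)
```

A one-off Puppeteer run pays this cost once and moves on; a crawler hitting millions of protected pages pays it millions of times.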
snoons@lemmy.ca 2 days ago
The only stable Invidious instance I know of is now a heck of a lot more stable thanks to it, too.