Comment on What steps can be taken to prevent AI training and scraping of my public facing website?

Dekkia@this.doesnotcut.it 2 days ago

The idea behind Anubis is that a browser must deliver a proof-of-work solution before it is allowed to access a website.

If you’re doing a one-off scrape with Puppeteer, your “browser” will happily do just that.

But if you’re scraping millions of websites, short challenges like this add up quickly, and you’ll end up wasting lots of compute on them. Anubis works as long as scrapers decide those websites aren’t worth the cost.
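To make the economics concrete, the general shape of a proof-of-work challenge can be sketched like this (a minimal illustration only, assuming a SHA-256 leading-zeros scheme; the function names and parameters here are made up and are not Anubis's actual API):

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) starts with
    `difficulty` hex zeros -- cheap for one page load, expensive at scale.
    (Illustrative sketch, not Anubis's real protocol.)"""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# A single challenge takes a fraction of a second in the browser...
nonce = solve_challenge("example-challenge", 4)
# ...but a scraper paying that cost millions of times burns real compute.
```

The server only has to do one hash to verify the nonce, so the cost is asymmetric: trivial for the site, cumulative for a bulk scraper.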
