I got into the self-hosting scene this year with my own website running on an old recycled ThinkPad over my home connection. A lot of time was spent learning about ufw, reverse proxies, security header hardening, and fail2ban.
Despite all that I still had a problem with bots knocking on my ports and spamming my logs. I tried some hackery to get fail2ban to read Caddy's logs, but that didn't work for me. I nearly gave up and went with Cloudflare like half the internet does, but my stubbornness about open-source self-hosting, plus the Cloudflare outages this year, pushed me to try alternatives.
Coinciding with that, I kept running into this thing on sites I frequent, like Codeberg: Anubis, a proxy-style firewall that makes the browser complete a proof-of-work check (plus some other clever tricks) to stop bots from knocking. I got interested and started thinking about beefing up my security.
I'm here to tell you to try it if you have a public-facing site and want to break away from Cloudflare. It was VERY easy to install and configure with a Caddyfile on a Debian distro with systemctl. Within an hour it had filtered multiple bots, and so far the knocks seem to have slowed down.
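Roughly, Caddy hands every request to Anubis, and Anubis forwards the ones that pass its checks on to the real site. A stripped-down sketch of the idea (site name and ports are placeholders, and double-check the env var names against the Anubis docs for your version):

    # Caddyfile: send the site to Anubis instead of straight to the app
    example.com {
        reverse_proxy localhost:8923
    }

    # Anubis environment (env file loaded by the systemd unit)
    BIND=:8923                      # where Anubis listens for Caddy
    TARGET=http://localhost:3000    # the actual site Anubis protects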
non_burglar@lemmy.world 3 weeks ago
Anubis is an elegant solution to the AI scraper bot issue; I just wish the solution to everything wasn't spending compute everywhere. In a world where we need to rethink our energy consumption and generation, even on clients, this is a stupid use of computing power.
Dojan@pawb.social 3 weeks ago
It also doesn’t function without JavaScript. If you’re security- or privacy-conscious, chances are not zero that you have JS disabled, in which case this presents a roadblock.
On the flip side of things, if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.
SmokeyDope@piefed.social 2 weeks ago
There's a challenge option that doesn't require JavaScript. It's on site owners to configure it, IMO, though you can make the argument that it's not the default, I guess.
https://anubis.techaro.lol/docs/admin/configuration/challenges/metarefresh
From the docs on the Meta Refresh method:

Meta Refresh (No JavaScript)

The metarefresh challenge sends a browser a much simpler challenge that makes it refresh the page after a set period of time. This enables clients to pass challenges without executing JavaScript. To use it in your Anubis configuration:

This is not enabled by default while this method is tested and its false positive rate is ascertained. Many modern scrapers use headless Google Chrome, so this will have a much higher false positive rate.
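The snippet from the docs didn't paste in above, but the bot-policy entry is roughly shaped like this (I'm going from memory here, so verify the field names against the page linked above):

    # botPolicies.yaml sketch: challenge browsers with the no-JS metarefresh method
    bots:
      - name: generic-browser
        user_agent_regex: Mozilla
        action: CHALLENGE
        challenge:
          difficulty: 1
          report_as: 4
          algorithm: metarefresh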
natecox@programming.dev 3 weeks ago
I feel comfortable hating on Anubis for this. The compute cost per validation is vanishingly small to someone with the budget to run a cloud scraping farm; it's just another cost of doing business.
The cost to actual users though, particularly to lower income segments who may not have compute power to spare, is annoyingly large. There are plenty of complaints out there about Anubis being painfully slow on old or underpowered devices.
Some of us do actually prefer to use the internet minus JS, too.
Plus the minor irritation of having anime catgirls suddenly be a part of my daily browsing.
cecilkorik@piefed.ca 2 weeks ago
I’m with you here. I come from an older time on the Internet. I’m not much of a creator, but I do have websites, and unlike many self-hosters I think, in the spirit of the internet, they should be open to the public as a matter of principle, not cowering away for my own private use behind some encrypted VPN. I want it to be shared. Sometimes that means taking a hammering. It’s fine. It’s nothing that’s going to end the world if it goes down or goes away, and I try not to make a habit of being so irritating that anyone would have much legitimate reason to target me.
I don’t like any of these sort of protections that put the burden onto legitimate users. I get that’s the reality we live in, but I reject that reality, and substitute my own. I understand that some people need to be able to block that sort of traffic to be able to limit and justify the very real costs of providing services for free on the Internet and Anubis does its job for that. But I’m not one of those people. It has yet to cost me a cent above what I have already decided to pay, and until it does, I have the freedom to adhere to my principles on this.
To paraphrase another great movie: why should any legitimate user be inconvenienced when the bots are the ones who suck? I refuse to punish the wrong party.
quick_snail@feddit.nl 2 weeks ago
This is why we need these sites to have .onions. Tor Browser has a PoW that doesn't require JS.
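On the service side it's only a couple of torrc lines (option names as I remember them from Tor 0.4.8+, so check the Tor manual):

    # torrc sketch: onion service with the proof-of-work DoS defense enabled
    HiddenServiceDir /var/lib/tor/my_site
    HiddenServicePort 80 127.0.0.1:8080
    HiddenServicePoWDefensesEnabled 1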
cadekat@pawb.social 2 weeks ago
Scarcity is what powers this type of challenge: you have to prove you spent a certain amount of electricity in exchange for access to the site, and because electricity isn’t free, this imposes a dollar cost on bots.
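The mechanism is basically hash grinding: the client tries nonces until a hash falls under a target, and the server checks the answer with a single hash. A toy sketch of the general idea (not Anubis's actual code):

    import hashlib, itertools

    def solve(challenge: str, difficulty_bits: int) -> int:
        # Grind nonces until sha256(challenge + nonce) falls below the target.
        # On average this takes 2**difficulty_bits hashes; that's the electricity being "spent".
        target = 1 << (256 - difficulty_bits)
        for nonce in itertools.count():
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce

    def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
        # The server needs only one hash to confirm the client did the work.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

    nonce = solve("server-issued-challenge", 20)   # roughly a million hashes on average
    assert verify("server-issued-challenge", nonce, 20)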
You could skip the detour through hashes/electricity and do something with a proof-of-stake cryptocurrency, and just pay for access. The site owner actually gets compensated instead of burning dead dinosaurs.
Obviously there are practical roadblocks to this today that a JavaScript proof-of-work challenge doesn’t face, but longer term…
natecox@programming.dev 2 weeks ago
The cost here only really impacts regular users, too. The type of users you actually want to block have budgets which easily allow for the compute needed anyways.
artyom@piefed.social 2 weeks ago
Maybe if the act of transferring crypto didn’t use a comparable or greater amount of energy…
daniskarma@lemmy.dbzer0.com 2 weeks ago
I think the issue is that many sites are too aggressive with it. Anubis can be configured to only ask for challenges when the site is under unusual load, for instance when a botnet is actually DDoSing it. That's when it shines.
Making it constantly ask for challenges when the service is not under attack is just a massive waste of energy. And many sites enable it permanently because it keeps bot pings out of their logs, which is, for instance, what OP is doing. It's just a big misunderstanding of the tool.
quick_snail@feddit.nl 2 weeks ago
We have memory-hard cryptographic functions.
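Meaning the challenge doesn't have to be pure CPU burn: a memory-hard function like scrypt or Argon2 makes every guess cost RAM and memory bandwidth too, which narrows the gap between a scraping farm's hardware and a normal phone. A toy sketch with scrypt from Python's standard library (parameters are purely illustrative):

    import hashlib, itertools, os

    def solve_memory_hard(challenge: bytes, difficulty_bits: int) -> int:
        # Each attempt runs scrypt with n=2**14, r=8, which needs about 16 MiB of RAM,
        # so the work is bound by memory as much as by raw CPU cycles.
        target = 1 << (256 - difficulty_bits)
        for nonce in itertools.count():
            digest = hashlib.scrypt(nonce.to_bytes(8, "big"), salt=challenge,
                                    n=2**14, r=8, p=1, dklen=32)
            if int.from_bytes(digest, "big") < target:
                return nonce

    challenge = os.urandom(16)                 # server-issued challenge
    print(solve_memory_hard(challenge, 6))     # tiny difficulty: each guess is already slow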