[…] reCAPTCHA […] isn’t to detect bots. It’s more about stopping automated requests […]
which is bots. Bots make automated requests, and anything that makes automated requests can be called a bot. Web crawlers are bots too, and the polite ones respect robots.txt, which has “bots” in its name for this very reason (“bot” is short for “robot”). Using different words doesn’t change the reality behind them, but it can suggest someone is trying to pull something on the other party.
nickwitha_k@lemmy.sdf.org 3 months ago
There are much better ways of rate limiting that don’t steal labor from people.
serenissi@lemmy.world 3 months ago
hCaptcha, Microsoft’s CAPTCHA, they all do the same. Can you give an example of one that can’t easily be overcome just with better compute hardware?
nickwitha_k@lemmy.sdf.org 3 months ago
The problem is the unethical use of software that does not do what it claims and instead uses end users for free labor. The solution is not to use it. For rate limiting, a proxy/load balancer like HAProxy will accomplish the task easily. Ex:
haproxy.com/…/haproxy-forwards-over-2-million-htt…
haproxy.com/…/four-examples-of-haproxy-rate-limit…
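For illustration, here is a minimal sketch of what IP-based rate limiting along the lines of those articles looks like in an HAProxy config. The frontend/backend names, addresses, and thresholds are made up for the example:

```
frontend web
    bind :80
    # Track client IPs in a stick table, keeping a 10s sliding
    # window of each IP's HTTP request rate
    stick-table type ip size 100k expire 30s store http_req_rate(10s)
    http-request track-sc0 src
    # Reject clients exceeding 20 requests per 10 seconds
    http-request deny deny_status 429 if { sc_http_req_rate(0) gt 20 }
    default_backend servers

backend servers
    server app1 127.0.0.1:8080
```

Clients that exceed the threshold get a 429 response with no puzzle-solving involved; the limit is purely request-rate based.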