Severely disrupting data processing that is of significant importance to someone else *by submitting malicious data* requires intent to cause harm; the other variants (physical destruction, deletion, etc.) don’t.
Why the hell would an ISP take a look at this? And even if they did, they’re professional enough to detect zip bombs. Which, btw, is why this whole thing is pointless anyway: if you can class requests as malicious, just don’t serve them. Much more sensible to go the Anubis route and demand proof of work, as that catches crawlers that come from a gazillion IPs with different user agents etc.
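For the curious, a hash-based proof-of-work challenge can be sketched in a few lines. This is a generic illustration, not Anubis’s actual scheme: the server hands out a random challenge string and a difficulty, the client must find a nonce whose hash meets the difficulty, and the server verifies with a single hash. The function names and the "leading zero hex digits" criterion are my own choices for the sketch.

```python
import hashlib
import itertools

def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so sha256(challenge + nonce) starts with
    `difficulty` zero hex digits. This is the expensive part the
    crawler has to pay for on every request."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Checking a solution costs the server exactly one hash."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve("abc123", 4)
assert verify("abc123", nonce, 4)
```

The asymmetry is the whole point: solving scales exponentially with difficulty, verifying doesn’t, and the cost is per-request regardless of which IP or user agent the crawler shows up with.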
MimicJar@lemmy.world 3 hours ago
I wonder if having a robots.txt file that said to ignore the file/path would help.
I’m assuming a bad bot would ignore the robots.txt file. So you could argue that you put up a clear sign and they chose to ignore it.
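For illustration, the "clear sign" could look something like this (the path is hypothetical, standing in for wherever the trap file is actually served):

```
User-agent: *
Disallow: /traps/
```

A well-behaved crawler that honors robots.txt never touches `/traps/`; anything that fetches it anyway has demonstrably ignored the posted rule.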