If you have billions of targets to scan, there's generally no need to handle each and every edge case. Just ignoring what you can't easily understand and moving on to the next target is an absolutely viable strategy. You will never be able to process everything anyway.
Of course, it changes a bit if some of these targets actually make your bot crash. If that happens too often, you will want to harden your bot against it. Then again, if it only happens every now and then, it's still much easier to just restart and continue with the next target.
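As a rough illustration of that approach (not from the post above), here is a minimal Python sketch that isolates each target in its own process, so a hard crash or hang only kills that worker and the scan just continues with the next target; `scan_target`, the timeout, and the target list are all placeholders.

```python
import multiprocessing as mp

def scan_target(url: str) -> None:
    """Hypothetical per-target work: fetch, parse, store results."""
    ...

def scan_all(targets: list[str], per_target_timeout: float = 30.0) -> None:
    """Run each target in its own process, so a crash (OOM from a zip
    bomb, a segfaulting parser, ...) only kills that worker and the
    scan simply restarts with the next target."""
    for url in targets:
        worker = mp.Process(target=scan_target, args=(url,))
        worker.start()
        worker.join(per_target_timeout)
        if worker.is_alive():
            worker.terminate()  # hung or runaway worker: kill it and move on
            worker.join()

if __name__ == "__main__":
    scan_all(["https://example.com"])  # placeholder target list
```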
catloaf@lemm.ee 1 day ago
Competent ones, yes. Most developers aren’t competent, scraper writers even less so.
idriss@lemm.ee 1 day ago
That’s true. Scraping is a gold mine for people who don’t know what they’re doing. I worked for a place that crawls the internet and beyond (it also fetches some internal dumps we pay for). There is no chance a zip bomb would crash the workers, as there are strict timeouts and smell tests (even if one slips through, it will crash an ECS task at worst and we will be alerted to fix it within a short time). We were as honest as it gets though: following GDPR, honoring the robots file, no spiders or scanners allowed, only fetching the home page to extract some insights.
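For illustration only, here is a minimal Python sketch of the kind of guardrails described (a strict timeout plus a size "smell test" that abandons oversized responses such as zip bombs); it assumes the `requests` library, and the limits and return convention are placeholders rather than the actual setup.

```python
import requests

MAX_BYTES = 10 * 1024 * 1024  # hypothetical cap on decompressed payload size
TIMEOUT = (5, 15)             # hypothetical connect / read timeouts in seconds

def fetch_homepage(url: str) -> bytes | None:
    """Fetch a page with strict limits so oversized or slow responses
    are abandoned instead of crashing the worker."""
    try:
        with requests.get(url, timeout=TIMEOUT, stream=True) as resp:
            resp.raise_for_status()
            body = bytearray()
            # iter_content yields decompressed data, so the cap applies to
            # the real payload size, not the compressed transfer size
            for chunk in resp.iter_content(chunk_size=65536):
                body.extend(chunk)
                if len(body) > MAX_BYTES:
                    return None  # smell test failed: too large, skip target
            return bytes(body)
    except requests.RequestException:
        return None              # timeout, TLS error, bad status: skip target
```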
I am aware of some big-name EU non-software companies that are very interested in keeping an eye on a few key things, which is only possible with scraping.