12,000 visits, with 181 of those to the robots.txt file, makes way, way more sense, albeit still an abusive number of visits.
I couldn’t wrap my head around how large the original number was, or what it would actually take to rack up that many visits in 25 days. Turns out it would be roughly a quinquinquagintillion visits per nanosecond. Call it a hunch, but I suspect my server might not handle that.
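If you want to sanity-check a figure like that yourself, the math is just the total visit count divided by the number of nanoseconds in 25 days. Here’s a minimal sketch in Python; the total below is a made-up placeholder for illustration, not the actual figure from my logs:

```python
# Back-of-the-envelope: turn a total visit count over 25 days
# into a visits-per-nanosecond rate.

DAYS = 25
NANOSECONDS = DAYS * 24 * 60 * 60 * 1_000_000_000  # ~2.16e15 ns in 25 days

# Hypothetical, absurdly large total used purely for illustration.
total_visits = 10 ** 183

visits_per_ns = total_visits / NANOSECONDS
print(f"{visits_per_ns:.2e} visits per nanosecond")
```

For scale, a quinquinquagintillion is 10^168 (short scale), so any total in that neighborhood works out to a rate no server on Earth is going to survive.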