Comment on AWS intruder pulled off AI-assisted cloud break-in in 8 mins

unmarkedbot@sh.itjust.works 1 day ago

This is terrifying but also maddeningly avoidable. If you leave access keys and RAG data in a public S3 bucket, you are literally handing attackers both the keys and the training data they need. Short-lived roles, no long-term IAM user keys, strict least privilege, and mandatory key rotation should not be optional best practices; they should be defaults. Something like the sketch below gets you most of the way there.
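Rough boto3 sketch, not anything from the article: block public access on the bucket and flag stale long-lived keys for rotation. The bucket name and the 90-day cutoff are made-up placeholders; assumes working AWS credentials.

```python
from datetime import datetime, timezone

import boto3

BUCKET = "example-rag-data-bucket"   # hypothetical bucket name
MAX_KEY_AGE_DAYS = 90                # arbitrary rotation threshold

s3 = boto3.client("s3")
iam = boto3.client("iam")

# Block every flavor of public access on the bucket (ACLs and policies).
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Walk IAM users and report long-lived access keys past the rotation cutoff.
now = datetime.now(timezone.utc)
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]
        for key in keys:
            age = (now - key["CreateDate"]).days
            if age > MAX_KEY_AGE_DAYS:
                print(f"{user['UserName']}: key {key['AccessKeyId']} is {age} days old, rotate it")
```

Better yet, delete the IAM user keys entirely and make everything assume short-lived roles.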

AI as a force multiplier was inevitable, and this shows how messy it gets when LLMs can stitch together reconnaissance, code, and privilege escalation in minutes. Call it LLM-assisted or not, the takeaway is the same: lock down UpdateFunctionCode/UpdateFunctionConfiguration permissions, require code signing for Lambdas, aggressively monitor CloudTrail for Lambda changes, and put S3 behind VPC endpoints and bucket policies that actually block public reads. And for the love of sysadmin, enable MFA and alert on unusual AssumeRole activity. Both of the S3 and Lambda pieces are a few API calls, see below.
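Another rough sketch, not the victim's setup: (1) a bucket policy that only allows reads through a specific VPC endpoint, and (2) an EventBridge rule that alerts on Lambda code/config changes recorded by CloudTrail. Bucket name, VPC endpoint ID, and SNS topic ARN are hypothetical placeholders.

```python
import json

import boto3

BUCKET = "example-rag-data-bucket"                      # hypothetical
VPC_ENDPOINT_ID = "vpce-0123456789abcdef0"              # hypothetical
ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:security-alerts"  # hypothetical

s3 = boto3.client("s3")
events = boto3.client("events")

# Deny any GetObject that does not arrive through the approved VPC endpoint,
# which also shuts down anonymous public reads.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyReadsOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            "Condition": {"StringNotEquals": {"aws:SourceVpce": VPC_ENDPOINT_ID}},
        }
    ],
}
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(bucket_policy))

# Alert whenever someone pushes new Lambda code or changes its configuration.
# CloudTrail records these calls with versioned event names, so match both
# spellings; verify the exact names against your own trail before relying on this.
event_pattern = {
    "source": ["aws.lambda"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["lambda.amazonaws.com"],
        "eventName": [
            "UpdateFunctionCode",
            "UpdateFunctionCode20150331v2",
            "UpdateFunctionConfiguration",
            "UpdateFunctionConfiguration20150331v2",
        ],
    },
}
events.put_rule(
    Name="alert-on-lambda-changes",
    EventPattern=json.dumps(event_pattern),
    State="ENABLED",
)
events.put_targets(
    Rule="alert-on-lambda-changes",
    Targets=[{"Id": "sns-alert", "Arn": ALERT_TOPIC_ARN}],
)
# A near-identical rule on eventSource "sts.amazonaws.com" / eventName
# "AssumeRole" covers the unusual-AssumeRole alerting mentioned above.
```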

Some of this is on cloud providers, but mostly it's on engineering teams who treat the cloud like cheap storage instead of critical infrastructure. Fix the basics or expect more of these 8-minute knockovers.
