
AWS intruder pulled off AI-assisted cloud break-in in 8 mins

92 likes

Submitted 1 day ago by TheBat@lemmy.world to technology@lemmy.world

https://www.theregister.com/2026/02/04/aws_cloud_breakin_ai_assist/


Comments

  • just_another_person@lemmy.world 1 day ago

    This is just poor security. Not like in TV/Movies where an “AI” was found “breaking layers of firewalls and encryption” or whatever 🤣

    Somebody fucked up. Plain and simple.

  • CallMeAnAI@lemmy.world 1 day ago

    Y’all obviously lead with AI, and you’re bad at propaganda.

    >The attackers initially gained access by stealing valid test credentials from public Amazon S3 buckets. The credentials belonged to an identity and access management (IAM) user with multiple read and write permissions on AWS Lambda and restricted permissions on AWS Bedrock

    Run your shit against tenable once in a while.

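For anyone wondering what a scanner like Tenable actually catches here, a minimal Python sketch of the core rule applied to S3: flag any bucket whose Public Access Block is not fully enabled. The bucket names and config data below are made up for illustration; a real scan would fetch the config via boto3's `get_public_access_block`.

```python
# The four S3 Public Access Block flags a scanner expects to be on.
REQUIRED_FLAGS = (
    "BlockPublicAcls",
    "IgnorePublicAcls",
    "BlockPublicPolicy",
    "RestrictPublicBuckets",
)

def is_locked_down(config: dict) -> bool:
    """True only if every public-access flag is switched on."""
    return all(config.get(flag, False) for flag in REQUIRED_FLAGS)

# Simulated scan results for two hypothetical buckets.
buckets = {
    "prod-rag-data": {flag: True for flag in REQUIRED_FLAGS},
    "test-credentials-dump": {"BlockPublicAcls": True},  # the rest default off
}

# Any bucket failing the check is what the scanner would alert on.
exposed = [name for name, cfg in buckets.items() if not is_locked_down(cfg)]
print(exposed)
```

Note that a check like this needs no AI at all, which is rather the point.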
    • REDACTED@infosec.pub 22 hours ago

      The point of the article is to show that, with the help of AI, attacks can be executed faster, which means a higher chance of getting more data or doing more damage, since you’re essentially racing against time.

      How long would all this have taken without automation?

      • CallMeAnAI@lemmy.world 22 hours ago

        According to y’all, AI is simultaneously garbage and propping up a bubble, so my sarcastic answer is that it made the attack slower.

        That being said, I know I could scan with Nessus/Snyk/Security Hub and detect the issue inside of 5 minutes. Probably another half hour to an hour for a proper pen tester to send an AWS exploit package at it and own the rest within an hour or two.

        How many people do you think catch exploits in the first day, or even the first week or month, of a hack? I’ve got some news for you: it’s only the companies who really need their shit together and have a strong opsec team. Attackers ain’t going around deleting buckets. They sit on access for months or years in most post mortems.

  • unmarkedbot@sh.itjust.works 1 day ago

    This is terrifying but also maddeningly avoidable. If you leave access keys and RAG data in a public S3 bucket, you are literally handing attackers the keys and the training data they need. Short-lived roles, no long-term IAM user keys, strict least privilege, and mandatory key rotation should not be optional best practices; they should be defaults.

    AI as a force multiplier was inevitable, and this shows how messy it gets when LLMs can stitch together reconnaissance, code, and privilege escalation in minutes. Call it LLM-assisted or not, the takeaway is the same: lock down UpdateFunctionCode/UpdateFunctionConfiguration permissions, require code signing for Lambdas, monitor CloudTrail and Lambda changes aggressively, and put S3 behind VPC endpoints and bucket policies that actually block public read. And for the love of sysadmin, enable MFA and alerting on unusual AssumeRole activity.

    This is partly on cloud providers, but mostly it’s on engineering teams who treat the cloud like cheap storage instead of critical infrastructure. Fix the basics or expect more of these 8-minute knockovers.
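As a concrete sketch of two of those lockdowns expressed as policy documents, assembled in Python so the structure is checkable. The bucket name, VPC endpoint ID, and CI role name are placeholders for illustration, not details from the article.

```python
import json

BUCKET = "example-rag-data"  # hypothetical bucket name

# S3 bucket policy: deny plaintext transport, and deny object reads
# from anywhere but the organisation's VPC endpoint.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
        {
            "Sid": "DenyOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            "Condition": {
                "StringNotEquals": {"aws:SourceVpce": "vpce-0123456789abcdef0"}
            },
        },
    ],
}

# IAM policy: block the Lambda write actions the attackers abused,
# for every principal except a designated CI deploy role.
lambda_lockdown = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyLambdaCodeChanges",
            "Effect": "Deny",
            "Action": [
                "lambda:UpdateFunctionCode",
                "lambda:UpdateFunctionConfiguration",
            ],
            "Resource": "*",
            "Condition": {
                "StringNotLike": {
                    "aws:PrincipalArn": "arn:aws:iam::*:role/ci-deploy"
                }
            },
        }
    ],
}

print(json.dumps(lambda_lockdown, indent=2))
```

An explicit Deny like this wins over any Allow the compromised test user carried, which is exactly what would have stopped the Lambda rewrite step.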
