Microsoft leaks 38TB of private data via unsecured Azure storage::The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
Because of course they did.
pavnilschanda@lemmy.world 1 year ago
This will definitely make customers less trustful of Microsoft when dealing with their privacy-focused AI projects. Here’s hoping that open-source LLMs become more advanced and optimized.
Tatters@feddit.uk 1 year ago
I am not sure. This was mostly a case of human error in not properly securing URLs/storage accounts. The lack of centralised control of SAS tokens that the article highlights was a contributing factor, but the root cause was human error.
If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
NeoNachtwaechter@lemmy.world 1 year ago
In a private environment, one person’s mistake can happen, period.
A corporate environment absolutely needs robust procedures to shield all of its clients from the huge impact of one person’s mistake.
But that’s a looong tradition at M$ - not having them, I mean.
whitecapstromgard@sh.itjust.works 1 year ago
Azure has a huge problem with SAS tokens. The mechanism is so bad that it invites situations like this.
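For context on why SAS tokens are so easy to mishandle: the entire credential lives in the URL’s query string (`sp` = permissions, `se` = expiry, `sig` = signature), so anyone who sees the link has access until it expires. A minimal stdlib sketch that inspects those fields - the URL and its values below are made-up illustrations, not the actual leaked token:

```python
from urllib.parse import urlparse, parse_qs
from datetime import datetime, timezone

def inspect_sas_url(url: str) -> dict:
    """Pull the security-relevant fields out of an Azure SAS URL.
    SAS query parameters: sp = permissions, se = expiry time."""
    qs = parse_qs(urlparse(url).query)
    perms = qs.get("sp", [""])[0]
    expiry = qs.get("se", [""])[0]
    return {
        "permissions": perms,
        "expiry": expiry,
        # w/d/a/c permission letters mean write/delete/add/create
        "writable": any(c in perms for c in "wdac"),
        "expired": (datetime.fromisoformat(expiry.replace("Z", "+00:00"))
                    < datetime.now(timezone.utc)) if expiry else None,
    }

# Hypothetical example: full read/write permissions, far-future expiry
url = ("https://example.blob.core.windows.net/container/file"
       "?sv=2021-08-06&sp=racwdl&se=2051-10-05T00:00:00Z&sig=abc123")
info = inspect_sas_url(url)
print(info["permissions"], info["writable"], info["expired"])
```

A token like the hypothetical one above is effectively a long-lived write-capable password embedded in a shareable link, which is exactly the failure mode the thread is discussing.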
Sethayy@sh.itjust.works 1 year ago
A better analogy is if you live in an apartment and the landlord doesn’t replace the front door locks when they break.
hemko@lemmy.dbzer0.com 1 year ago
The root cause is whatever allowed the human error to happen.