Microsoft AI Researchers Accidentally Exposed Terabytes of Internal Sensitive Data (techcrunch.com)
Microsoft AI researchers accidentally exposed tens of terabytes of sensitive data, including private keys and passwords, while publishing a storage bucket of open source training data on GitHub. From a report: In research shared with TechCrunch, cloud security startup Wiz said it discovered a GitHub repository belonging to Microsoft's AI research division as part of its ongoing work into the accidental exposure of cloud-hosted data. Readers of the GitHub repository, which provided open source code and AI models for image recognition, were instructed to download the models from an Azure Storage URL. However, Wiz found that this URL was configured to grant permissions on the entire storage account, exposing additional private data by mistake. This data included 38 terabytes of sensitive information, including backups of two Microsoft employees' personal computers. The data also contained other sensitive personal data, including passwords to Microsoft services, secret keys and more than 30,000 internal Microsoft Teams messages from hundreds of Microsoft employees.
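For context on the misconfiguration: access to Azure Storage via a URL of this kind is typically granted through a shared access signature (SAS) token embedded in the URL, and a SAS can be scoped to a whole account or to a single container. Below is a minimal sketch using the azure-storage-blob Python SDK that contrasts the two; the account name, key, and container name are placeholders for illustration, not values from the incident.

    # Sketch only: contrasts an account-wide SAS with a least-privilege,
    # container-scoped one. Account name, key and container name are
    # placeholders, not values from the incident.
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import (
        AccountSasPermissions,
        ContainerSasPermissions,
        ResourceTypes,
        generate_account_sas,
        generate_container_sas,
    )

    ACCOUNT = "exampleaccount"      # placeholder storage account
    KEY = "<account-key>"           # placeholder account key
    EXPIRY = datetime.now(timezone.utc) + timedelta(days=7)

    # Over-broad: one token opens every container in the account,
    # with write access on top of read. This is roughly the failure
    # mode described above.
    too_broad = generate_account_sas(
        account_name=ACCOUNT,
        account_key=KEY,
        resource_types=ResourceTypes(service=True, container=True, object=True),
        permission=AccountSasPermissions(read=True, write=True, list=True),
        expiry=EXPIRY,
    )

    # Least privilege: read/list on the one container holding the
    # models that were meant to be public.
    scoped = generate_container_sas(
        account_name=ACCOUNT,
        container_name="public-models",
        account_key=KEY,
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=EXPIRY,
    )
    print(f"https://{ACCOUNT}.blob.core.windows.net/public-models?{scoped}")

Both tokens produce working download URLs; the difference is only in what else the token unlocks, which is why an over-scoped SAS is easy to ship by accident.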
Good to see Microsoft stays true to its old self (Score:4, Insightful)
Microsoft has been trying for the past 48 years, but they've just never been able to make anything bug-free or secure.
It was less concerning when they weren't in the business of hosting other people's data, especially now that a lot of that data is the US government's and the US military's...
Re: (Score:3)
I keep saying, they've made all of this stuff way too complicated. I've been in I.T. for 30-some years now, and I've watched the whole thing evolve from 8-bit home computers connected only by a 300-baud modem to what we've got today. Data security is just too prone to people making mistakes, overlooking something, or getting confused trying to implement it.
I don't even fault Microsoft here any more than I fault the rest of the industry.
I mean, as just one example? I run my own TrueNAS server at home (for
Re: (Score:2)
Oh, yes. It is like nobody at Microsoft has ever heard of KISS, or of looking at what works in other systems, like the various Unix and Unix-like systems. They think they are too good for that. Instead they reinvent the wheel, badly. And they do it over and over and over again.
I do agree that, to a somewhat lesser extent, this is an industry-wide problem. KISS is the very basis of all reliable and secure engineering, and it gets ignored left and right in the IT space, _despite_ about half a century of evidence of how bad
Re: (Score:3)
Re: (Score:2)
Microsoft has been trying for the past 48 years, but they've just never been able to make anything bug-free or secure.
That's a good cheap shot and first post at Microsoft's expense,
It is also pretty much the truth. MS has _always_ done bad engineering. They were always second-rate and they continue to be. And they fail at anything that is a bit more complicated (see WinFS, which has failed several times) or deliver overly complex, insecure, unreliable and hard-to-use "solutions".
Re: (Score:2)
No kidding
Microsoft AI researchers accidentally exposed tens of terabytes of sensitive data, including private keys and passwords, while publishing a storage bucket of open source training data on GitHub.
I am so happy M/S enabled 2FA; wasn't that supposed to prevent things like this? /s
Re:Good to see Microsoft stays true to its old sel (Score:4, Funny)
Microsoft has been trying for the past 48 years, but they've just never been able to make anything bug-free
The only product Microsoft could make bug-free is an ant farm.
Re: (Score:2)
Microsoft has been trying for the past 48 years, but they've just never been able to make anything bug-free
The only product Microsoft could make bug-free is an ant farm.
They would probably mess that up as well.
Anthill Inside (Score:2)
Re: (Score:2)
Indeed. The security and reliability problems with Microsoft are systemic and long-standing. They have never done well, but today there is a lot more attacker pressure and a lot more depends on complex computer networks working. I think the only solution is to dump Microsoft altogether; they just do not have what it takes. The later it is done, the more painful it will be. And it will be late, because too many people have unwisely allowed themselves to get captured and have "standardized" on MS crap.
Preferred (Score:1)
It's better that this is exposed by Microsoft themselves than stolen later thanks to their world-class security. I think.
They had private keys in GitHub? (Score:3)
Really?
how to stop this (Score:2)
Re: (Score:2)
Does not work for this case. What you are looking for is called "Data Leakage Prevention" (DLP). Such systems do require careful data classification and labelling, and we all know Microsoft cannot do that because they are too incompetent.
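For the flavor of check being alluded to, here is a minimal, illustrative sketch in Python of the crudest possible pre-publication secret scan over a directory tree. The patterns are my simplified additions; dedicated scanners such as gitleaks or truffleHog, and real DLP products, do vastly more than this.

    # Illustrative sketch only: a crude pre-publication scan for
    # obvious secret material. Patterns are simplified; dedicated
    # scanners such as gitleaks or truffleHog are far more thorough.
    import re
    import sys
    from pathlib import Path

    PATTERNS = {
        "private key": re.compile(
            r"-----BEGIN (?:RSA |EC |OPENSSH )?PRIVATE KEY-----"),
        "azure storage key": re.compile(r"AccountKey=[A-Za-z0-9+/=]{80,}"),
        "hardcoded password": re.compile(r"(?i)\bpassword\s*[:=]\s*\S+"),
    }

    def scan(root: Path) -> int:
        """Print suspected secrets found under root; return hit count."""
        hits = 0
        for path in (p for p in root.rglob("*") if p.is_file()):
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue  # unreadable file, skip it
            for label, pattern in PATTERNS.items():
                if pattern.search(text):
                    print(f"{path}: possible {label}")
                    hits += 1
        return hits

    if __name__ == "__main__":
        # Exit non-zero when anything suspicious is found, so the scan
        # can gate an upload or publish step in CI.
        sys.exit(1 if scan(Path(sys.argv[1])) else 0)

Even this naive version would flag a PEM private key or a plaintext password file before it rode along with published training data; the hard part DLP adds on top is the classification and labelling the parent comment mentions.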
Train AI to Detect Leaked Data :)))) (Score:2)
That is why they try to Train an AI to Detect Leaked Data :))))
Re: (Score:2)
So
1. You leak data
2. You train the AI on the public response to it
3. You at least find out yourself when you leaked data again!
Pretty dumb, pretty incompetent, overall crappy, so _perfect_ for Microsoft!
These idiots never learn (Score:2)
You would think that having what was effectively a cloud master key for Azure stolen from an old crash dump that was neither sanitized nor secured would make them a tiny bit more careful. But no, they messed up sanitization again. Microsoft is just broken, and the longer the world waits to move away from them, the more painful it will be. It will be necessary; these people are incapable of doing better, and what they can do does not cut it. At all.