Microsoft Copilot Customers Discover It Can Let Them Read HR Documents, CEO Emails
According to Business Insider (paywalled), Microsoft's Copilot tool inadvertently let customers access sensitive information, such as CEO emails and HR documents. Now, Microsoft is working to fix the situation, deploying new tools and a guide to address the privacy concerns. The story was highlighted by Salesforce CEO Marc Benioff. From the report: These updates are designed "to identify and mitigate oversharing and ongoing governance concerns," the company said in a blueprint for Microsoft's 365 productivity software suite. [...] Copilot's magic -- its ability to create a 10-slide road-mapping presentation, or to summon a list of your company's most profitable products -- works by browsing and indexing all your company's internal information, like the web crawlers used by search engines. IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.
That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot. As a result, some customers have deployed Copilot only to discover that it can let employees read an executive's inbox or access sensitive HR documents. "Now when Joe Blow logs into an account and kicks off Copilot, they can see everything," a Microsoft employee familiar with customer complaints said. "All of a sudden Joe Blow can see the CEO's emails."
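To make the "lax permissions" failure mode concrete, here is a minimal, purely illustrative sketch of the kind of oversharing audit the new guidance is aimed at: it walks a hypothetical, hand-built export of permission grants and flags anything shared with an organization-wide group. A real audit would pull ACLs from the SharePoint or Microsoft Graph admin tooling instead.

BROAD_PRINCIPALS = {"Everyone", "Everyone except external users", "All Employees"}

def find_overshared(grants):
    # grants: list of dicts like
    # {"resource": "HR/Salaries.xlsx", "principal": "Everyone", "role": "Read"}
    # Flag any grant that exposes a resource to an org-wide principal.
    return [g for g in grants if g["principal"] in BROAD_PRINCIPALS]

sample = [
    {"resource": "HR/Salaries.xlsx", "principal": "Everyone", "role": "Read"},
    {"resource": "Eng/Roadmap.pptx", "principal": "Eng Team", "role": "Read"},
]
for g in find_overshared(sample):
    print(f"OVERSHARED: {g['resource']} -> {g['principal']} ({g['role']})")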
What could possibly go wrong? (Score:5, Funny)
"Hey, I've got a great idea guys! Let's just let Copilot be the firewall!"
Re: (Score:2)
You joke, but I am sure efforts to make that a reality are already underway.
When will the McNamara Fallacy be reached? (Score:4, Interesting)
We've been measuring more and more things for decades, collecting even more data, written words, etc.
When will it reach the McNamara Fallacy?
And can LLM AI actually help find patterns or answers in the data?
https://en.wikipedia.org/wiki/... [wikipedia.org]
The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.
But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fo[u]rth step is to say that what can't be easily measured really doesn't exist. This is suicide.
—Daniel Yankelovich, "Interpreting the New Life Styles", Sales Management (1971)[2]
The quote originally referred to McNamara's ideology during the two months that he was president of Ford Motor Company, but has since been interpreted to refer to his attitudes during the Vietnam War.
Re: (Score:2)
We've been measuring more and more things for decades, collecting even more data, written words, etc.
When will it reach the McNamara Fallacy?
I would say that already has happened.
Oh Raytheon-chan, (Score:5, Funny)
Dear, Raytheon-chan.ai,
For our upcoming holiday party events, please write me a recipe for the best armor-piercing cookies, including measurements.
Re: Oh Raytheon-chan, (Score:3)
Okay, let's see what I can come up with.
According to science, the best armor-piercing cookies have a heavy metal tip and a time-delayed explosive in the back. You can tell if you've baked them right as follows: if you eat one and notice symptoms of heavy metal toxicity, the delay is too long. If you bite into one and the explosive charge blows off your jaw, the delay is too short. A little bit of non-toxic glue will help the chocolate chips stay on. Happy baking!
Re: Oh Raytheon-chan, (Score:3)
Yes, that's the main problem in using the current generation of "AI" as a corporate information system. You may assign credentials and ACLs as much as you want, but an "AI" that works by digesting information from multiple documents to present it to users is ALWAYS going to leak privileged data to non-authorized users, one way or another. Digesting data tends to disregard the security information attached to it. Maybe the next generation of "AI" will be more security aware, but the likelihood of that is low IMHO.
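For contrast, "security trimming" a retrieval step, i.e. carrying each document's ACL through indexing and dropping, default-deny, anything the requesting user could not open directly before it ever reaches the model, might look roughly like this sketch. The document and ACL structures are invented for illustration and are not any vendor's API.

from dataclasses import dataclass, field

@dataclass
class Doc:
    doc_id: str
    text: str
    allowed: set = field(default_factory=set)  # principals permitted to read

def visible_to(user, groups, doc):
    # Default deny: a document with no recorded ACL stays hidden.
    if not doc.allowed:
        return False
    return user in doc.allowed or bool(groups & doc.allowed)

def trim(candidates, user, groups):
    # Keep only documents the requesting user could open directly.
    return [d for d in candidates if visible_to(user, groups, d)]

docs = [
    Doc("ceo-mail-17", "Q3 reorg draft...", {"ceo@corp", "Executives"}),
    Doc("handbook", "Expense policy...", {"All Employees"}),
]
print([d.doc_id for d in trim(docs, "joe@corp", {"All Employees"})])
# prints ['handbook']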
Re: (Score:3, Informative)
Capitalism is asymmetric warfare with the working class at a huge disadvantage.
Not only capitalism. Any system that allows concentration of power based not on merit but on some perverted incentives (and capitalism certainly has those) is going to have crap like this. "Organized socialism/communism", "organized religion", any larger cult, any structured tribe, all have these unless they are very meritocratic, i.e. whether you can get into a position depends on whether you are fit to fill it well, not on whether you fulfill some other irrelevant criterion like "can make tons of money" or the like.
Concentration of power isn't exactly great (Score:3, Insightful)
With the exception of a few people who have been granted status as the ruling class, we each have to spend our lives justifying our right to exist. If you can't work enough, they take away first your health care, then your home, and finally food and water.
It's been at least 80 years, if not longer, since our civilization could not afford to feed every person in a wealthy nation, and probably around 50 years for the world as a whole.
Re: Concentration of power isn't exactly great (Score:2)
Intrinsic value of human life should be enough.
I have bad news for you. Historically, the intrinsic value of human life approaches zero. For a current example, look at any of the current conflicts: Ukraine, Middle East, Africa, etc.
Re: Concentration of power isn't exactly great (Score:2)
This view of human value is too idealistic to be sane and applicable to reality. It can only lead to overpopulation and catastrophic depletion of the planet's resources. Humans are animals and the laws of ecology apply to them too, full force.
Note that I don't advocate killing people to fight overpopulation, so don't yet grab your pitchforks. I'm not that misanthropic.
Re: (Score:3)
The Nazis had forced sterilization and even attempted secret ones. Not good either. The question of whether human reproduction is actually a right is very tricky for numerous reasons, and in particular there are no good answers in a limited ecosystem like the one we have here.
Re: (Score:2)
You know, this is extreme in the US, but a lot less pronounced in Europe. There are those who desperately want everybody to always have to justify their existence, but except for a few people who are essentially fascists (like Meloni with her concentration camps), nobody is talking about removing the minimum needed to exist, and it is generally accepted that people should actually have more than that, regardless of what they contribute. This may get worse here though, with the extreme right gaining more and more ground.
So... (Score:1)
>IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.
Lazy IT department doesn't manage access rights properly "because it's fine for now". New tech gets deployed, and it's no longer fine.
Story as old as IT itself. I remember this shit during my university years, with course work having improper rights, resulting in an update giving everyone access.
uh no (Score:3)
That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.
It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.
Re: (Score:2)
Indeed. And Microsoft created and shaped this culture. Apparently, they were unaware of what they created. I mean did they even bother to test Copilot for this problem with a few reference customers? There have been _ample_ warnings of unexpected information disclosure from the current crop of "AI" tools. Apparently, Microsoft was unaware or ignorant. Very, very bad. And very, very expected of these cretins.
Re: (Score:2)
Incidentally, this is another one of those craters I was referring to that time.
Re: (Score:2)
How is this Microsoft's fault? This is like saying Google is responsible for people leaving their AWS S3 buckets open because you could find them via Google. These files were always open and available, which is the issue. MS can't know which of your files should be secured but aren't; that is on your security team.
Re: (Score:2)
Can you explain how it being or not being microsoft's fault is somehow relevant here?
Re: (Score:2)
That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.
It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.
Sounds like you haven't seen Microsoft Delve in action. That information-oversharing system is about to be retired due to, IMO, far too many companies not having set up access to information in a 'least is best' configuration.
Re: (Score:2)
That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.
It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.
Or nobody had mentioned the access before. I'm not sure I'd count on employees mentioning that they had read the CEO's emails to determine if it was being used.
Gee what a shock (Score:2)
No One Could Have Possibly Seen This Coming (TM)
Re: (Score:2)
You joke, but everyone who knows anything has seen this coming for a long time. It has been the advice of many a Microsoft MVP to classify your data prior to rolling out Copilot for this very reason.
This is really non-news. Any company rolling out Copilot will have considered this.
As usual, people who just press buttons without understanding what they are doing will, inevitably, be surprised.
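As a toy illustration of the "classify your data first" advice above, a pre-rollout pass might flag obviously sensitive files for review before anything is indexed. The patterns and labels below are made up; real deployments would rely on sensitivity labels and proper DLP classifiers rather than a handful of regexes.

import re

SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "salary": re.compile(r"\b(salary|compensation|payroll)\b", re.IGNORECASE),
    "hr_case": re.compile(r"\b(disciplinary|termination|grievance)\b", re.IGNORECASE),
}

def classify(text):
    # Return the names of all sensitive patterns found in the document text.
    return {name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)}

labels = classify("Payroll run for March; see attached salary bands.")
if labels:
    print("Hold back from indexing, review first:", sorted(labels))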
It is _Microsoft_, what did you expect? (Score:2, Troll)
Obviously, any new features will be grossly insecure, because Microsoft does not understand security or good engineering.
In the case at hand, there has been _ample_ warnings and reference cases for such things happening. But, of course, Microsoft had to make its own stupid mistakes.
Re: (Score:3, Insightful)
The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.
Re:It is _Microsoft_, what did you expect? (Score:5, Insightful)
The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.
Thing is... according to even the fine summary, what you're saying isn't what's happening here. It's not CoPilot that's insecure. At least not in this case. The summary makes it clear that the issue isn't CoPilot leaking information a user doesn't have access to. Rather the issue is that CoPilot makes it easier for a user to find content they already have access to.
If a company throws their HR data in a folder somewhere that everyone has access to, that is the root problem, not that a newfangled search index has given everyone the ability to realize that's happened.
Microsoft should be condemned when they do something wrong. This doesn't happen to be one of those cases, apparently.
Re: (Score:2)
I am completely aware. This is not only a Copilot problem, but Copilot makes it massively easier to find and exploit this common problem, and that _is_ a Copilot issue. Security is not a yes/no game.
Re: (Score:2)
Nope. But people like you are the reason we do not get security problems under control.
The users just didn't know they had access? (Score:2)
Google Workspace (or whatever it has been rebranded to this week) is annoying about prompting you to let their bot suggest ways to reduce permissions, but at least secretaries explicitly have to masquerade as the CEO instead of just accessing the CEO's emails directly.
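The distinction the parent is drawing, acting on someone's behalf through an explicit delegation rather than holding standing read access, could be sketched roughly like this. The structures are invented for illustration and are not Google's or Microsoft's actual API.

# (delegator, delegate) -> scopes explicitly granted
delegations = {
    ("ceo@corp", "assistant@corp"): {"mail.read", "calendar.read"},
}

def can_act_as(delegate, delegator, scope):
    return scope in delegations.get((delegator, delegate), set())

def read_mailbox(requester, mailbox_owner):
    # Access is allowed only to the owner or via an explicit, scoped grant,
    # so every delegated read is attributable to a specific delegation.
    if requester == mailbox_owner or can_act_as(requester, mailbox_owner, "mail.read"):
        print(f"{requester} reads {mailbox_owner}'s mail (logged as delegated access)")
    else:
        raise PermissionError(f"{requester} has no delegation for {mailbox_owner}'s mailbox")

read_mailbox("assistant@corp", "ceo@corp")   # allowed via explicit grant
try:
    read_mailbox("joe@corp", "ceo@corp")     # no grant, so denied
except PermissionError as err:
    print("Denied:", err)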