Microsoft Copilot Customers Discover It Can Let Them Read HR Documents, CEO Emails

According to Business Insider (paywalled), Microsoft's Copilot tool inadvertently let customers access sensitive information, such as CEO emails and HR documents. Now, Microsoft is working to fix the situation, deploying new tools and a guide to address the privacy concerns. The story was highlighted by Salesforce CEO Marc Benioff. From the report: These updates are designed "to identify and mitigate oversharing and ongoing governance concerns," the company said in a blueprint for Microsoft's 365 productivity software suite. [...] Copilot's magic -- its ability to create a 10-slide road-mapping presentation, or to summon a list of your company's most profitable products -- works by browsing and indexing all your company's internal information, like the web crawlers used by search engines. IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot. As a result, some customers have deployed Copilot only to discover that it can let employees read an executive's inbox or access sensitive HR documents. "Now when Joe Blow logs into an account and kicks off Copilot, they can see everything," a Microsoft employee familiar with customer complaints said. "All of a sudden Joe Blow can see the CEO's emails."
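
To make the mechanism concrete, here is a minimal sketch in Python (hypothetical names throughout, not Microsoft's actual implementation) of why an assistant that faithfully honors per-document permissions still surfaces everything when the source permission is "allow all":

    from dataclasses import dataclass

    @dataclass
    class Document:
        title: str
        body: str
        allowed_users: set[str] | None = None  # None models an "allow all" ACL

    class SearchIndex:
        """Toy index that enforces each document's ACL at query time."""

        def __init__(self) -> None:
            self._docs: list[Document] = []

        def add(self, doc: Document) -> None:
            self._docs.append(doc)

        def search(self, user: str, query: str) -> list[str]:
            return [
                d.title for d in self._docs
                if query.lower() in d.body.lower()
                and (d.allowed_users is None or user in d.allowed_users)
            ]

    index = SearchIndex()
    index.add(Document("Salary bands", "2024 salary bands by level ..."))  # misconfigured: allow all
    index.add(Document("Q3 roadmap", "internal product roadmap ...",
                       allowed_users={"alice", "bob"}))

    # The index does nothing wrong -- but a tool that makes search trivial
    # hands the HR document to any employee who asks:
    print(index.search("joe_blow", "salary"))   # ['Salary bands']

The toy index enforces the ACL correctly; the leak comes entirely from the "allow all" ACL itself, which is the distinction much of the comment thread below argues over.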


Comments:
  • by Narcocide ( 102829 ) on Thursday November 21, 2024 @08:29PM (#64963825) Homepage

    "Hey, I've got a great idea guys! Let's just let Copilot be the firewall!"

    • by gweihir ( 88907 )

      You joke, but I am sure efforts to make that a reality are already underway.

      • by jhoegl ( 638955 )
        Corporate sponsored end run around your rights. Enjoy!
        • We've been measuring more and more things for decades, collecting even more data, written words, etc.

          When will it reach the McNamara Fallacy?

          And can LLM AI actually help find patterns or answers in the data?

          https://en.wikipedia.org/wiki/... [wikipedia.org]

          The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.

  • by Speare ( 84249 ) on Thursday November 21, 2024 @08:32PM (#64963831) Homepage Journal

    Dear Raytheon-chan.ai,

    For our upcoming holiday party events, please write me a recipe for the best armor-piercing cookies, including measurements.

    • Okay, let's see what I can come up with.

      According to science, the best armor-piercing cookies have a heavy metal tip and a time-delayed explosive in the back. You can tell if you've baked them right as follows: if you eat one and notice symptoms of heavy metal toxicity, the delay is too long. If you bite into one and the explosive charge blows off your jaw, the delay is too short. A little bit of non-toxic glue will help the chocolate chips stay on. Happy baking!

    • Yes, that's the main problem in using the current generation of "AI" as a corporate information system. You may assign credentials and ACLs as much as you want, but an "AI" that works by digesting information from multiple documents to present it to users is ALWAYS going to leak privileged data to non-authorized users, one way or another. Digesting data tends to disregard the security information attached to it. Maybe the next generation of "AI" will be more security-aware, but the likelihood of that is low IMHO.
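
      One hedged illustration of that point (all names made up): if documents are digested into a single blob at ingestion time, the ACL no longer travels with the text; checking the ACL at retrieval time, before anything reaches the model, keeps the security information attached to the data:

          def ingest_then_digest(docs):
              # Digestion discards security metadata: once the text is
              # merged, per-document ACLs no longer apply to any of it.
              return " ".join(d["text"] for d in docs)

          def retrieve_with_acl(docs, user, query):
              # Safer pattern: filter by ACL at query time, before any
              # text reaches the model.
              return [d["text"] for d in docs
                      if user in d["acl"] and query in d["text"]]

          docs = [
              {"text": "CEO email: merger plans", "acl": {"ceo", "cfo"}},
              {"text": "lunch menu: tacos", "acl": {"ceo", "cfo", "joe"}},
          ]

          print(ingest_then_digest(docs))                # merger plans leak to anyone reading the digest
          print(retrieve_with_acl(docs, "joe", "menu"))  # ['lunch menu: tacos']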

  • >IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

    Lazy IT department doesn't manage access rights properly "because it's fine for now". New tech gets deployed, and it's no longer fine.

    A story as old as IT itself. I remember this shit during my university years, with course work having improper rights, resulting in an update giving everyone access.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday November 21, 2024 @08:48PM (#64963863) Homepage Journal

    That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

    It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.
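
    As a sketch of what minimum necessary access means in practice (all names hypothetical), a deny-by-default check fails closed when a resource has no explicit grant, so a forgotten folder stays private instead of becoming world-readable:

        GRANTS = {
            "hr/salaries.xlsx": {"hr_team"},
            "eng/roadmap.pptx": {"eng", "product"},
        }

        def can_read(user_groups: set[str], path: str) -> bool:
            allowed = GRANTS.get(path)
            if allowed is None:
                return False           # unlisted resource fails closed
            return bool(user_groups & allowed)

        print(can_read({"eng"}, "eng/roadmap.pptx"))  # True
        print(can_read({"eng"}, "hr/salaries.xlsx"))  # False
        print(can_read({"eng"}, "misc/notes.txt"))    # False: deny by default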

    • by gweihir ( 88907 )

      Indeed. And Microsoft created and shaped this culture, apparently without understanding what they were creating. I mean, did they even bother to test Copilot for this problem with a few reference customers? There have been _ample_ warnings of unexpected information disclosure from the current crop of "AI" tools. Apparently, Microsoft was unaware or ignorant. Very, very bad. And very, very expected of these cretins.

      • Incidentally, this is another one of those craters I was referring to that time.

      • by dirk ( 87083 )

        How is this Microsoft's fault? This is like saying Google is responsible for people leaving their AWS S3 buckets open because you could find them via Google. These files were always open and available, which is the issue. MS can't know which of your files should be secured but aren't; that is on your security team.

    • by jezwel ( 2451108 )

      That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

      It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

      Sounds like you haven't seen Microsoft Delve in action. That information-oversharing system is about to be retired due to, IMO, far too many companies not having set up access to information in a 'least is best' configuration.

  • No One Could Have Possibly Seen This Coming (TM)

  • Obviously, any new features will be grossly insecure, because Microsoft does not understand security or good engineering.

    In the case at hand, there have been _ample_ warnings and reference cases for such things happening. But, of course, Microsoft had to make its own stupid mistakes.

    • by gweihir ( 88907 )

      The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

      • by PsychoSlashDot ( 207849 ) on Thursday November 21, 2024 @10:09PM (#64963973)

        The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

        Thing is... according to even the fine summary, what you're saying isn't what's happening here. It's not Copilot that's insecure. At least not in this case. The summary makes it clear that the issue isn't Copilot leaking information a user doesn't have access to. Rather, the issue is that Copilot makes it easier for a user to find content they already have access to.

        If a company throws their HR data in a folder somewhere that everyone has access to, that is the root problem, not that a newfangled search index has given everyone the ability to realize that's happened.

        Microsoft should be condemned when they do something wrong. This doesn't happen to be one of those cases, apparently.
