
Microsoft Copilot Customers Discover It Can Let Them Read HR Documents, CEO Emails

According to Business Insider (paywalled), Microsoft's Copilot tool inadvertently let customers access sensitive information, such as CEO emails and HR documents. Now, Microsoft is working to fix the situation, deploying new tools and a guide to address the privacy concerns. The story was highlighted by Salesforce CEO Marc Benioff. From the report: These updates are designed "to identify and mitigate oversharing and ongoing governance concerns," the company said in a blueprint for Microsoft's 365 productivity software suite. [...] Copilot's magic -- its ability to create a 10-slide road-mapping presentation, or to summon a list of your company's most profitable products -- works by browsing and indexing all your company's internal information, like the web crawlers used by search engines. IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot. As a result, some customers have deployed Copilot only to discover that it can let employees read an executive's inbox or access sensitive HR documents. "Now when Joe Blow logs into an account and kicks off Copilot, they can see everything," a Microsoft employee familiar with customer complaints said. "All of a sudden Joe Blow can see the CEO's emails."
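To make the mechanism concrete: a minimal sketch, not from the report, of the permission-trimmed search that Copilot-style indexing builds on. It uses Microsoft Graph's real search endpoint, but the token and query terms are hypothetical. The point is that the API behaves correctly: it returns everything the calling user's account is technically allowed to read, so "allow all" grants surface HR files and executive mail by design.

```python
import requests

# Hypothetical delegated token for an ordinary employee; a real app
# would acquire one via an OAuth flow (e.g. MSAL).
ACCESS_TOKEN = "eyJ..."

# Microsoft Graph's search endpoint runs under the caller's token, so
# results are trimmed to what that user can already read -- no more,
# but crucially no less.
resp = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {"queryString": "salary OR termination OR layoffs"},
            }
        ]
    },
    timeout=30,
)
resp.raise_for_status()

for response in resp.json().get("value", []):
    for container in response.get("hitsContainers", []):
        for hit in container.get("hits", []):
            # Every hit is content the token's owner is already permitted
            # to see; lax "allow all" permissions make that set enormous.
            print(hit.get("resource", {}).get("webUrl", hit.get("hitId")))
```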


Comments Filter:
  • by Narcocide ( 102829 ) on Thursday November 21, 2024 @08:29PM (#64963825) Homepage

    "Hey, I've got a great idea guys! Let's just let Copilot be the firewall!"

    • by gweihir ( 88907 )

      You joke, but I am sure efforts to make that a reality are already underway.

      • by jhoegl ( 638955 )
        Corporate sponsored end run around your rights. Enjoy!
        • by will4 ( 7250692 ) on Thursday November 21, 2024 @11:50PM (#64964059)

          We've been measuring more and more things for decades, collecting even more data, written words, etc.

          When will it reach the McNamara Fallacy?

          And can LLM AI actually help find patterns or answers in the data?

          https://en.wikipedia.org/wiki/... [wikipedia.org]

          The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.

                  But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fo[u]rth step is to say that what can't be easily measured really doesn't exist. This is suicide.
                  —Daniel Yankelovich, "Interpreting the New Life Styles", Sales Management (1971)[2]

          The quote originally referred to McNamara's ideology during the two months that he was president of Ford Motor Company, but has since been interpreted to refer to his attitudes during the Vietnam War.

          • by gweihir ( 88907 )

            We've been measuring more and more things for decades, collecting even more data, written words, etc.

            When will it reach the McNamara Fallacy?

            I would say that already has happened.

  • by Speare ( 84249 ) on Thursday November 21, 2024 @08:32PM (#64963831) Homepage Journal

    Dear Raytheon-chan.ai,

    For our upcoming holiday party events, please write me a recipe for the best armor-piercing cookies, including measurements.

    • Okay, let's see what I can come up with.

      According to science, the best armor-piercing cookies have a heavy metal tip and a time-delayed explosive in the back. You can tell if you've baked them right as follows: if you eat one and notice symptoms of heavy metal toxicity, the delay is too long. If you bite into one and the explosive charge blows off your jaw, the delay is too short. A little bit of non-toxic glue will help the chocolate chips stay on. Happy baking!

    • Yes, that's the main problem in using the current generation of "AI" as a corporate information system. You may assign credentials and ACLs as much as you want, but an "AI" that works by digesting information from multiple documents to present it to users is ALWAYS going to leak privileged data to non-authorized users, one way or another. Digesting data tends to disregard the security information attached to it. Maybe the next generation of "AI" will be more security aware, but the likelihood of that is low IMHO.

  • >IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

    Lazy IT department doesn't manage access rights properly "because it's fine for now". New tech gets deployed, and it's no longer fine.

    Story as old as IT itself. I remember this shit during my university years, with coursework having improper rights resulting in an update giving everyone access.
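    To make that failure mode concrete: a minimal audit sketch, assuming a Graph access token and a drive (document library) ID, both placeholders. It walks the library's top level and flags items carrying org-wide or anonymous sharing links, the usual "it's fine for now" grants:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer eyJ..."}  # hypothetical app token
DRIVE_ID = "b!placeholder"                    # hypothetical document library

def children(item_id="root"):
    """Yield direct children of a drive item, following Graph's paging."""
    url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/children"
    while url:
        page = requests.get(url, headers=HEADERS, timeout=30).json()
        yield from page.get("value", [])
        url = page.get("@odata.nextLink")

def broad_scope(item):
    """Return 'organization'/'anonymous' if the item has a link that wide."""
    url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions"
    perms = requests.get(url, headers=HEADERS, timeout=30).json()
    for perm in perms.get("value", []):
        scope = (perm.get("link") or {}).get("scope")
        if scope in ("organization", "anonymous"):
            return scope
    return None

# Top level only, to keep the sketch short; a real audit would recurse
# into folders and also inspect direct grants, not just sharing links.
for item in children():
    scope = broad_scope(item)
    if scope:
        print(f"{item['name']}: shared via {scope}-scoped link")
```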

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday November 21, 2024 @08:48PM (#64963863) Homepage Journal

    That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

    It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

    • by gweihir ( 88907 )

      Indeed. And Microsoft created and shaped this culture. Apparently, they were unaware of what they created. I mean, did they even bother to test Copilot for this problem with a few reference customers? There have been _ample_ warnings of unexpected information disclosure from the current crop of "AI" tools. Apparently, Microsoft was unaware or ignorant. Very, very bad. And very, very expected of these cretins.

      • Incidentally, this is another one of those craters I was referring to that time.

      • by dirk ( 87083 )

        How is this Microsoft's fault? This is like saying Google is responsible for people leaving their AWS S3 buckets open because you could find them via Google. These files were always open and available, which is the issue. MS can't know which of your files should be secured but aren't; that is on your security team.

    • by jezwel ( 2451108 )

      That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

      It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

      Sounds like you haven't seen Microsoft Delve in action. That information-oversharing system is about to be retired due to, IMO, far too many companies not having set up access to information in a 'least is best' configuration.

    • by micheas ( 231635 )

      That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

      It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

      Or nobody had mentioned the access before. I'm not sure I'd count on employees mentioning that they had read the CEO's emails to determine if it was being used.

  • No One Could Have Possibly Seen This Coming (TM)

    • You joke, but everyone who knows anything has seen this coming for a long time. It has been the advice of many a Microsoft MVP to classify your data prior to rolling out Copilot for this very reason.

      This is really non-news. Any company rolling out Copilot will have considered this.

      As usual, people who just press buttons without understanding what they are doing will, inevitably, be surprised.

  • Obviously, any new features will be grossly insecure, because Microsoft does not understand security or good engineering.

    In the case at hand, there have been _ample_ warnings and reference cases for such things happening. But, of course, Microsoft had to make its own stupid mistakes.

    • Re: (Score:3, Insightful)

      by gweihir ( 88907 )

      The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

      • by PsychoSlashDot ( 207849 ) on Thursday November 21, 2024 @10:09PM (#64963973)

        The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

        Thing is... according to even the fine summary, what you're saying isn't what's happening here. It's not Copilot that's insecure. At least not in this case. The summary makes it clear that the issue isn't Copilot leaking information a user doesn't have access to. Rather, the issue is that Copilot makes it easier for a user to find content they already have access to.

        If a company throws their HR data in a folder somewhere that everyone has access to, that is the root problem, not that a newfangled search index has given everyone the ability to realize that's happened.

        Microsoft should be condemned when they do something wrong. This doesn't happen to be one of those cases, apparently.

        • by gweihir ( 88907 )

          I am completely aware. This is not only a Copilot problem, but Copilot makes it massively easier to find and exploit this common problem, and that _is_ a Copilot issue. Security is not a yes/no game.

          • If your public water fountain is dispensing poison, removing the large "FREE WATER HERE" sign above it is not the solution to the problem. This is absolutely a binary case: fix the permissions, and the whole problem goes away.
            • by gweihir ( 88907 )

              Nope. But people like you are the reason we do not get security problems under control.

          • I assume you pull 'find,' 'grep' and 'cat' off any Linux server you admin, just to avoid making it massively easier to find information exposed by the 'common problem' of bad permissions?
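          To the parent's point, that sweep takes a dozen lines of Python; the root path below is illustrative:

```python
import os
import stat

def world_readable(root):
    """Yield files under root whose mode grants read access to 'other'."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue  # vanished or unreadable; skip it
            if mode & stat.S_IROTH:  # the "other" read bit
                yield path

# /srv is illustrative; the point is that any shell user can already
# do this, so hiding the search tool never was the security boundary.
for path in world_readable("/srv"):
    print(path)
```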
  • If I understand the article correctly, at many companies all employees have permission to read the CEO's emails; they just don't know where to look, and Copilot finds the sensitive emails that the user has access to and presents them.

    Google Workspace (or whatever it has been rebranded to this week) is annoying about prompting you to let their bot suggest ways to reduce permissions, but at least it makes secretaries explicitly masquerade as the CEO instead of just accessing the CEO's emails directly.
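    For reference, a hedged sketch of what that delegation model looks like through the Gmail API; the service-account key, addresses, and scope choice are assumptions, not anything from the parent:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Hypothetical service-account key with domain-wide delegation enabled.
# Scope choice is an assumption: Google's settings scopes cover the
# delegates resource (create/delete additionally needs settings.sharing).
creds = service_account.Credentials.from_service_account_file(
    "sa-key.json",
    scopes=[
        "https://www.googleapis.com/auth/gmail.settings.basic",
        "https://www.googleapis.com/auth/gmail.settings.sharing",
    ],
).with_subject("ceo@example.com")  # act as the mailbox owner

gmail = build("gmail", "v1", credentials=creds)

# Each delegate is an explicit, auditable, per-mailbox grant -- the
# opposite of an org-wide "allow all" on the mail store.
listing = gmail.users().settings().delegates().list(userId="me").execute()
for d in listing.get("delegates", []):
    print(d["delegateEmail"], d.get("verificationStatus"))
```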

"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell

Working...