Microsoft Copilot Customers Discover It Can Let Them Read HR Documents, CEO Emails

According to Business Insider (paywalled), Microsoft's Copilot tool inadvertently let customers access sensitive information, such as CEO emails and HR documents. Now, Microsoft is working to fix the situation, deploying new tools and a guide to address the privacy concerns. The story was highlighted by Salesforce CEO Marc Benioff. From the report: These updates are designed "to identify and mitigate oversharing and ongoing governance concerns," the company said in a blueprint for Microsoft's 365 productivity software suite. [...] Copilot's magic -- its ability to create a 10-slide road-mapping presentation, or to summon a list of your company's most profitable products -- works by browsing and indexing all your company's internal information, like the web crawlers used by search engines. IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot. As a result, some customers have deployed Copilot only to discover that it can let employees read an executive's inbox or access sensitive HR documents. "Now when Joe Blow logs into an account and kicks off Copilot, they can see everything," a Microsoft employee familiar with customer complaints said. "All of a sudden Joe Blow can see the CEO's emails."
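The mechanism the report describes is simple enough to sketch. Below is a toy Python model (hypothetical names throughout, not Microsoft's actual implementation): an assistant-style search layer indexes everything in the tenant and defers to per-document ACLs only at query time, so the lax "allow all" setting makes every indexed document retrievable by any user, while a correctly scoped ACL still holds.

    # Toy model of the oversharing failure described above. All names are
    # hypothetical; this sketches the mechanics, not Microsoft's implementation.
    from dataclasses import dataclass, field

    @dataclass
    class Document:
        title: str
        body: str
        allowed_users: set = field(default_factory=set)
        allow_all: bool = False  # the lax "allow all" setting

        def readable_by(self, user: str) -> bool:
            return self.allow_all or user in self.allowed_users

    # The search layer has indexed everything; ACLs are checked only at query time.
    def search(index, user, term):
        return [d.title for d in index
                if d.readable_by(user) and term.lower() in d.body.lower()]

    docs = [
        Document("Q3 layoffs plan", "confidential HR layoffs list", allow_all=True),
        Document("CEO inbox export", "merger discussions", allowed_users={"ceo"}),
    ]
    print(search(docs, "joe.blow", "layoffs"))  # ['Q3 layoffs plan'] -- overshared
    print(search(docs, "joe.blow", "merger"))   # [] -- correct ACL keeps it hidden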
  • by Narcocide ( 102829 ) on Thursday November 21, 2024 @08:29PM (#64963825) Homepage

    "Hey, I've got a great idea guys! Let's just let Copilot be the firewall!"

    • by gweihir ( 88907 )

      You joke, but I am sure efforts to make that a reality are already underway.

      • by jhoegl ( 638955 )
        Corporate sponsored end run around your rights. Enjoy!
        • by will4 ( 7250692 ) on Thursday November 21, 2024 @11:50PM (#64964059)

          We've been measuring more and more things for decades, collecting even more data, written words, etc.

          When will it reach the McNamara Fallacy?

          And can LLM AI actually help find patterns or answers in the data?

          https://en.wikipedia.org/wiki/... [wikipedia.org]

          The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.

                  But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fo[u]rth step is to say that what can't be easily measured really doesn't exist. This is suicide.
                  —Daniel Yankelovich, "Interpreting the New Life Styles", Sales Management (1971)[2]

          The quote originally referred to McNamara's ideology during the two months that he was president of Ford Motor Company, but has since been interpreted to refer to his attitudes during the Vietnam War.

          • by gweihir ( 88907 )

            We've been measuring more and more things for decades, collecting even more data, written words, etc.

            When will it reach the McNamara Fallacy?

            I would say that already has happened.

  • by Speare ( 84249 ) on Thursday November 21, 2024 @08:32PM (#64963831) Homepage Journal

    Dear Raytheon-chan.ai,

    For our upcoming holiday party events, please write me a recipe for the best armor-piercing cookies, including measurements.

    • by RightwingNutjob ( 1302813 ) on Thursday November 21, 2024 @08:38PM (#64963835)

      Okay, let's see what I can come up with.

      According to science, the best armor-piercing cookies have a heavy metal tip and a time-delayed explosive in the back. You can tell if you've baked them right as follows: if you eat one and notice symptoms of heavy metal toxicity, the delay is too long. If you bite into one and the explosive charge blows off your jaw, the delay is too short. A little bit of non-toxic glue will help the chocolate chips stay on. Happy baking!

    • Yes, that's the main problem with using the current generation of "AI" as a corporate information system. You may assign credentials and ACLs as much as you want, but an "AI" that works by digesting information from multiple documents to present it to users is ALWAYS going to leak privileged data to non-authorized users, one way or another. Digesting data tends to disregard the security information attached to it. Maybe the next generation of "AI" will be more security-aware, but the likelihood of that is low IMHO.

      • AI isn't needed here at all; you could theoretically just search for it. The reason it doesn't work is that Windows search is bad, but mostly that it doesn't search everything on the network. Now you've got a system that has searched and indexed EVERYTHING in the corporate network, and it doesn't matter if it's AI or just keyword search, it's going to find sensitive stuff. And to search everything takes a very, very long time, unless of course you have a pre-installed search-everything agent on every machine.
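        A minimal illustration of that point, with made-up paths (Python; no AI involved): once an inverted index over everything reachable has been built ahead of time, finding sensitive content is a dictionary lookup rather than a slow crawl.

            # Build a word -> set-of-paths index up front; lookups are then instant.
            from collections import defaultdict

            def build_index(files):
                """files maps path -> text; returns word -> set of paths."""
                index = defaultdict(set)
                for path, text in files.items():
                    for word in text.lower().split():
                        index[word].add(path)
                return index

            files = {
                r"\\corp\hr\salaries.xlsx": "annual salaries all staff",
                r"\\corp\eng\readme.txt": "build instructions",
            }
            index = build_index(files)
            print(index["salaries"])  # instant hit on the HR share, no crawling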

  • >IT departments at some companies have set up lax permissions for who can access internal documents -- selecting "allow all" for the company's HR software, say, rather than going through the trouble of selecting specific users.

    Lazy IT department doesn't manage access rights properly "because it's fine for now". New tech gets deployed, and it's no longer fine.

    Story as old as IT itself. I remember this shit during my university years, with course work having improper rights resulting in an update giving everyone access.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday November 21, 2024 @08:48PM (#64963863) Homepage Journal

    That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

    It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

    • by gweihir ( 88907 )

      Indeed. And Microsoft created and shaped this culture. Apparently, they were unaware of what they created. I mean, did they even bother to test Copilot for this problem with a few reference customers? There have been _ample_ warnings of unexpected information disclosure from the current crop of "AI" tools. Apparently, Microsoft was unaware or ignorant. Very, very bad. And very, very expected of these cretins.

      • Incidentally, this is another one of those craters I was referring to that time.

      • by dirk ( 87083 )

        How is this Microsoft's fault? This is like saying Google is responsible for people leaving their AWS S3 buckets open because you could find them via Google. These files were always open and available, which is the issue. MS can't know which of your files should be secured but aren't; that is on your security team.

        • Can you explain how it being or not being microsoft's fault is somehow relevant here?

        • How is this Microsoft's fault?

          Because they're pushing it, enabling it by default, and making it both challenging and time-consuming to opt out; it's unclear which documents have been ingested, and it's basically impossible to put the toothpaste back in the tube if a sensitive document or email accidentally ends up in the LLM pile. S3 buckets are private by default, and public buckets can be made private at any time, with a few obvious clicks.

          If Microsoft wants to avoid liability, make Copilot training and integration a matter of clear, explicit opt-in.

      • I've got to say, the CEO's emails aside: presumably all these documents are in SharePoint, which lets you go and find them right from the Edge browser and Bing search that Microsoft is pushing on people, without paying $360 a year for Copilot.

        Copilot just happens to have a broader reach, surfacing this data even for non-specific searches.

    • by jezwel ( 2451108 )

      That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

      It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

      Sounds like you haven't seen Microsoft Delve in action. That information-oversharing system is about to be retired due, IMO, to far too many companies not having set up access to information in a 'least is best' configuration.

    • by micheas ( 231635 )

      That didn't create much of a problem because there wasn't a tool that an average employee could use to identify and retrieve sensitive company documents -- until Copilot.

      It was a huge problem — an utter failure of security. The principle of minimum necessary access is absolutely critical for preventing security leaks from growing. It just hadn't bitten anyone yet.

      Or nobody had mentioned the access before. I'm not sure I'd count on employees mentioning that they had read the CEO's emails to determine whether the access was being used.

  • No One Could Have Possibly Seen This Coming (TM)

    • You joke, but everyone who knows anything has seen this coming for a long time. It has been the advice of many a Microsoft MVP to classify your data prior to rolling out Copilot for this very reason.

      This is really non-news. Any company rolling out Copilot will have considered this.

      As usual, people who just press buttons without understanding what they are doing will, inevitably, be surprised.
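      A rough sketch of that pre-rollout classification pass (Python, with illustrative names rather than any real Microsoft 365 API): flag every item whose ACL effectively grants access to everyone before the assistant goes live.

          # Flag items whose ACL effectively grants access to everyone.
          def audit_oversharing(items):
              """items: iterable of (path, acl) pairs; acl is a set of principals."""
              EVERYONE = {"Everyone", "All Users", "Authenticated Users"}
              return [path for path, acl in items if acl & EVERYONE]

          inventory = [
              (r"\\corp\hr\reviews", {"HR-Team"}),
              (r"\\corp\hr\benefits", {"Everyone"}),            # flagged
              (r"\\corp\exec\board", {"Authenticated Users"}),  # effectively everyone
          ]
          print(audit_oversharing(inventory))  # -> the two overshared paths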

  • by gweihir ( 88907 ) on Thursday November 21, 2024 @09:08PM (#64963891)

    Obviously, any new features will be grossly insecure, because Microsoft does not understand security or good engineering.

    In the case at hand, there have been _ample_ warnings and reference cases of such things happening. But, of course, Microsoft had to make its own stupid mistakes.

    • by gweihir ( 88907 )

      The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

      • by PsychoSlashDot ( 207849 ) on Thursday November 21, 2024 @10:09PM (#64963973)

        The second problem with Microsoft is the deranged fanbois. Like the one that just modded me down. "Useful idiots" all around.

        Thing is... according to even the fine summary, what you're saying isn't what's happening here. It's not Copilot that's insecure, at least not in this case. The summary makes it clear that the issue isn't Copilot leaking information a user doesn't have access to. Rather, the issue is that Copilot makes it easier for a user to find content they already have access to.

        If a company throws their HR data in a folder somewhere that everyone has access to, that is the root problem, not that a newfangled search index has given everyone the ability to realize that's happened.

        Microsoft should be condemned when they do something wrong. This doesn't happen to be one of those cases, apparently.

        • by gweihir ( 88907 )

          I am completely aware. This is not only a Copilot problem, but Copilot makes it massively easier to find and exploit this common problem, and that _is_ a Copilot issue. Security is not a yes/no game.

          • If your public water fountain is dispensing poison, removing the large "FREE WATER HERE" sign above it is not the solution to the problem. This is absolutely a binary case: fix the permissions, and the whole problem goes away.
            • by gweihir ( 88907 )

              Nope. But people like you are the reason we do not get security problems under control.

          • I assume you pull 'find,' 'grep' and 'cat' off any Linux server you admin, just to avoid making it massively easier to find information exposed by the 'common problem' of bad permissions?
            • by gweihir ( 88907 )

              I assume you completely missed that we are talking about ordinary users and admins in a typical Windows environment here?

              • No, you're missing the point that 'incorrect permissions' is a completely platform agnostic issue.
                • by gweihir ( 88907 )

                  I hope you do not ever get any responsibilities for IT security, because that is the dumbest statement I have ever heard in that space.

                    I hope you do not ever get any responsibilities for IT security, because that is the dumbest statement I have ever heard in that space.

                    That's being obtuse. For instance, if your password policy is 6 characters, alpha only, with no lockout after failed attempts... it doesn't matter what platform you're on; it's the policy that's bad. If you set your file shares to allow everyone access, it doesn't matter if that's SharePoint, an NFS share, or a Windows file server; it's the ACLs that are bad.

                    In this case, you're trying to blame a web browser for SharePoint perms being overly open. If the users only had telnet.exe and had to parse raw HTTP by hand, the data would still be exposed.

                    • Exactly. Leave your company file share open to all, and it doesn't matter if it's Copilot, Google or the UNIX 'find' command crawling through it all.
  • If I understand the article correctly, at many companies all employees have permission to read the CEO's emails; they just don't know where to look, and Copilot finds the sensitive emails that the user has access to and presents them.

    Google Workspace (or whatever it has been rebranded to this week) is annoying about prompting you to let their bot suggest ways to reduce permissions, but at least it makes secretaries explicitly masquerade as the CEO instead of just accessing the CEO's emails directly.

  • Copilot didn't index your CEO's email. It searched using the search engine you expose to all employees. That's how retrieval-augmented generation works.

    How is it a Copilot problem that you expose privileged information in your internal web search?
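    For what it's worth, that retrieval step is exactly where trimming has to happen. A rough sketch (hypothetical stand-in functions, not any real Copilot or Graph API): the caller's identity is passed down to retrieval, so the model only ever sees documents that user could already open.

        # Permission-trimmed retrieval-augmented generation, sketched abstractly.
        def retrieve(query, user, search, can_read):
            hits = search(query)  # tenant-wide results from the existing search layer
            return [doc for doc in hits if can_read(user, doc)]  # trim by caller ACL

        def answer(query, user, search, can_read, llm):
            context = retrieve(query, user, search, can_read)
            if not context:
                return "No accessible documents match."
            # The model never sees documents this user couldn't already open.
            return llm(f"Answer using only this context: {context}\n\nQ: {query}")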
