Microsoft Privacy Security

Microsoft Employees Exposed Own Company's Internal Logins (vice.com) 8

Multiple people who appear to be employees of Microsoft have exposed sensitive login credentials to the company's own infrastructure on GitHub, potentially offering attackers a gateway into internal Microsoft systems, according to a cybersecurity research firm that found the exposed credentials. Motherboard reports: "We continue to see that accidental source code and credential leakages are part of the attack surface of a company, and it's becoming more and more difficult to identify in a timely and accurate manner. This is a very challenging issue for most companies these days," Mossab Hussein, chief security officer at cybersecurity firm spiderSilk, which discovered the issue, told Motherboard in an online chat. Hussein provided Motherboard with seven examples in total of exposed Microsoft logins. All of these were credentials for Azure servers. Azure is Microsoft's cloud computing service, similar to Amazon Web Services. All of the exposed credentials were associated with an official Microsoft tenant ID. A tenant ID is a unique identifier linked to a particular set of Azure users. One of the GitHub users also listed Microsoft on their profile.

Three of the seven login credentials were still active when spiderSilk discovered them, with one seemingly uploaded just days before the time of writing. The other four sets of credentials were no longer active but still highlighted the risk of workers accidentally uploading keys for internal systems. Microsoft refused to elaborate on what systems the credentials were protecting when asked multiple times by Motherboard. But generally speaking, an attacker may have an opportunity to move on to other points of interest after gaining initial access to an internal system. One of the GitHub profiles with exposed and active credentials makes a reference to the Azure DevOps code repository. Highlighting the risk that such credentials may pose, in an apparently unrelated hack in March attackers gained access to an Azure DevOps account and then published a large amount of Microsoft source code, including for Bing and Microsoft's Cortana assistant.
"We've investigated and have taken action to secure these credentials," said a Microsoft spokesperson in a statement. "While they were inadvertently made public, we haven't seen any evidence that sensitive data was accessed or the credentials were used improperly. We're continuing to investigate and will continue to take necessary steps to further prevent inadvertent sharing of credentials."


Comments Filter:
  • by bloodhawk ( 813939 ) on Tuesday August 16, 2022 @08:01PM (#62795341)
    This will always be a problem; some developers need a baseball bat to the head before any sense sinks in. We had a set of Drupal developers who complained to the execs about our strict security policies, claiming they could manage security better and deliver faster deployments for their Drupal sites if it were turned over to them. An idiot exec agreed and handed it over. In the VERY first week of them having it, our scans picked up a spreadsheet of all the admin and service account user IDs and passwords, which they had put both in GitHub and on the bloody Drupal site itself.
    • by toddz ( 697874 )
      Calling them Drupal developers seems generous...
    • It's a challenging problem. People aren't good at this, so we need tools to help. GitHub, IDEs, libraries, linters, etc. all need to act in unison to prevent this sort of thing.

      Microsoft has a tool, CredScan, that helps catch this, and it gets run on their repos frequently. Clearly it wasn't enough.

      • Yep, and we use all of those tools plus a lot more, hence we caught it within a day of it happening. We also have developer security training, and they are all required to take developer security courses. This was actually a pretty good incident for us at the time, as the only consequence for the company was the sites being taken offline for 2 days while security audits were conducted and the sites were redeployed securely. The lead dev had his contract cancelled and his team were sent back to redo security training. On the
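
The scanning tools mentioned in the thread above generally work by pattern-matching file contents against known credential shapes. A minimal sketch of that idea is below; the regex patterns are illustrative assumptions for this example, not the actual rule sets shipped by CredScan, gitleaks, or GitHub's secret scanning, which are far larger and more carefully tuned.

```python
import re

# Illustrative credential patterns (assumptions for this sketch only):
# real scanners maintain hundreds of provider-specific, tuned rules.
PATTERNS = {
    "azure_storage_key": re.compile(r"AccountKey=[A-Za-z0-9+/=]{40,}"),
    "generic_password": re.compile(r"(?i)password\s*[:=]\s*['\"][^'\"]{8,}['\"]"),
    "github_token": re.compile(r"ghp_[A-Za-z0-9]{36}"),
}


def scan(text: str) -> list[tuple[int, str]]:
    """Return (line_number, rule_name) pairs for every suspected secret."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                hits.append((lineno, name))
    return hits


if __name__ == "__main__":
    sample = 'conn = "AccountKey=' + "A" * 44 + '"\npassword = "hunter2hunter2"'
    for lineno, rule in scan(sample):
        print(f"line {lineno}: possible {rule}")
```

In practice a check like this is wired into a pre-commit hook or CI pipeline so a flagged secret blocks the push; as the incident above shows, catching the leak before it reaches a public repo matters far more than catching it a day later.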
