
Microsoft AI researchers accidentally exposed terabytes of internal sensitive data | TechCrunch

Sep 18, 2023 - techcrunch.com
Microsoft AI researchers accidentally exposed around 38 terabytes of sensitive data on GitHub, including private keys, passwords, and personal backups of two employees' computers. The data was exposed through a misconfigured URL whose access token granted permissions on the entire storage account rather than read-only access. The URL, which had been exposing the data since 2020, was found by cloud security startup Wiz as part of its research into accidental exposure of cloud-hosted data.

Wiz shared its findings with Microsoft on June 22, and the overly permissive shared access signature (SAS) token was revoked two days later. Microsoft completed its investigation into the potential organizational impact on August 16, stating that no customer data was exposed and no other internal services were at risk. As a result of Wiz’s research, Microsoft has expanded GitHub’s secret scanning service to include any SAS token that may have overly permissive expirations or privileges.
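The misconfiguration comes down to two fields embedded in a SAS URL: the permission set (`sp`) and the expiry time (`se`). A minimal stdlib sketch of auditing those fields is shown below; the account name, container, and URL are made up for illustration, and a real audit would cover more SAS parameters:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlsplit, parse_qs

# Hypothetical over-permissive SAS URL (account and container names are invented).
SAS_URL = (
    "https://exampleaccount.blob.core.windows.net/container/models.ckpt"
    "?sv=2021-08-06&sp=racwdl&se=2051-10-01T00:00:00Z&sig=abc123"
)

def audit_sas(url: str) -> list[str]:
    """Return warnings for an over-permissive or long-lived SAS token."""
    params = parse_qs(urlsplit(url).query)
    warnings = []

    # 'sp' lists granted permissions; anything beyond 'r' (read) and
    # 'l' (list) means write/add/create/delete rights were handed out.
    perms = params.get("sp", [""])[0]
    if set(perms) - set("rl"):
        warnings.append(f"permissions '{perms}' exceed read-only")

    # 'se' is the expiry timestamp; very distant expirations keep a
    # leaked token usable for years, as happened here.
    expiry = params.get("se", [""])[0]
    if expiry:
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        if exp - datetime.now(timezone.utc) > timedelta(days=30):
            warnings.append(f"token valid until {expiry} (very long-lived)")

    return warnings
```

A read-only, short-lived token (e.g. `sp=r` expiring in hours) would pass this check; the leaked token failed on both counts, combining broad permissions with a multi-year lifetime.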

Key takeaways:

  • Microsoft AI researchers accidentally exposed tens of terabytes of sensitive data, including private keys and passwords, on GitHub due to a misconfigured URL.
  • The exposed data included 38 terabytes of sensitive information, including personal backups of two Microsoft employees’ personal computers, passwords to Microsoft services, secret keys, and over 30,000 internal Microsoft Teams messages.
  • The storage account wasn’t directly exposed, but an overly permissive shared access signature (SAS) token in the URL allowed access to the data.
  • Microsoft has since revoked the SAS token and expanded GitHub’s secret scanning service to monitor all public open-source code changes for plaintext exposure of credentials and other secrets.
