    Microsoft Azure Leak: A Deep Dive into the Unsecured Storage Incident

    In a surprising turn of events, Microsoft’s AI research division inadvertently exposed a staggering 38TB of sensitive data through unsecured Azure storage. The incident dates back to July 2020, when the division was contributing open-source AI models to a public GitHub repository. Uncovered almost three years later by cloud security firm Wiz, the leak stemmed from a Microsoft employee sharing the URL for a misconfigured Azure Blob Storage container holding confidential information.

    Unveiling the Root Cause

    At the heart of the matter was Microsoft’s use of an excessively permissive Shared Access Signature (SAS) token, which granted full control over the shared files. Although this Azure feature is intended for secure data sharing, Wiz researchers noted that such tokens are difficult to monitor and revoke.

    When used appropriately, SAS tokens provide a secure mechanism for granting delegated access to resources within a storage account, with precise control over which resources a client can reach, what permissions it holds, and how long the token remains valid.
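    As an illustration, here is a minimal sketch of issuing a narrowly scoped, short-lived token with the azure-storage-blob Python SDK. The account name, key, and container below are placeholders, not details from the incident.

    ```python
    from datetime import datetime, timedelta, timezone

    from azure.storage.blob import ContainerSasPermissions, generate_container_sas

    # Placeholder details -- substitute your own storage account values.
    ACCOUNT_NAME = "examplestorageacct"
    ACCOUNT_KEY = "<account-key>"
    CONTAINER = "public-models"

    # Issue a read/list-only token that expires in 24 hours, rather than a
    # full-control token with a far-off expiry (the misconfiguration at the
    # root of the leak).
    sas_token = generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=CONTAINER,
        account_key=ACCOUNT_KEY,
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=24),
    )

    sas_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}"
    print(sas_url)
    ```

    A read-only token with a 24-hour expiry limits the blast radius if the URL ever leaks, which is precisely the safeguard missing in this case.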

    Without proper monitoring and governance, however, SAS tokens become a security risk, and their use calls for restraint. Microsoft’s failure to offer a centralized way to manage these tokens within the Azure portal exacerbated the issue. Moreover, such tokens can be configured to remain valid effectively forever, which led Wiz to caution that they are unsafe for external sharing.
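    To make the risk concrete, the sketch below parses a SAS URL’s query string and flags long-lived or overly broad tokens. The field names (`se` for expiry, `sp` for permissions) follow the documented SAS query-string format; the URL and the seven-day policy threshold are illustrative.

    ```python
    from datetime import datetime, timedelta, timezone
    from urllib.parse import parse_qs, urlparse

    def audit_sas_url(url: str, max_lifetime: timedelta = timedelta(days=7)) -> list[str]:
        """Collect warnings about risky settings in a SAS URL."""
        params = parse_qs(urlparse(url).query)
        warnings = []

        # 'se' carries the token's expiry timestamp (ISO 8601, UTC).
        expiry_values = params.get("se")
        if not expiry_values:
            warnings.append("no expiry ('se') parameter found")
        else:
            expiry = datetime.fromisoformat(expiry_values[0].replace("Z", "+00:00"))
            if expiry - datetime.now(timezone.utc) > max_lifetime:
                warnings.append(f"token valid until {expiry}, beyond the {max_lifetime} policy")

        # 'sp' lists the granted permissions, e.g. 'r' = read, 'w' = write,
        # 'd' = delete, 'c' = create, 'l' = list.
        permissions = params.get("sp", [""])[0]
        for risky in "wdc":
            if risky in permissions:
                warnings.append(f"token grants '{risky}' permission; prefer read-only for sharing")

        return warnings

    # Hypothetical URL with full permissions and a decades-long expiry:
    for warning in audit_sas_url(
        "https://example.blob.core.windows.net/data?sp=racwd&se=2051-10-01T00:00:00Z&sig=..."
    ):
        print("WARNING:", warning)
    ```

    A check like this could run in CI or on a schedule, catching tokens like the one in this incident, which was reportedly configured to expire decades in the future.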

    The Extent of the Exposure

    The research team at Wiz delved into the incident and made a disconcerting discovery. Beyond the open-source models, the internal storage account inadvertently granted access to a further 38TB of private data, including backups of personal information belonging to Microsoft employees: passwords for Microsoft services, secret keys, and an archive of more than 30,000 internal Microsoft Teams messages from 359 Microsoft employees.

    Microsoft’s Response and Resolution

    In response, the Microsoft Security Response Center (MSRC) team issued an advisory on Monday asserting that no customer data had been exposed and that no other internal services were compromised by the incident. Wiz reported the issue to MSRC on June 22, 2023, and the SAS token was revoked on June 24, 2023, blocking all external access to the Azure storage account and effectively mitigating the exposure.

    The Future of AI and Data Security

    This incident sheds light on the significant role AI now plays in the tech industry. As data scientists and engineers race to deploy new AI solutions, the sheer volume of data they handle demands heightened security measures. The technology relies heavily on extensive datasets for training, yet managing and securing those datasets is challenging, particularly when development teams must manipulate and share massive amounts of data for collaboration or open-source projects. The Microsoft incident serves as a crucial lesson, underscoring the growing complexity of data security in the era of AI.

    Stay updated about the latest technological developments and reviews by following TechTalk, and connect with us on Twitter, Facebook, Google News, and Instagram. For our newest video content, subscribe to our YouTube channel.

    Read More: HP Spectre Fold: The Epitome of Elegance in Foldable Computing
