
    Samsung Employees Share Sensitive Data with ChatGPT

    TechRadar reported on Tuesday, April 4, that workers at Samsung’s semiconductor arm inadvertently leaked confidential company data to ChatGPT on at least three separate occasions. Samsung had lifted its ban on ChatGPT roughly three weeks earlier, a restriction that might have prevented the leaks had it stayed in place.

    ChatGPT’s Assistance in Samsung’s Development Process:

    According to Korean media, two of the incidents involved troubleshooting faulty code. In one, a Samsung employee asked ChatGPT to help diagnose source code from a faulty semiconductor database. In the other, an employee shared confidential code with ChatGPT while seeking a fix for defective equipment.

    ChatGPT Saves Data Automatically:

    In the third case, an employee reportedly fed the contents of an entire meeting into ChatGPT to generate meeting minutes. OpenAI saves data from ChatGPT prompts to improve its AI models unless users opt out. However, the service is “not able to delete specific prompts,” as OpenAI notes in its overview of how the service works.

    Cyberhaven, a data detection and response platform provider, revealed in a research blog post that “as of March 21, 8.2% of employees have used ChatGPT in the workplace and 6.5% have pasted company data into it since it launched” – a notable potential security risk.

    Samsung’s AI Program Development:

    Samsung Semiconductor is developing an artificial intelligence program for in-house use, but it restricts prompts to a maximum of 1,024 bytes. The move follows the accidental leaks of confidential company data to ChatGPT by Samsung employees.
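
    As a rough illustration of how such a cap could be enforced, the sketch below checks a prompt’s UTF-8 byte length before it is sent. Samsung has not published details of its internal tool, so the function name and error handling here are assumptions, not a description of the actual system.

        MAX_PROMPT_BYTES = 1024  # the reported cap for Samsung's in-house AI tool

        def check_prompt_size(prompt: str) -> str:
            """Hypothetical pre-submission check: reject prompts whose
            UTF-8 encoding exceeds the byte limit."""
            size = len(prompt.encode("utf-8"))
            if size > MAX_PROMPT_BYTES:
                raise ValueError(
                    f"Prompt is {size} bytes; the limit is {MAX_PROMPT_BYTES} bytes."
                )
            return prompt

        # Example: a short question passes, a pasted source file would not.
        check_prompt_size("Summarize the error log from last night's test run.")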

     

    The use of chatbots like ChatGPT in the workplace is growing as companies look for ways to improve efficiency and productivity. However, the trend also creates security risks, especially when sensitive company data is involved.

    OpenAI’s Warning on Data Sharing:

    OpenAI has previously warned users against sharing sensitive information with ChatGPT, and caution is warranted when using the service. Companies must also put strict policies and guidelines in place so that employees understand the risks of using chatbots like ChatGPT.

    It is essential to have robust security measures in place to prevent data leaks and breaches. Companies should invest in advanced security tools and technologies, provide training to employees on data protection, and enforce strict policies on the use of chatbots and other AI tools.
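
    One practical way to enforce such a policy is a pre-submission filter that blocks prompts containing obviously sensitive material. The patterns and function below are illustrative assumptions only, not a description of any particular product:

        import re

        # Illustrative patterns an organization might treat as sensitive.
        SENSITIVE_PATTERNS = [
            re.compile(r"(?i)\bconfidential\b"),
            re.compile(r"(?i)\binternal use only\b"),
            re.compile(r"\b[A-Z]{2,}-\d{4,}\b"),  # e.g. internal ticket or part numbers
        ]

        def is_safe_to_submit(prompt: str) -> bool:
            """Return False if the prompt matches any sensitive pattern."""
            return not any(p.search(prompt) for p in SENSITIVE_PATTERNS)

        # Example: warn or block before the text ever reaches an external chatbot.
        if not is_safe_to_submit("Attached is the CONFIDENTIAL yield report for line 3."):
            print("Blocked: prompt appears to contain sensitive data.")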

     

    Read More: Samsung Galaxy Tab S9 Faster and More Powerful Than Galaxy S23?
