
Three Samsung employees reportedly leaked sensitive data to ChatGPT



On the surface, ChatGPT might seem like a tool that could come in useful for an array of work tasks. But before you ask the chatbot to summarize important memos or check your work for errors, it’s worth remembering that anything you share with ChatGPT could be used to train the system and may even surface in its responses to other users. That’s something several Samsung employees should probably have kept in mind before they reportedly shared confidential information with the chatbot.

Soon after Samsung’s semiconductor division started allowing engineers to use ChatGPT, workers leaked secret information to it on at least three occasions, according to reports. One employee reportedly asked the chatbot to check sensitive database source code for errors, another solicited code optimization, and a third fed a recorded meeting into ChatGPT and asked it to generate minutes.

Those reports suggest that, after learning about the security slip-ups, Samsung attempted to limit the extent of future faux pas by restricting the length of employees’ ChatGPT prompts to a kilobyte, or 1,024 characters of text. The company is also said to be investigating the three employees in question and building its own chatbot to prevent similar mishaps. Engadget has contacted Samsung for comment.
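The reports don’t describe how Samsung enforced that cap, but a restriction like it is simple to implement at a corporate gateway that sits between employees and an external chatbot API. The Python sketch below is purely illustrative, with the names MAX_PROMPT_BYTES and is_prompt_allowed invented for the example:

```python
# Illustrative sketch only: the reporting says nothing about Samsung's
# actual implementation. This assumes a gateway that measures each
# outgoing prompt and blocks anything over the reported cap.

MAX_PROMPT_BYTES = 1024  # one kilobyte, the limit reportedly imposed


def is_prompt_allowed(prompt: str) -> bool:
    """Return True if the prompt's UTF-8 encoding fits within the cap."""
    return len(prompt.encode("utf-8")) <= MAX_PROMPT_BYTES


if __name__ == "__main__":
    prompt = "Summarize this meeting recording and draft the minutes."
    if is_prompt_allowed(prompt):
        print("Prompt within the 1 KB limit; it could be forwarded.")
    else:
        print("Prompt blocked: it exceeds the 1 KB limit.")
```

A byte-based check like this is crude: it shortens what employees can paste in one go, but it can’t tell confidential source code from a harmless question, which may be why Samsung is reportedly building its own in-house chatbot instead.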

ChatGPT’s data policy states that, unless users explicitly opt out, their prompts are used to train its models. The chatbot’s owner, OpenAI, urges users not to share secret information with ChatGPT in conversations, as it’s “not able to delete specific prompts from your history.” The only way to get rid of personally identifying information on ChatGPT is to delete your account entirely.

The Samsung saga is another example of why it’s worth exercising caution when sharing data with chatbots, as you perhaps should with all your online activity. You never truly know where your data will end up.


