

Samsung bans its employees from using ChatGPT & Other AI tools

By Ankita

Samsung has banned its employees from using generative AI tools such as ChatGPT due to security risks. The company earlier suffered a major setback when sensitive data was accidentally leaked by a few employees while using ChatGPT. To prevent such incidents, Samsung has issued a new memo banning employees from using AI tools, citing the security risks to confidential and personal data.


Key Points: 

  • Samsung employees are banned from using generative AI tools such as ChatGPT, Bing, and other tools. 
  • Security risks, such as data breaches, are considered the main reason behind the ban on generative AI tools. 
  • An internal survey conducted by the company last month found that around 65% of respondents believe the use of AI tools carries security risks. 

Samsung bans the use of generative AI tools such as ChatGPT

Samsung has banned its employees from using popular generative AI tools such as OpenAI’s ChatGPT, Bing, and others. Employees of the South Korean consumer electronics company were informed of the new policy through a memo reviewed by Bloomberg. Data transmission to AI platforms such as ChatGPT, Bard, and Bing is cited as the main reason for the ban. Because the transmitted data is stored on external servers, it is difficult to retrieve and delete, and private or confidential data could end up being disclosed to other users, which worries Samsung. 

According to the Bloomberg report, the company conducted an internal survey last month. The results indicate that around 65% of respondents believe using AI tools such as ChatGPT and other services carries security risks. 

A Samsung staff member stated, “Interest in generative AI platforms such as ChatGPT has been growing extensively both internally and externally.”

“The appeal of AI tools such as ChatGPT comes from their beneficial capabilities and efficiency; however, these generative AI tools also carry certain security risks,” the staff member added. 

According to the memo, the new policy was introduced after Samsung engineers accidentally leaked internal source code by uploading it to OpenAI’s chatbot ChatGPT a few weeks earlier. 

According to a report by Korean media, Samsung employees had uploaded corporate secrets to OpenAI’s ChatGPT. One employee copied the source code of a semiconductor database download program, while another uploaded program code used to identify defective equipment. A third employee tried to auto-generate meeting minutes by uploading meeting records to the AI chatbot. 

Employees have been instructed not to share any confidential or personal company data when using personal devices to access generative AI tools like ChatGPT, since doing so could lead to an accidental leak of the company’s intellectual property and data.

The company emphasizes that failure to comply with the new policy could lead to termination. The new rules, however, do not affect the company’s devices sold to consumers, such as Windows laptops and smartphones. 

Samsung is creating its own AI tools

Reportedly, Samsung has been developing its own AI tools that can be used for a variety of functions such as software development, document summarization, and translation. 

“HQ is evaluating security measures to make the environment completely secure for using generative AI tools to improve productivity,” the memo said. 

“Until the measures are fully ready, the company will restrict the usage of AI tools,” the memo added.
