
Companies are struggling to keep private data safe from generative AI, Cisco says

Jan 25, 2024 - fastcompany.com
A new report from Cisco reveals that over a quarter of companies have banned the use of generative AI tools like ChatGPT at work due to privacy and security concerns. The survey, which polled 2,600 privacy and security professionals, found that 63% of respondents have limited the data employees can input into these systems, and 61% have restricted which AI tools can be used. The main concern is the potential for employees to unintentionally leak private company data to third parties like OpenAI, which could then use the data to train its AI models.

Despite the restrictions, 62% of respondents admitted to entering information about internal processes into generative AI tools, and 42% have entered non-public company information. Another leading worry among professionals is that AI companies are using public data to train their models in ways that infringe on their businesses' intellectual property. The survey results suggest that addressing these privacy risks is a top priority for most companies, with many welcoming legislation that would enforce privacy protections.

Key takeaways:

  • More than one in four companies have banned the use of generative AI tools at work due to privacy concerns, according to a report from Cisco.
  • 63% of companies have limited what data their employees can enter into AI systems, and 61% have restricted which generative AI tools employees can use.
  • Companies are concerned about the potential for employees to inadvertently leak private company data to third parties like OpenAI, which could then use that data to train its AI models.
  • Despite these concerns, the survey found that 62% of respondents have entered information about internal processes into generative AI tools, and 42% have entered non-public company information into these tools.
