
OpenAI Employees Warn of Advanced AI Dangers

Jun 05, 2024 - macrumors.com
Several current and former employees of OpenAI and Google DeepMind have issued an open letter warning about the dangers of advanced AI and the lack of oversight of the companies developing it. The letter highlights risks such as the entrenchment of existing inequalities, manipulation and misinformation, and loss of control of autonomous AI systems, which could lead to human extinction. The employees call on AI companies to offer solid whistleblower protections and to avoid creating or enforcing agreements that suppress criticism over risk-related concerns.

The letter was signed by 13 employees: seven former OpenAI employees, four current OpenAI employees, one former Google DeepMind employee, and one current Google DeepMind employee. The signatories claim that OpenAI has threatened employees with loss of vested equity for speaking up and requires them to sign strict non-disclosure agreements that prevent criticism. The letter comes as Apple is set to announce multiple AI-powered features for iOS 18 and other software updates, and has signed a deal with OpenAI to integrate ChatGPT features into iOS 18.

Key takeaways:

  • Current and former employees of OpenAI and Google DeepMind have written an open letter warning about the dangers of advanced AI and the lack of oversight of companies working on AI technology.
  • The letter highlights risks such as entrenchment of existing inequalities, manipulation and misinformation, and loss of control of autonomous AI systems, which could potentially lead to human extinction.
  • The employees are calling on AI companies to offer solid whistleblower protections, including creating a verifiably anonymous process for raising risk-related concerns and supporting a culture of open criticism.
  • OpenAI has been accused of threatening employees with loss of vested equity for speaking up and of requiring them to sign strict non-disclosure agreements that prevent criticism.