Sutskever, who left OpenAI in May after hinting at a new project, is joined at SSI by co-founders Daniel Gross, a former AI lead at Apple, and Daniel Levy, a former member of technical staff at OpenAI. The company's first product will be safe superintelligence, and it will not pursue any other projects until that goal is achieved. The launch comes as other OpenAI staff have resigned citing safety concerns.
Key takeaways:
- Ilya Sutskever, co-founder and former chief scientist of OpenAI, is launching a new AI company called Safe Superintelligence Inc. (SSI), focused on creating a safe and powerful AI system.
- SSI describes itself as a startup that advances safety and capabilities in tandem, and its business model is designed to insulate safety, security, and progress from short-term commercial pressures.
- SSI is co-founded by Daniel Gross, former AI lead at Apple, and Daniel Levy, a former member of the technical staff at OpenAI. Sutskever led a push to oust OpenAI CEO Sam Altman last year and left OpenAI in May.
- While OpenAI pursues partnerships with Apple and Microsoft, SSI's first product will be safe superintelligence, and the company will not pursue anything else until that is achieved, according to Sutskever.