The company also noted that while GPT-4 can provide niche information, access to information alone is not sufficient to create a biological threat. The study did not test how GPT-4 could assist in the physical construction of such a threat, and OpenAI stressed the need for further research and community deliberation on the issue. Despite the slight increase in ease of planning a bioweapon with AI assistance, OpenAI maintains that the risk is minimal.
Key takeaways:
- OpenAI conducted a study on GPT-4's potential to aid in creating a bioweapon, concluding that it poses only a slight risk.
- The study was conducted in response to concerns raised in President Biden's AI Executive Order about AI lowering the barrier for creating biological weapons.
- The study involved 100 participants, including biology experts and students, who were asked to draft a plan for a bioweapon using either the internet alone or the internet plus GPT-4. Access to GPT-4 produced only a slight increase in the accuracy and completeness of the plans.
- OpenAI emphasizes that while GPT-4 can provide niche information, access to information alone is insufficient to create a biological threat, and more research is needed in this area.