
From quantum AI to photonics, what OpenAI’s latest hire tells us about its future

Mar 13, 2024 - theregister.com
OpenAI has hired Ben Bartlett, a former quantum systems architect at PsiQuantum, possibly signaling a move towards quantum computing. Quantum computing could improve the efficiency of training large AI models, allowing them to derive more accurate answers from fewer parameters. This could be particularly beneficial for models like GPT-4, which is rumored to have over a trillion parameters. Quantum optimization algorithms could also help determine which features to keep in or drop from AI training datasets, resulting in leaner, more accurate models.
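The feature-selection idea above is typically phrased as a QUBO (quadratic unconstrained binary optimization) problem, the class that quantum annealers such as D-Wave's target. The article gives no specifics, so the following is a hypothetical sketch with made-up relevance and redundancy scores, solved by classical brute force purely for illustration:

```python
# Hypothetical sketch: dataset feature selection as a QUBO.
# The relevance/redundancy numbers below are illustrative assumptions,
# not from the article. A quantum annealer would search this same
# energy landscape; here we brute-force it classically.
from itertools import product

# relevance[i]: how useful feature i is on its own (reward for keeping it)
relevance = [0.8, 0.6, 0.9, 0.3]

# redundancy[i][j]: penalty for keeping both of two correlated features
redundancy = [
    [0.0, 0.5, 0.1, 0.0],
    [0.5, 0.0, 0.4, 0.0],
    [0.1, 0.4, 0.0, 0.2],
    [0.0, 0.0, 0.2, 0.0],
]

def qubo_energy(bits):
    """Energy = -(relevance of kept features) + (redundancy among them)."""
    e = -sum(r * b for r, b in zip(relevance, bits))
    n = len(bits)
    for i in range(n):
        for j in range(i + 1, n):
            e += redundancy[i][j] * bits[i] * bits[j]
    return e

# Exhaustive search over all 2^n keep/drop assignments. This exponential
# blow-up at realistic n is exactly what quantum hardware hopes to tame.
best = min(product([0, 1], repeat=len(relevance)), key=qubo_energy)
print(best)  # → (1, 0, 1, 1): drop feature 1, which is redundant with 0 and 2
```

The lowest-energy assignment keeps the two strong features and the weak-but-independent one, while dropping the feature whose information overlaps with others — the "leaner, more accurate" dataset the article alludes to.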

In the longer term, companies like D-Wave are exploring the use of quantum processing units (QPUs) in the training process and the application of quantum computing to sampling. French quantum computing startup Pasqal is also looking at offloading the graph-structured datasets common in neural networks to quantum hardware. However, this would require quantum systems to become significantly larger and faster. Bartlett's expertise in silicon photonics, a technology that could overcome bandwidth limits and scale machine-learning performance, may also be of interest to OpenAI.

Key takeaways:

  • OpenAI has hired Ben Bartlett, a former quantum systems architect at PsiQuantum, potentially signaling a move towards quantum computing to improve AI model efficiency.
  • Quantum computing could improve the efficiency of training large AI models, allowing them to derive more accurate answers from fewer parameters.
  • Quantum algorithms can be used to optimize AI training datasets for specific requirements, resulting in leaner, more accurate models.
  • OpenAI might also be interested in silicon photonics, a technology that Bartlett has expertise in, which could help overcome bandwidth limits and scale machine learning performance.
