
New AI Training Technique Is Drastically Faster, Says Google

Jul 07, 2024 - news.bensbites.com
Google's DeepMind researchers have developed a new method, called Joint Example Selection (JEST), to speed up AI training and reduce the computational resources and time it requires. The approach could make AI development faster and cheaper, and cut into the industry's high energy consumption. In the researchers' experiments, JEST matched or surpassed state-of-the-art models while using up to 13 times fewer training iterations and 10 times less computation.

JEST works by selecting complementary batches of data that maximize an AI model's learnability, rather than selecting individual examples in isolation. Built on multimodal contrastive learning, the method identifies dependencies between data points, improving the speed and efficiency of AI training while requiring less computing power. The technique relies on high-quality, well-curated datasets to guide selection, optimizing training efficiency and showing solid performance gains across various benchmarks.
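The selection idea above can be sketched in simplified form: each candidate example is scored by "learnability," the gap between the current learner's loss and a pretrained reference model's loss, and the highest-scoring examples form the training sub-batch. This is an illustrative approximation only; the function name and the independent top-k selection here are assumptions, whereas the actual JEST algorithm scores whole batches jointly rather than example by example.

```python
import numpy as np

def select_learnable_batch(learner_loss, reference_loss, batch_size):
    """Pick a sub-batch of the most 'learnable' examples.

    learner_loss:   per-example loss under the model being trained
    reference_loss: per-example loss under a pretrained reference model
    An example is most valuable when the learner still finds it hard
    (high learner loss) but the reference model finds it easy
    (low reference loss).
    """
    learnability = learner_loss - reference_loss
    # Indices of the batch_size highest-scoring examples.
    return np.argsort(learnability)[-batch_size:]

# Toy usage: score 8 candidate examples, keep a sub-batch of 3.
rng = np.random.default_rng(0)
learner = rng.uniform(0.5, 2.0, size=8)
reference = rng.uniform(0.1, 1.0, size=8)
chosen = select_learnable_batch(learner, reference, batch_size=3)
print(sorted(chosen.tolist()))
```

In this framing, the curated reference dataset effectively defines what "easy, high-quality" data looks like, and selection steers training toward examples that close the gap to it.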

Key takeaways:

  • Google's DeepMind researchers have developed a new method called JEST (multimodal contrastive learning with joint example selection) that accelerates AI training and reduces the computational resources and time needed.
  • The AI industry is known for its high energy consumption, but JEST could significantly reduce the number of iterations and computational power needed, potentially lowering overall energy consumption.
  • JEST works by selecting complementary batches of data to maximize the AI model's learnability, improving the speed and efficiency of AI training while requiring less computing power.
  • The researchers found that the JEST algorithm quickly discovered highly learnable sub-batches, accelerating training by focusing on sets of data points that "match" together, a technique they refer to as "data quality bootstrapping."
