Nvidia rivals focus on building a different kind of chip to power AI products

Nov 19, 2024 - financialpost.com
The article discusses the rise of companies focused on building AI inference chips, which run AI tools more efficiently than the graphics processing units (GPUs) currently used. These chips are designed to reduce the significant computing costs of generative AI. While Nvidia has dominated the market with its GPUs, rivals such as AMD and Intel, along with startups like Cerebras, Groq, and d-Matrix, are pitching more inference-friendly chips.

AI inference chips handle the day-to-day running of AI tools, taking in new information and drawing on what the AI system already knows to produce a response. The article suggests that these chips could be a more cost-effective option for businesses that want to use AI technology without building their own AI infrastructure. It also notes their potential to reduce the environmental and energy costs associated with running AI.

Key takeaways:

  • Rivals of Nvidia are focusing on building AI inference chips, which are more efficient at running AI tools and designed to reduce some of the huge computing costs of generative AI.
  • Startups like Cerebras, Groq and d-Matrix, as well as traditional chipmaking rivals such as AMD and Intel, are pitching more inference-friendly chips.
  • d-Matrix, which is launching its first product this week, sees a big market in AI inference, comparing that later stage of machine learning to how human beings apply the knowledge they acquired in school.
  • Better-designed chips could bring down the huge costs of running AI for businesses, and also reduce the environmental and energy costs borne by everyone else.