
Meet 'Groq,' the AI Chip That Leaves Elon Musk’s Grok in the Dust

Feb 20, 2024 - gizmodo.com
Groq, an AI chip company, has developed technology that could significantly speed up AI chatbots like ChatGPT, Gemini, and Grok. The company's Language Processing Units (LPUs) are reportedly faster than Nvidia's Graphics Processing Units (GPUs), currently the industry standard for running AI models. Groq's LPUs can reportedly produce 247 tokens/second, versus 18 tokens/second for ChatGPT on Microsoft's infrastructure, potentially making chatbots more than 13 times faster.
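The "more than 13 times" figure follows directly from the two throughput numbers quoted above, as a quick sanity check shows:

```python
# Speedup implied by the article's figures: 247 tokens/s on Groq's
# LPUs vs. 18 tokens/s quoted for ChatGPT on Microsoft's infrastructure.
groq_tps = 247
baseline_tps = 18
speedup = groq_tps / baseline_tps
print(f"{speedup:.1f}x")  # prints "13.7x", matching the "more than 13 times" claim
```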

Groq's technology doesn't replace chatbots but helps them run faster, making them more practical for real-world use. The company's founder, Jonathan Ross, previously co-founded Google's AI chip division. Despite the buzz around Groq's technology, it's still unclear whether its AI chips can scale the way Nvidia's GPUs or Google's TPUs have.

Key takeaways:

  • Groq, an AI chip company, claims to provide the world's fastest large language models, with third-party tests suggesting this claim might be valid.
  • Groq's AI chips, called Language Processing Units (LPUs), are reportedly faster than Nvidia’s Graphics Processing Units (GPUs), which are generally seen as the industry standard for running AI models.
  • Groq's LPUs help chatbots like ChatGPT, Gemini, and Grok run incredibly fast but do not replace them. Groq's chips could make these AI chatbots significantly more useful by enabling them to keep up with real-time human speech.
  • Before founding Groq, the company's CEO, Jonathan Ross, co-founded Google's AI chip division. Ross claims that Groq's LPUs bypass two bottlenecks that slow GPUs and CPUs on large language models: compute density and memory bandwidth.
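The memory-bandwidth bottleneck Ross mentions can be illustrated with a standard back-of-envelope estimate (the numbers below are illustrative assumptions, not from the article): autoregressive decoding must stream every model weight from memory for each generated token, so single-stream throughput is capped at roughly memory bandwidth divided by model size.

```python
# Illustrative back-of-envelope estimate (hypothetical numbers, not
# from the article): generating one token requires reading all model
# weights once, so single-stream decode speed is bounded above by
# memory bandwidth / model size in bytes.
model_params = 70e9          # hypothetical 70B-parameter model
bytes_per_param = 2          # fp16/bf16 weights
model_bytes = model_params * bytes_per_param

memory_bandwidth = 3.35e12   # ~3.35 TB/s, roughly a modern HBM3 GPU
max_tokens_per_s = memory_bandwidth / model_bytes
print(f"~{max_tokens_per_s:.0f} tokens/s upper bound")  # prints "~24 tokens/s upper bound"
```

This kind of ceiling is one reason an architecture that keeps weights in much faster on-chip memory, as Groq's LPU design is reported to do, can decode single streams faster than a GPU with the same raw compute.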
