
We asked Intel to define 'AI PC'. Its reply: 'Anything with our latest CPUs'

Mar 12, 2024 - theregister.com
Intel's senior director of technical marketing, Robert Hallock, has clarified that a PC qualifies as an "AI PC" if it includes a GPU, a processor with a neural processing unit (NPU), and support for the VNNI (Vector Neural Network Instructions) and DP4a dot-product instructions. These features are present in Intel's current-generation desktop processors, such as the 14th-gen Core. Hallock stated that AI PCs are not a separate brand or spec, but rather what all PCs will be like in a few years. He also noted that while any PC can run AI workloads, those without an NPU will do so significantly more slowly.
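For context on the instruction-level part of that definition: on x86, VNNI support can be probed from software through the CPUID feature bits Intel documents (AVX512-VNNI in leaf 7 subleaf 0, ECX bit 11; AVX-VNNI in leaf 7 subleaf 1, EAX bit 4). The sketch below is a minimal illustration assuming a GCC or Clang toolchain on an x86 machine; it is not Intel's qualification test, and DP4a, being a GPU instruction, is not visible through CPUID at all.

```c
/* Minimal sketch: probe x86 CPUID for VNNI support (GCC/Clang).
 * Bit positions follow Intel's documented CPUID leaves:
 *   AVX512-VNNI = CPUID.(EAX=7,ECX=0):ECX[11]
 *   AVX-VNNI    = CPUID.(EAX=7,ECX=1):EAX[4]
 * DP4a runs on the GPU, so it cannot be detected this way. */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 7, subleaf 0: AVX512-VNNI lives in ECX bit 11. */
    if (__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx))
        printf("AVX512-VNNI: %s\n", (ecx >> 11) & 1 ? "yes" : "no");

    /* Leaf 7, subleaf 1: AVX-VNNI (the 256-bit variant found on
     * recent client parts) lives in EAX bit 4. */
    if (__get_cpuid_count(7, 1, &eax, &ebx, &ecx, &edx))
        printf("AVX-VNNI:    %s\n", (eax >> 4) & 1 ? "yes" : "no");

    return 0;
}
```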

Hallock predicts that AI will become as essential to the PC experience as graphics cards are today. He expects AI to be used in more profound ways, such as security systems that detect tampering with system memory, or CRM systems that generate summaries of customer relationships. He also anticipates AI automating tasks like minute-taking and email follow-ups, potentially saving businesses significant time. To benefit from these advancements, however, businesses won't need to buy specialized AI PCs; they'll just need the latest PCs.

Key takeaways:

  • An "AI PC" is defined by Intel's senior director of technical marketing, Robert Hallock, as a PC that includes a GPU, a processor with a neural processing unit, and can handle VNNI and Dp4a instructions.
  • Intel does not consider "AI PC" to be a brand or a specific category, but rather the standard for what PCs will be like in the near future.
  • AI PCs are not expected to have specific requirements for memory, storage, or I/O speeds, but for large language models or generative AI, a neural processing unit is considered essential.
  • Hallock predicts that AI will become an integral part of the PC experience, similar to how graphics cards have become essential, and will be used in a variety of applications, from security to CRM systems.
