Korean researchers power-shame Nvidia with new neural AI chip — claim 625 times less power draw, 41 times smaller

Mar 09, 2024 - tomshardware.com
Scientists from the Korea Advanced Institute of Science and Technology (KAIST) have unveiled their 'Complementary-Transformer' AI chip at the 2024 International Solid-State Circuits Conference. The C-Transformer chip is touted as the world's first ultra-low-power AI accelerator capable of large language model processing. It reportedly uses 625 times less power and is 41 times smaller than Nvidia's A100 Tensor Core GPU, though no direct comparative performance metrics have been provided.

The C-Transformer chip is fabricated on Samsung's 28nm process and has a die area of 20.25 mm². It operates at a maximum frequency of 200 MHz, consumes under 500 mW, and achieves 3.41 TOPS. The chip's architecture features three main functional blocks: a Homogeneous DNN-Transformer / Spiking-Transformer Core, an Output Spike Speculation Unit, and an Implicit Weight Generation Unit. Despite uncertainties about its performance, the chip is seen as a promising option for mobile computing.
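
As a rough sanity check, the quoted figures can be plugged into a quick back-of-the-envelope calculation. The sketch below uses only the numbers reported above plus Nvidia's published A100 specifications (an 826 mm² die and a 250 to 400 W TDP depending on variant); those A100 numbers are assumptions drawn from public spec sheets, not from the article.

    # Back-of-the-envelope check of the claimed ratios, using the figures
    # quoted in the article plus Nvidia's published A100 specs.
    # Assumption: GA100 die is 826 mm^2; TDP is 400 W (SXM) or 250 W (PCIe).

    c_transformer_power_w = 0.5     # "under 500 mW"
    c_transformer_area_mm2 = 20.25  # Samsung 28nm die area
    c_transformer_tops = 3.41       # peak throughput

    a100_area_mm2 = 826.0           # published GA100 die size (assumption)
    a100_power_w = 400.0            # SXM variant TDP (PCIe variant: 250 W)

    # Implied energy efficiency of the KAIST chip
    print(f"Efficiency: {c_transformer_tops / c_transformer_power_w:.2f} TOPS/W")  # ~6.82

    # How the headline ratios line up against the assumed A100 figures
    print(f"Area ratio:  {a100_area_mm2 / c_transformer_area_mm2:.1f}x")  # ~40.8x
    print(f"Power ratio: {a100_power_w / c_transformer_power_w:.0f}x")    # 800x at 400 W

If those A100 figures are taken at face value, the claimed 41x area reduction lines up almost exactly with the GA100's 826 mm² die, while the 625x power claim implies a comparison point of roughly 312 W, between the PCIe (250 W) and SXM (400 W) variants.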

Key takeaways:

  • The Korea Advanced Institute of Science and Technology (KAIST) has unveiled its 'Complementary-Transformer' AI chip, claimed to be the world's first ultra-low-power AI accelerator capable of large language model (LLM) processing.
  • The C-Transformer chip is said to use 625 times less power and to be 41 times smaller than Nvidia's A100 Tensor Core GPU.
  • The chip's architecture features three main functional blocks: a Homogeneous DNN-Transformer / Spiking-Transformer Core, an Output Spike Speculation Unit, and an Implicit Weight Generation Unit with Extended Sign Compression.
  • Despite uncertainty about the chip's performance, owing to the lack of direct comparisons with industry-standard AI accelerators, it is seen as an attractive option for mobile computing.