
Controlling AI’s Growing Energy Needs – Communications of the ACM

Dec 01, 2024 - news.bensbites.com
The energy required to train artificial intelligence (AI) models is becoming a significant concern due to its exponential growth and contribution to global warming. Training a single large language model (LLM) such as GPT-3, which powers ChatGPT, consumes energy equivalent to the annual consumption of 130 American homes. The demand is driven largely by the power-hungry graphics processing units (GPUs) used for training. Alternatives to GPUs are being explored to reduce AI training's energy footprint, including neuromorphic computers that mimic the human brain's energy-efficient processing and optical computers that transmit information with light.
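As a rough sanity check on that 130-home comparison, here is a minimal back-of-the-envelope calculation. The per-home figure is an assumption (roughly 10,500 kWh per year, a typical U.S. average); the article itself gives only the home-equivalence, not the total.

```python
# Rough estimate of GPT-3 training energy implied by the article's
# 130-home comparison. The per-home consumption is an assumption
# (~10,500 kWh/year, a typical U.S. average); the article only
# states the 130-home equivalence.

HOMES = 130
KWH_PER_HOME_PER_YEAR = 10_500  # assumed average U.S. household consumption

training_energy_kwh = HOMES * KWH_PER_HOME_PER_YEAR
print(f"Implied training energy: {training_energy_kwh:,} kWh "
      f"(~{training_energy_kwh / 1e6:.2f} GWh)")
# Implied training energy: 1,365,000 kWh (~1.37 GWh)
```

That lands in the low gigawatt-hour range, which is consistent with published estimates for training a model of GPT-3's scale.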

Neuromorphic computers, which process many streams of information simultaneously as the human brain does, are being developed to reduce the energy consumed during learning, though programming them is difficult because their hardware and software are tightly intertwined. Optical computers, which use light particles (photons) to carry data, offer faster computation and lower energy losses. Both approaches, however, will take time to develop and adopt. In the meantime, researchers are working to make current models more energy-efficient, for example by using smaller, fine-tuned models that can outperform larger ones on specific tasks.
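A quick sketch of why smaller models help so much: training compute is often approximated as about 6 × parameters × tokens (a common rule of thumb, not a figure from the article), so energy scales roughly linearly with model size at a fixed dataset. All the hardware numbers below are illustrative assumptions.

```python
# Back-of-the-envelope comparison of training energy for a large vs. a
# small model, using the common ~6 * N * D approximation for training
# FLOPs. Every number here is an illustrative assumption, not a figure
# from the article.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs (forward + backward) as 6 * N * D."""
    return 6 * params * tokens

def energy_kwh(flops: float, gpu_flops_per_sec: float, gpu_watts: float) -> float:
    """Convert FLOPs to energy, assuming one sustained GPU throughput figure."""
    seconds = flops / gpu_flops_per_sec
    return seconds * gpu_watts / 3.6e6  # joules -> kWh

GPU_FLOPS = 100e12  # assumed sustained throughput, ~100 TFLOP/s
GPU_WATTS = 400     # assumed power draw per GPU

# Illustrative scales: a 175B-parameter model vs. a 1.3B-parameter model,
# both trained on 300B tokens.
for name, params in [("large (175B params)", 175e9), ("small (1.3B params)", 1.3e9)]:
    e = energy_kwh(training_flops(params, 300e9), GPU_FLOPS, GPU_WATTS)
    print(f"{name}: ~{e:,.0f} kWh")
# large (175B params): ~350,000 kWh
# small (1.3B params): ~2,600 kWh
```

Under these assumptions the small model costs two orders of magnitude less energy to train, which is the intuition behind preferring smaller, fine-tuned models where they suffice.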

Key takeaways:

  • The energy required to train artificial intelligence (AI) models, particularly large language models (LLMs) such as GPT-3, which powers ChatGPT, is a growing concern because of its contribution to global warming. The power needed for AI training has been doubling roughly every 3.4 months since 2012 (see the quick calculation after this list).
  • Alternatives to power-hungry graphics processing units (GPUs) are being explored to reduce the energy footprint of AI training. One such alternative is neuromorphic computing, which mimics the human brain's energy-efficient processing capabilities.
  • Another emerging technology is optical computing, which uses light waves to transmit information. This approach is faster and more energy-efficient than traditional computing methods. Lightmatter, a computer hardware company, is developing hybrid solutions that integrate optical components into silicon chips.
  • While new computing approaches are promising, they will take time to develop and adopt. In the meantime, researchers are exploring ways to make current AI models more energy-efficient, such as using smaller, fine-tuned models that outperform larger ones in certain cases.
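To make the growth rate in the first bullet concrete: a quantity that doubles every 3.4 months doubles 12/3.4 ≈ 3.5 times per year, a factor of roughly 11.5x annually.

```python
# Growth implied by the 3.4-month doubling time cited above.
# doublings per year = 12 / 3.4; annual growth factor = 2 ** doublings

DOUBLING_MONTHS = 3.4

per_year = 2 ** (12 / DOUBLING_MONTHS)
print(f"Growth per year: ~{per_year:.1f}x")            # ~11.5x
print(f"Growth over 5 years: ~{per_year ** 5:,.0f}x")  # ~200,000x
```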