To mitigate AI's environmental impact, companies are focusing on improving AI efficiency and sustainability. SAP, for instance, operates its data centers on 100% renewable electricity and optimizes its machine-learning models for energy efficiency. Other strategies include using smaller AI models, capping power usage during training, and employing more efficient processing chips. The World Economic Forum suggests that such measures can significantly reduce energy consumption without greatly extending task completion times. Google, for example, is developing more efficient processors; it reports that its latest Tensor chips are 67% more efficient than the previous generation.
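As an illustration of what power capping can look like in practice, the sketch below uses NVIDIA's `nvidia-smi` tool to lower a GPU's power limit before a training run. The 250 W cap and the GPU index are hypothetical values chosen for the example; the article does not describe any specific vendor's implementation.

```python
import subprocess


def cap_gpu_power(limit_watts: int, gpu_index: int = 0) -> None:
    """Lower the power limit of one GPU via nvidia-smi (requires admin rights).

    The limit passed in is an illustrative figure, not a value taken from
    the article.
    """
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(limit_watts)],
        check=True,
    )


if __name__ == "__main__":
    # Cap GPU 0 at a hypothetical 250 W before launching a training job.
    # Per the measures the World Economic Forum describes, such caps trade
    # a small increase in task completion time for energy savings.
    cap_gpu_power(250, gpu_index=0)
```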
Key takeaways:
- Global data centers consume significant electricity, estimated at around 200 terawatt-hours annually, and this demand is growing due to AI advancements.
- The U.S. government and tech companies are investing heavily in new data centers and alternative energy sources, including nuclear and renewable energy, to meet increasing power demands.
- Efforts to improve AI efficiency include using renewable energy, optimizing AI models, and employing efficient processing chips to reduce energy consumption.
- Companies are exploring strategies like capping power usage during AI training and halting underperforming models to achieve substantial energy savings (see the sketch after this list).
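To make the last point concrete, here is a minimal, hypothetical sketch of halting an underperforming training run early: if validation loss has not improved for a set number of epochs, training stops and the remaining compute (and its energy) is saved. The patience value and the `train_one_epoch` / `evaluate` callables are stand-ins for illustration, not part of the source article.

```python
def train_with_early_halt(model, train_one_epoch, evaluate,
                          max_epochs=100, patience=5):
    """Stop training once validation loss stops improving.

    `train_one_epoch` and `evaluate` are assumed callables supplied by the
    caller; the patience of 5 epochs is an illustrative choice.
    """
    best_loss = float("inf")
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(model)
        val_loss = evaluate(model)

        if val_loss < best_loss:
            best_loss = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1

        # Halting here avoids spending energy on epochs that are unlikely
        # to improve the model.
        if epochs_without_improvement >= patience:
            print(f"Stopping at epoch {epoch}: no improvement "
                  f"for {patience} epochs")
            break

    return model
```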