To address these challenges, the industry is focusing on improving compute efficiency to reduce power use, a strategy known as "more work per watt". However, this alone is not enough. Other solutions include expanding transmission lines, using predictive software to reduce grid failures, and exploring alternative cooling methods for servers. Companies like Apple, Samsung, and Qualcomm are also promoting on-device AI to keep power-hungry queries off the cloud and out of data centers.
Key takeaways:
- The boom in artificial intelligence has led to a surge in data center construction, prompting concerns about whether the U.S. can generate enough electricity for widespread AI adoption and whether the aging grid can handle the load.
- Efforts to reduce power use in data centers include improving compute efficiency, but efficiency gains alone cannot close the gap: a single ChatGPT query uses nearly 10 times as much energy as a typical Google search.
- Demand for data centers is expected to grow 15%-20% per year through 2030, by which point they are expected to account for 16% of total U.S. power consumption.
- Generative AI data centers will also require significant amounts of water for cooling, with estimates suggesting they will need 4.2 billion to 6.6 billion cubic meters of water withdrawal by 2027.
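To put the growth projection above in perspective, a quick compounding calculation (not from the article; the 2024 starting year and baseline of 1.0 are assumptions for illustration) shows what 15%-20% annual growth through 2030 implies:

```python
def compound(base: float, rate: float, years: int) -> float:
    """Project demand after `years` of constant annual growth at `rate`."""
    return base * (1 + rate) ** years

# Assuming a 2024 baseline of 1.0 and six years of growth to 2030:
low = compound(1.0, 0.15, 6)   # ~2.31x current demand
high = compound(1.0, 0.20, 6)  # ~2.99x current demand
```

Even the low end of the projection implies data-center demand more than doubling in six years, which is why efficiency improvements alone are unlikely to offset the added load.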