While efforts are being made to improve AI efficiencies, de Vries warns of Jevons' Paradox, where increased efficiency often leads to increased demand, resulting in a net increase in resource use. He cites Google's incorporation of generative AI into its services as an example, estimating that if every Google search used AI, it would require about 29.2 TWh of electricity a year, equivalent to Ireland's annual electricity consumption. Despite current high costs and supply chain bottlenecks, AI server production is expected to grow rapidly, potentially increasing AI-related electricity consumption to levels comparable to countries like the Netherlands, Argentina, and Sweden.
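As a rough sanity check, the sketch below reproduces the kind of back-of-envelope arithmetic behind the 29.2 TWh figure. Both inputs are illustrative assumptions (a daily search volume of about 9 billion and roughly 8.9 Wh per AI-assisted search); neither number appears in the summary above.

```python
# Back-of-envelope estimate: annual electricity use if every Google search used AI.
# Assumed inputs for illustration only (not figures from the article):
SEARCHES_PER_DAY = 9e9       # assumed daily Google search volume
WH_PER_AI_SEARCH = 8.9       # assumed electricity per AI-assisted search, in Wh

daily_wh = SEARCHES_PER_DAY * WH_PER_AI_SEARCH   # Wh consumed per day
annual_twh = daily_wh * 365 / 1e12               # Wh per year converted to TWh

print(f"Estimated annual consumption: {annual_twh:.1f} TWh")
# -> about 29 TWh, on the order of Ireland's annual electricity consumption
```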
Key takeaways:
- Artificial intelligence (AI) has a large energy footprint that could eventually exceed the electricity demand of some countries.
- Both training AI tools and running them to generate outputs are energy intensive. For instance, ChatGPT could consume around 564 MWh of electricity a day.
- Despite efforts to improve AI efficiencies, greater efficiency tends to spur greater demand, which can result in a net increase in resource use, a phenomenon known as Jevons' Paradox.
- By 2027, worldwide AI-related electricity consumption could grow by 85 to 134 TWh annually, comparable to the annual electricity consumption of countries like the Netherlands, Argentina, and Sweden (see the rough comparison sketched after this list).
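To put those takeaways in rough perspective, the sketch below converts ChatGPT's assumed 564 MWh/day into an annual figure and compares the projected 85 to 134 TWh range against approximate national consumption levels. The country figures are ballpark assumptions added for illustration, not values from the summary above.

```python
# Rough context for the 2027 projection.
PROJECTED_AI_RANGE_TWH = (85, 134)   # projected added AI electricity use by 2027

# Assumed approximate annual national electricity consumption, in TWh.
approx_country_twh = {
    "Netherlands": 110,
    "Argentina": 125,
    "Sweden": 130,
}

low, high = PROJECTED_AI_RANGE_TWH
for country, twh in approx_country_twh.items():
    status = "within" if low <= twh <= high else "outside"
    print(f"{country}: ~{twh} TWh -> {status} the {low}-{high} TWh projection")

# ChatGPT alone at 564 MWh/day scales to roughly 0.2 TWh per year.
chatgpt_annual_twh = 564 * 365 / 1e6   # MWh/day -> TWh/year
print(f"ChatGPT at 564 MWh/day: ~{chatgpt_annual_twh:.2f} TWh per year")
```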