Data scientist Alex de Vries predicts that by 2027, the entire AI sector could consume between 85 and 134 terawatt-hours annually, potentially accounting for half a percent of global electricity consumption. This is significantly higher than the annual electricity usage of tech giants like Samsung, Google, and Microsoft. OpenAI has not yet responded to requests for comment on these findings.
Key takeaways:
- ChatGPT, OpenAI's chatbot, uses over half a million kilowatt-hours of electricity daily, more than 17,000 times the daily usage of an average US household (a rough arithmetic check of these figures appears after this list).
- If generative AI is widely adopted, it could significantly increase electricity consumption. For instance, if Google integrated generative AI into every search, those searches alone would consume about 29 billion kilowatt-hours a year.
- Data scientist Alex de Vries estimates that by 2027, the entire AI sector could consume between 85 and 134 terawatt-hours annually, potentially accounting for half a percent of global electricity consumption.
- Estimating the exact electricity consumption of the AI industry is challenging due to the variability in AI models and the lack of transparency from Big Tech companies about their energy use.
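The headline ratios above can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope check, not part of the original reporting: the average-household figure (about 29 kWh per day) and the global consumption figure (on the order of 25,000 TWh per year) are assumed reference values, not numbers from the article.

```python
# Back-of-envelope check of the figures cited above.
# Assumed reference values (not from the article): an average US household
# uses roughly 29 kWh of electricity per day, and global electricity
# consumption is on the order of 25,000 TWh per year.

CHATGPT_DAILY_KWH = 500_000   # "over half a million kWh" per day
HOUSEHOLD_DAILY_KWH = 29      # assumed average US household, per day
GLOBAL_ANNUAL_TWH = 25_000    # assumed global consumption, per year

# ChatGPT's daily draw expressed in average-household equivalents.
household_ratio = CHATGPT_DAILY_KWH / HOUSEHOLD_DAILY_KWH
print(f"ChatGPT daily use is about {household_ratio:,.0f}x an average US household")
# -> roughly 17,000x, matching the first takeaway

# De Vries's 2027 projection for the AI sector as a share of global consumption.
for sector_twh in (85, 134):
    share = sector_twh / GLOBAL_ANNUAL_TWH
    print(f"{sector_twh} TWh is about {share:.2%} of global consumption")
# -> roughly 0.3% to 0.5%, consistent with the "half a percent" figure
```

Under these assumptions, the upper end of the 85 to 134 TWh range works out to about half a percent of global electricity use, which is where the headline figure comes from.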