
ChatGPT uses 17,000 times as much electricity daily as the average US household: report

Mar 10, 2024 - businessinsider.com
OpenAI's chatbot, ChatGPT, reportedly uses over half a million kilowatt-hours of electricity daily to respond to some 200 million requests, according to The New Yorker. That is more than 17,000 times the electricity used by the average US household. The AI industry's overall electricity consumption is difficult to estimate because of variability in how large AI models are run and a lack of transparency from Big Tech companies about their energy use.

Data scientist Alex de Vries predicts that by 2027, the entire AI sector could consume between 85 and 134 terawatt-hours annually, potentially accounting for half a percent of global electricity consumption. This is significantly higher than the electricity usage of tech giants like Samsung, Google, and Microsoft. OpenAI has not yet responded to requests for comment on these findings.

Key takeaways:

  • ChatGPT, OpenAI's chatbot, uses over half a million kilowatt-hours of electricity daily, more than 17,000 times the amount used by an average US household.
  • If generative AI is widely adopted, it could significantly increase electricity consumption. For instance, if Google integrated generative AI into every search, it would consume about 29 billion kilowatt-hours a year.
  • Data scientist Alex de Vries estimates that by 2027, the entire AI sector could consume between 85 and 134 terawatt-hours annually, potentially accounting for half a percent of global electricity consumption.
  • Estimating the exact electricity consumption of the AI industry is challenging due to the variability in AI models and the lack of transparency from Big Tech companies about their energy use.
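The headline ratio can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch in Python, assuming an average US household consumes roughly 10,500 kWh per year (an EIA-style ballpark not stated in the article) and a global electricity consumption of about 25,500 TWh per year (also an outside assumption):

```python
# Sanity-check the article's figures. The household and global-consumption
# baselines below are assumptions, not numbers from the article.

CHATGPT_KWH_PER_DAY = 500_000          # "over half a million kilowatt-hours"
REQUESTS_PER_DAY = 200_000_000         # "some 200 million requests"
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365   # assumed EIA-style average, ~28.8 kWh/day
GLOBAL_TWH_PER_YEAR = 25_500           # assumed rough global total

# Ratio of ChatGPT's daily usage to one household's daily usage.
ratio = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
print(f"ChatGPT vs. household: ~{ratio:,.0f}x")        # lands near 17,000

# Implied energy per request, in watt-hours.
wh_per_request = CHATGPT_KWH_PER_DAY / REQUESTS_PER_DAY * 1_000
print(f"Energy per request: ~{wh_per_request:.1f} Wh")

# De Vries's upper estimate (134 TWh/year) as a share of the assumed
# global total, to check the "half a percent" claim.
share = 134 / GLOBAL_TWH_PER_YEAR * 100
print(f"AI sector share of global electricity: ~{share:.2f}%")
```

Under these assumed baselines the ratio comes out a little above 17,000 and the de Vries upper bound lands near half a percent, consistent with the article's claims.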
