Tech companies have pledged to become "water positive" by 2030, but the water they replenish is often returned in a different location from where it was extracted. The expansion of data centres has also faced backlash over environmental concerns. Alternatives that reduce water use include air-based cooling systems and optimising AI models and algorithms, and companies such as Google DeepMind and Digital Realty are already implementing these strategies. However, the demand for fast AI response times makes it impractical to build data centres in cooler countries at scale.
Key takeaways:
- The computer clusters powering AI chatbots like ChatGPT require four times more water than previously estimated, with 10 to 50 queries consuming about two litres of water, according to a study from the University of California, Riverside.
- AI servers generate substantial heat because of their high power density and data-processing demands, necessitating water-based cooling systems. Much of this water is lost to evaporation during cooling, and it must be of drinking quality to avoid damaging the servers.
- Big Tech companies like Google, Microsoft, and Meta have reported double-digit increases in water consumption. While they have pledged to become "water positive" by 2030, the water they return is often not in the same place it was taken from, potentially exacerbating water stress in some areas.
- There are ways to reduce water consumption in data centres, such as using air-based cooling systems, optimising AI models and algorithms, and distributing workloads to locations with better water efficiency (a rough back-of-envelope sketch follows this list). However, there is currently no requirement for data centres to report their water consumption.
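To make these figures concrete, here is a rough back-of-envelope sketch in Python. It turns the study's "about two litres per 10 to 50 queries" range into a per-query estimate and compares a hypothetical daily query volume across regions with different assumed water-efficiency multipliers. The daily volume and the regional multipliers are illustrative assumptions, not figures from the article or the study.

```python
# Back-of-envelope sketch only: the 2-litres-per-10-50-queries figure comes from
# the UC Riverside estimate cited above; the daily query volume and the regional
# water-efficiency multipliers below are illustrative assumptions, not reported data.

LITRES_PER_BATCH = 2.0               # ~2 litres of water...
QUERIES_LOW, QUERIES_HIGH = 10, 50   # ...per 10-50 queries (cited range)

# Hypothetical relative water efficiency by region (1.0 = baseline; lower means
# less water per unit of compute, e.g. more air cooling or a cooler climate).
REGION_WATER_FACTOR = {
    "baseline_region": 1.0,
    "air_cooled_region": 0.4,
    "cool_climate_region": 0.6,
}


def water_range_litres(queries: int, region_factor: float = 1.0) -> tuple[float, float]:
    """Estimated (low, high) litres of water for a given number of queries."""
    per_query_low = LITRES_PER_BATCH / QUERIES_HIGH   # ~0.04 L per query
    per_query_high = LITRES_PER_BATCH / QUERIES_LOW   # ~0.20 L per query
    return (queries * per_query_low * region_factor,
            queries * per_query_high * region_factor)


if __name__ == "__main__":
    daily_queries = 100_000_000  # assumed daily volume, for illustration only
    for region, factor in REGION_WATER_FACTOR.items():
        low, high = water_range_litres(daily_queries, factor)
        print(f"{region:>20}: {low / 1e6:5.1f}-{high / 1e6:5.1f} million litres/day")
```

Even this crude model shows how strongly total water use depends on both the per-query estimate and where the workload runs, which is one reason consumption reporting matters.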