
Advocates urge Chuck Schumer to tackle AI’s climate impact

Sep 12, 2023 - theverge.com
A coalition of environmental, tech, and anti-hate-speech groups has written to Senator Chuck Schumer, urging him to address the environmental impact of AI technology. The letter, signed by organizations including Amazon Employees for Climate Justice and Greenpeace USA, calls for companies to disclose the environmental impact of developing energy-intensive AI models and to prevent the spread of climate change disinformation via AI. The coalition also raised concerns about the energy consumption of large language models (LLMs) and the potential for AI to fuel disinformation campaigns.

The letter argues that companies should be held accountable for the environmental damage caused by their AI models and should be able to explain how those models create content. Senator Schumer has been pushing for AI regulation, including by organizing "AI Insight Forums". Other industries, such as Bitcoin mining, are also facing scrutiny over their carbon footprints, with proposals for mandatory reporting of climate impact and greenhouse gas emissions.

Key takeaways:

  • A coalition of environmental, tech, and anti-hate-speech groups has written to Senator Chuck Schumer demanding policies to address the environmental impact of AI and to prevent AI-aided climate change disinformation.
  • The letter, signed by groups including Amazon Employees for Climate Justice and Greenpeace USA, calls for companies to disclose the environmental impact of developing energy-intensive AI models and to publicly report the energy consumption and greenhouse gas emissions from the entire life cycle of their AI models.
  • The coalition expressed concern about the energy use of large language models (LLMs) and the potential for them to be used to spread climate disinformation, slowing efforts to combat climate change.
  • The letter also suggests that companies and executives should be held liable for environmental harm caused by generative AI, and that they should be able to explain to regulators and the public how their AI models create content and measure accuracy.