What happened: A Guardian report looks at the fast-rising energy and water demands of AI-hungry datacentres, and asks the awkward question no keynote speaker wants to hear: should people who care about the environment try to use less generative AI?
Why it matters: The International Energy Agency says datacentre electricity demand is growing four times faster than other sectors and could exceed Japan’s electricity use by 2030 — and that’s before every app decides it needs a “helpful” chatbot stapled to the settings menu.
Wider context: The piece links environmental concerns with the broader backlash against AI’s harms (surveillance, weapons, and general social corrosion), arguing that “opt out” behaviour — cancelling subscriptions, avoiding energy-intensive prompts — can both reduce emissions and apply political pressure.
Background: Researchers and advocates cited in the article emphasise a transparency problem: companies rarely disclose the energy, water, and emissions costs of training and running models. Meanwhile, local communities face real-world impacts from warehouse-scale facilities: 24/7 lighting, constant cooling noise, and competition for power and water.
The environmental cost of datacentres is rising. Is it time to quit AI? — The Guardian
Singularity Soup Take: The AI boom keeps selling itself as “just software” while it quietly manifests as steel sheds that drink electricity and water — and then asks you to feel guilty for leaving a light on. If the industry wants to scale, it can start by paying for the infrastructure it’s using up.
Key Takeaways:
- Datacentres are the bottleneck: The article frames AI’s footprint less as an abstract “cloud” issue and more as physical infrastructure that must be powered and cooled, with impacts that show up in grids, water planning, and neighbourhood quality of life.
- Transparency is still missing: Experts quoted argue it’s difficult to assess the true costs because major tech firms don’t provide clear, standardised disclosures on energy use, water use, and emissions across training and day-to-day inference.
- Opting out is possible (sometimes): While AI is increasingly embedded in products and workplaces, the piece suggests practical reductions — cancelling subscriptions, avoiding energy-heavy use cases like text-to-video, and preferring simpler searches — as a form of personal and collective pressure.
Related News
Nvidia Backs Nscale in $2B Data Center Round — The money trail behind the “AI needs more boxes” era, where infrastructure funding is treated as destiny.
Nvidia’s Inference Pivot Is Really a Power Play for the Next AI Economy — A reminder that the business model is built on running models constantly, not just training them once and calling it a day.