

Many people now turn to ChatGPT and other generative A.I. chatbots for everything from weather updates and cooking tips to math help and relationship advice. But that kind of blanket usage may be doing more harm than good, warns Sasha Luccioni, A.I. and climate lead at Hugging Face, an open-source A.I. platform.
“I think we really have this obligation to not just be like, ‘Oh yeah, I’m going to use ChatGPT for everything and anything,’” Luccioni said during a keynote yesterday (July 9) at the AI for Good Summit in Geneva.
She emphasized that A.I. chatbots “cannot replace therapists,” nor are they “made to do math.” And relying on them for such tasks could consume “10 or 100,000 times more energy” than simpler tools, she added.
As demand for A.I. grows, so does its environmental footprint. Data centers are devouring more electricity and water, fueling backlash from nearby communities. In Memphis, Tenn., local environmental groups have opposed gas turbines installed by xAI to power its chatbot Grok, citing concerns about air pollution.
Still, Luccioni doesn’t think the answer is to stop using A.I. chatbots altogether. Rather, she urges people to think more critically about when and how they use them. “Thinking about why we’re using A.I. and what’s the best usage of a finite resource of our planetary boundaries is really, really important,” she said.
Training one LLM emits as much carbon as 500 New York-London flights
During her talk, Luccioni outlined the cascading environmental impacts of A.I. systems. Training a single large language model, she noted, can emit as much carbon as 500 flights between New York and London. But the damage doesn’t stop there. As demand grows, the electricity and water required to power and cool data centers are also surging. The A.I. supply chain brings additional strain: hardware relies on critical minerals like cobalt and germanium, which are often mined in environmentally stressed regions and shipped across borders for production.
Still, Luccioni emphasized that A.I. isn’t all bad news for the planet. She highlighted how targeted, small-scale A.I. tools are already helping conservationists and researchers fight climate change. One of her favorite examples, she said, is a project by Rainforest Connection, a nonprofit that hides thousands of solar-powered phones throughout the Amazon. These devices run lightweight A.I. models that listen to the jungle, identify species, and detect illegal logging. She also pointed to A.I. tools that help track endangered species, spot methane leaks invisible to the human eye, and accelerate discoveries in materials science that could lead to greener batteries and solar panels.
Even so, Luccioni warned that A.I.’s ripple effects extend beyond its “tangible material environmental impacts.” As users replace analog or lower-tech digital tools with A.I. at home and work, they may use the time saved to travel, shop or consume more—indirectly increasing their carbon footprints.
These secondary effects are part of what Luccioni calls the “Jevons paradox” of A.I.: as tools become faster and cheaper, usage rises, driving up the total environmental cost. A.I., she said, becomes “a commodity we just can’t get enough of.” “For the CEO of Microsoft, this is a net benefit,” she added. “But what if you look at the cost of A.I.? What if you look at the energy needed by A.I.?”
To Luccioni, building truly sustainable A.I. will take more than efficiency gains or lower emissions. It requires a deeper reckoning with how these systems reshape society and who holds the power to shape them.
As she put it: “In order to be truly sustainable, A.I. has to respect social justice, respect economic incentives, and respect the environment.”