Artificial Intelligence may be the sharpest tool in humanity’s digital shed, but behind its sleek interface lies a growing climate conundrum. From helping us choose our next binge-worthy show to whispering sweet nothings as a virtual romantic partner, AI is rapidly becoming an inseparable part of everyday life. But what powers this “magic” comes with a carbon footprint big enough to leave scorch marks on the planet.
A new investigation by MIT Technology Review has pulled back the curtain on the escalating environmental cost of AI—and the findings are as alarming as they are eye-opening. Experts now suggest that what we’re seeing today might just be the calm before a very energy-hungry storm.
The Energy Avalanche You Didn't See Coming
For every chatbot reply or AI-generated painting, there’s a surge of electricity flowing through data centres that rarely sleep. According to Professor Sajjad Moazeni, asking ChatGPT a single question may use 10 to 100 times more energy than sending an email. Multiply that by the billion messages ChatGPT receives daily, and you begin to understand the scale.

And it doesn’t stop there. OpenAI’s Sam Altman admitted that even the “politeness” in our prompts costs tens of millions of dollars in operational expenses. AI systems like Google's Gemini and image generators that churn out 78 million images a day aren’t just consuming bandwidth—they’re devouring energy.
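To see what those multipliers imply at scale, here is a rough back-of-envelope sketch in Python. The per-email energy figure is an assumption chosen only for illustration; the 10-to-100-times range and the billion daily messages are the figures cited above.

```python
# Back-of-envelope estimate of daily energy for ChatGPT-scale traffic.
# Assumptions (illustrative, not measurements):
#   - sending one email uses roughly 0.3 Wh (assumed baseline)
#   - one chatbot query uses 10x to 100x that, per the comparison above
#   - the service handles about 1 billion messages per day, as cited above

EMAIL_WH = 0.3                      # assumed energy per email, watt-hours
QUERIES_PER_DAY = 1_000_000_000     # daily message volume cited above

low_wh_per_query = EMAIL_WH * 10    # lower bound of the quoted range
high_wh_per_query = EMAIL_WH * 100  # upper bound of the quoted range

low_gwh = low_wh_per_query * QUERIES_PER_DAY / 1e9    # Wh -> GWh
high_gwh = high_wh_per_query * QUERIES_PER_DAY / 1e9

print(f"Estimated daily draw: {low_gwh:.0f}-{high_gwh:.0f} GWh")
# With these assumptions, chat queries alone land in the 3-30 GWh/day range.
```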
The MIT report reveals that by 2028, over half of all electricity used by data centres could go directly into powering AI. That translates to between 165 and 326 terawatt-hours annually—more electricity than all U.S. data centres currently consume for everything they do. It’s also enough to power nearly a quarter of all American homes.
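That household comparison can be sanity-checked with two assumed averages that are not in the report: roughly 10,800 kWh of electricity per U.S. home per year and about 131 million households.

```python
# Sanity-check of the "nearly a quarter of American homes" comparison.
# Assumed averages (not from the MIT report):
#   ~10,800 kWh per U.S. home per year, ~131 million U.S. households.

AI_TWH_LOW, AI_TWH_HIGH = 165, 326   # projected annual AI electricity use (TWh)
KWH_PER_HOME_YEAR = 10_800           # assumed average household consumption
US_HOUSEHOLDS = 131_000_000          # assumed number of households

def homes_powered(twh):
    """How many average homes a year of this much electricity could supply."""
    return twh * 1e9 / KWH_PER_HOME_YEAR   # 1 TWh = 1e9 kWh

for twh in (AI_TWH_LOW, AI_TWH_HIGH):
    homes = homes_powered(twh)
    print(f"{twh} TWh ~ {homes / 1e6:.0f} million homes "
          f"({homes / US_HOUSEHOLDS:.0%} of U.S. households)")
# The upper bound works out to roughly 30 million homes, in the same
# ballpark as the "nearly a quarter" figure quoted above.
```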
To put that range in an even wilder perspective: this energy use would emit as much carbon as driving a car on 1,600 round trips between Earth and the Sun. It's a statistic so surreal it almost feels fictional—except it’s not.
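The driving comparison can be roughly reproduced the same way, again with assumed conversion factors that do not come from the report: a grid intensity of about 0.37 kg of CO2 per kWh and an average petrol car at about 0.4 kg of CO2 per mile.

```python
# Rough reproduction of the Earth-to-Sun driving comparison.
# Assumed conversion factors (illustrative, not from the report):
#   grid intensity ~0.37 kg CO2/kWh, average car ~0.4 kg CO2/mile,
#   Earth-Sun distance ~93 million miles.

AI_TWH_HIGH = 326                # upper end of the projected annual range
GRID_KG_CO2_PER_KWH = 0.37       # assumed U.S. grid carbon intensity
CAR_KG_CO2_PER_MILE = 0.4        # assumed average car emissions
EARTH_SUN_MILES = 93_000_000     # one-way distance

emissions_kg = AI_TWH_HIGH * 1e9 * GRID_KG_CO2_PER_KWH   # 1 TWh = 1e9 kWh
equivalent_miles = emissions_kg / CAR_KG_CO2_PER_MILE
round_trips = equivalent_miles / (2 * EARTH_SUN_MILES)

print(f"~{emissions_kg / 1e9:.0f} Mt of CO2, "
      f"~{round_trips:.0f} Earth-Sun round trips by car")
# With these assumptions: roughly 120 Mt of CO2, about 1,600 round trips.
```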
Data Centres: The Silent Gas Guzzlers of the Digital World
AI infrastructure isn’t just greedy—it’s relentless. “AI data centres need constant power, 24-7, 365 days a year,” said Rahul Mewawalla, CEO of Mawson Infrastructure Group. And despite the optimism around renewables, the bulk of that power still comes from fossil fuels. As AI adoption accelerates, so does the dependency on energy grids that are far from green.

This has led to serious concerns from environmentalists. “It’s not clear to us that the benefits of these data centres outweigh these costs,” said Eliza Martin of Harvard’s Environmental and Energy Law Program. “Why should we be paying for this infrastructure? Why should we be paying for their power bills?”
A Future Too Hot to Handle?
The AI revolution is pushing boundaries, but it’s also pushing climate scientists to the brink of panic. With global warming already spinning out of control, the sudden explosion of energy-intensive AI tools adds a new layer to an already urgent crisis.

If the trajectory continues unchecked, this hidden environmental tax may soon become too heavy to ignore. While AI may promise smarter futures, the question remains: at what cost?
And perhaps more pressingly—is today’s AI energy footprint the smallest it will ever be? If so, the future could be brighter for tech but bleaker for the planet.