
The Carbon Cost of Intelligence: AI's Environmental Footprint

Data center electricity demand is projected to more than double by 2030. The environmental cost of AI has moved from a niche concern to a defining sustainability challenge.

ResponsibleAI Labs
March 23, 2026
9 min read

AI systems may already have a carbon footprint equivalent to that of New York City and a water footprint approaching the world's total annual consumption of bottled water. With data center electricity demand projected to double by 2030, the environmental cost of artificial intelligence has moved from a niche concern to a defining sustainability challenge. This article examines the latest data on AI's energy, carbon, and water impacts - and the roadmap for making AI sustainable.


The numbers behind the boom

The explosive growth of generative AI has triggered an infrastructure buildout unprecedented in the history of computing. OpenAI and the Trump administration announced the Stargate initiative, aiming to spend $500 billion on up to 10 data centers. Apple pledged $500 billion in US manufacturing and data center investment over four years. Google expects to spend $75 billion on AI infrastructure in 2025 alone.

This isn't simply the normal growth of a digital economy. From 2005 to 2017, data center electricity consumption remained remarkably flat despite the rise of cloud computing, social media, and streaming services - thanks to efficiency gains that kept pace with demand. AI has shattered that equilibrium. The International Energy Agency estimates that global data center electricity demand will more than double by 2030, reaching approximately 945 terawatt-hours - slightly more than Japan's total consumption and enough to make data centers collectively the fifth-largest electricity consumer on the planet.


Figure 1: Key statistics on AI's energy, carbon, and water footprint, plus the scale of the infrastructure buildout and regional hotspots.


Carbon: the climate cost

Quantifying AI's exact carbon emissions is complicated by a fundamental transparency problem: no major tech company reports AI-specific environmental metrics. A December 2025 study by Alex de Vries-Gao of VU Amsterdam, published via ScienceDirect, estimated that AI systems' carbon footprint in 2025 fell between 32.6 and 79.7 million tons of CO₂ - at the lower end, comparable to the total annual emissions of Norway.

A parallel study published in Nature Sustainability in November 2025 projected that US AI server deployment alone could generate 24 to 44 million tons of CO₂-equivalent annually between 2024 and 2030, depending on the scale of expansion and grid decarbonization rates. The researchers concluded that the AI server industry is "unlikely to meet its net-zero aspirations by 2030 without substantial reliance on highly uncertain carbon offset and water restoration mechanisms."

The energy math is sobering. According to Goldman Sachs Research, approximately 60% of the increasing electricity demand from data centers will be met by burning fossil fuels, potentially adding roughly 220 million tons to global carbon emissions. Even as individual kilowatt-hours get cleaner, total emissions can rise if AI demand grows faster than the grid decarbonizes - a dynamic Cornell researchers call the "rebound problem."
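The arithmetic behind that figure can be sketched as a back-of-envelope check. The inputs below are illustrative assumptions, not values from the Goldman Sachs report: the added demand is inferred from the IEA's "roughly doubling to 945 TWh" projection, and the emission factor is a hypothetical coal/gas blend.

```python
# Back-of-envelope estimate of added emissions from new data center demand.
# Inputs are illustrative assumptions inferred from the projections cited
# in the text, not figures taken from the Goldman Sachs report itself.

added_demand_twh = 945 / 2           # TWh of new demand if consumption roughly doubles
fossil_share = 0.60                  # share of new demand met by fossil fuels
emission_factor_kg_per_kwh = 0.78    # assumed blended coal/gas carbon intensity

fossil_twh = added_demand_twh * fossil_share
# Convenient identity: TWh × (kg CO2 / kWh) = million tons (Mt) of CO2,
# because 1 TWh = 1e9 kWh and 1 Mt = 1e9 kg.
added_mt_co2 = fossil_twh * emission_factor_kg_per_kwh
print(f"{added_mt_co2:.0f} Mt CO2")  # lands near the ~220 Mt cited above
```

Under these assumptions the result comes out around 221 Mt, in the same range as the roughly 220 million tons cited above, which shows the estimate is consistent with plausible grid assumptions.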

And the emissions story doesn't stop at electricity. "Embodied carbon" - the emissions from building data centers, manufacturing GPUs, and mining raw materials - is substantial but often overlooked. Data centers are enormous facilities, 10 to 50 times more energy-intensive per square foot than typical commercial buildings, constructed from tons of steel and concrete. GPU fabrication involves complex semiconductor manufacturing processes with their own significant carbon and chemical footprint.


Water: the hidden cost

Less visible but equally significant is AI's water footprint. Data centers consume enormous quantities of water, both directly (for cooling servers) and indirectly (in the electricity generation that powers them). The VU Amsterdam study estimated that AI's water footprint in 2025 could range from 312.5 to 764.6 billion liters - a volume comparable to the world's entire annual consumption of bottled water.

Individual facilities can use staggering amounts. Some data centers consume up to 5 million gallons per day - equivalent to a small town's total daily water use. The problem is compounded by siting decisions: many current data center clusters are being built in water-stressed regions like Nevada and Arizona. In northern Virginia, where data centers already consume 26% of the state's electricity, rapid clustering is straining local water infrastructure.


Regional hotspots

AI's environmental impact is not evenly distributed. It is concentrated in regions where data centers cluster, creating localized stress on electricity grids and water supplies.

In Dublin, data centers already consume approximately 79% of the city's electricity, according to analysis by the Oeko-Institute. The IEA estimates Ireland's nationwide data center electricity share could rise to 32% by 2026. In the United States, Virginia is the densest data center market in the world, with these facilities consuming 26% of state electricity. Roughly half of all US and Japanese power demand growth over the next five years is expected to come from data centers.

These concentrations create cascading effects. As MIT Technology Review documented, Elon Musk's X supercomputing center near Memphis was found to be using dozens of methane gas generators - allegedly without regulatory approval - to supplement grid power. When demand outpaces supply, even companies with clean-energy commitments resort to fossil fuels.


The per-query perspective

Amid the macro-level alarm, it's worth noting that efficiency gains at the individual query level have been dramatic. Google reports a 33-fold reduction in energy and 44-fold reduction in carbon for the median AI prompt between 2024 and 2025. A single Gemini-class query now uses approximately 0.24 watt-hours of energy and 0.26 milliliters of water - comparable to watching about nine seconds of television.

But scale matters. Billions of daily prompts, combined with training runs, hardware manufacturing, and end-of-life processes, aggregate into the system-level impacts described above. And as AI is integrated into every corner of digital life - search, email, document editing, customer service - users increasingly consume AI resources without consciously choosing to do so.
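A rough sense of how per-prompt figures aggregate to system scale, using the 0.24 Wh figure from above and a hypothetical volume of one billion prompts per day (an illustrative round number, not a reported statistic):

```python
# Illustrative aggregation of per-query energy to fleet scale.
# The 0.24 Wh per prompt comes from the article; the daily prompt
# volume is a hypothetical round number, not a reported statistic.

wh_per_prompt = 0.24
prompts_per_day = 1_000_000_000      # hypothetical: one billion prompts/day

daily_mwh = wh_per_prompt * prompts_per_day / 1e6   # Wh → MWh
annual_gwh = daily_mwh * 365 / 1e3                  # MWh/day → GWh/year
print(f"{daily_mwh:.0f} MWh/day, {annual_gwh:.1f} GWh/year")
```

Even at these modest per-query numbers, a billion daily prompts would sum to roughly 88 GWh per year, and that is inference alone, before training runs, hardware manufacturing, and cooling overhead are counted.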


The roadmap to sustainable AI

A November 2025 Nature Sustainability study led by Cornell University's Fengqi You outlined an actionable roadmap showing that coordinated mitigation strategies could reduce AI's carbon emissions by approximately 73% and water use by approximately 86% compared to worst-case scenarios.

Roadmap to sustainable AI: mitigation strategies and their impact

Figure 2: Three pillars of AI sustainability - smart siting, grid decarbonization, and operational efficiency - and their combined impact potential.

Smart siting: location as the first lever

By far the most impactful single factor is where data centers are built. Locating facilities in regions with low water stress and clean electricity grids could slash water demands by about 52% on its own. The Cornell study identified the US Midwest and "windbelt" states - Texas, Montana, Nebraska, and South Dakota - as offering the best combined carbon-and-water profile. New York State's clean electricity mix (nuclear, hydropower, and growing renewables) also makes it a favorable option.

Conversely, continuing to cluster data centers in water-scarce, fossil-fuel-dependent regions would lock in decades of environmental harm. As Professor You put it: "This is the build-out moment. The AI infrastructure choices we make this decade will decide whether AI accelerates climate progress or becomes a new environmental burden."

Grid decarbonization: matching clean energy to AI demand

Even optimal siting can only do so much if the underlying electricity grid remains carbon-intensive. In the Cornell study's ambitious high-renewables scenario, grid decarbonization alone reduced CO₂ by approximately 15% compared to baseline - but roughly 11 million tons of residual emissions remained, requiring an additional 28 gigawatts of wind or 43 gigawatts of solar capacity to reach net zero.
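The 28 GW of wind and 43 GW of solar are roughly interchangeable because the two technologies have different capacity factors. A minimal sketch of the arithmetic, assuming typical US capacity factors (the study's own values may differ):

```python
# Annual generation = nameplate capacity × hours/year × capacity factor.
# The capacity factors below are typical US values assumed for
# illustration; the Cornell study may use different inputs.

HOURS_PER_YEAR = 8760

def annual_twh(capacity_gw: float, capacity_factor: float) -> float:
    """Annual generation in TWh for a given nameplate capacity in GW."""
    return capacity_gw * HOURS_PER_YEAR * capacity_factor / 1e3  # GWh → TWh

wind_twh = annual_twh(28, 0.35)   # ~35% CF for onshore wind
solar_twh = annual_twh(43, 0.23)  # ~23% CF for utility-scale solar
print(f"wind: {wind_twh:.1f} TWh, solar: {solar_twh:.1f} TWh")
```

Both options deliver on the order of 86 TWh per year under these assumed capacity factors, which is why the lower-capacity wind buildout and the larger solar buildout appear as alternatives for the same residual-emissions gap.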

Tech companies are investing heavily in clean energy. By the end of 2022, US technology companies had already contracted over 35 gigawatts of renewable electricity through power purchase agreements. Microsoft has signed a 10 GW renewable energy deal with Brookfield and a 0.8 GW nuclear deal with Constellation Energy. But the expansion of renewable generation is not yet keeping pace with AI's demand growth - a gap that must close for climate goals to be met.

Operational efficiency: doing more with less

Technical improvements within data centers offer additional gains. Advanced liquid cooling can reduce water use by 29%. Improved server utilization, workload scheduling (shifting computation to times when the grid is cleanest), and more efficient AI models all contribute. Google's 33-fold per-prompt efficiency improvement between 2024 and 2025 demonstrates what is technically possible.

At the model level, techniques like knowledge distillation, quantization, and architecture search can dramatically reduce the computational cost of AI inference. As MIT's Neil Thompson has argued, "Making these models more efficient is the single most important thing you can do to reduce the environmental costs of AI."

Combined, siting, grid decarbonization, and operational efficiency can achieve approximately 73% carbon reduction and 86% water reduction - but only if pursued simultaneously and at scale.


The transparency gap

A consistent theme across the research is the lack of transparency from the AI industry. The VU Amsterdam study examined environmental reports from nine major tech companies and found that no company reports AI-specific environmental metrics, despite several acknowledging AI as a key driver of increased energy consumption. Without this data, researchers must rely on approximations, and regulators cannot effectively oversee environmental compliance.

The study called for new policies mandating disclosure of AI-specific metrics, including the locations where AI systems operate, the scale of operations at each site, and water usage effectiveness values for individual facilities. Without such transparency, the environmental impact of AI will remain, in the words of one MIT researcher, "largely hidden from public view."


What comes next

The environmental sustainability of AI is not a solved problem, but it is a solvable one. The research points to several priorities:

Mandate environmental disclosure. Regulators should require AI companies to report AI-specific energy consumption, carbon emissions, and water use - disaggregated by facility and workload type. The EU AI Act's environmental provisions offer a starting point.

Integrate environmental criteria into siting decisions. Planning authorities should factor water stress, grid carbon intensity, and renewable energy availability into data center permitting processes, rather than allowing market forces alone to determine location.

Accelerate clean energy deployment. The renewable energy transition must keep pace with AI demand growth. This means not only contracting new capacity but ensuring it is built and grid-connected on the timelines that AI buildout demands.

Invest in algorithmic efficiency. Research funding should prioritize making AI models more computationally efficient - reducing both training and inference costs. The 33-fold improvement Google achieved in one year shows the potential.

Protect communities. The environmental and economic costs of data centers - from water consumption to electricity price increases - fall disproportionately on local communities. Equity must be embedded in every stage of planning and permitting.


Conclusion

The AI industry stands at a crossroads. The infrastructure decisions being made right now - where to build, how to power, and how to cool the next generation of data centers - will determine whether artificial intelligence accelerates the transition to a sustainable economy or becomes a significant new burden on the planet's climate and water resources.

The roadmap exists. Smart siting, grid decarbonization, and operational efficiency can together cut AI's environmental footprint by roughly three-quarters. But achieving this requires the same urgency, coordination, and accountability that the industry brings to building the AI systems themselves. The carbon cost of intelligence is real - but it is not yet inevitable.


References

  1. IEA (2025). Global Energy Review 2025.

  2. Nature Sustainability (2025). "Environmental impact and net-zero pathways for sustainable AI servers in the USA." Nov 10.

  3. de Vries-Gao, A. (2025). "The carbon and water footprints of data centers and what this could mean for AI." ScienceDirect, Dec 17.

  4. Cornell Chronicle (2025). "'Roadmap' shows the environmental impact of AI data center boom." Nov.

  5. MIT News (2025). "Explained: Generative AI's environmental impact." Jan 17.

  6. MIT News (2025). "Responding to the climate impact of generative AI." Sept 30.

  7. MIT Technology Review (2025). "We did the math on AI's energy footprint." May 20.

  8. Carbon Brief (2025). "AI: Five charts that put data-centre energy use – and emissions – into context." Sept.

  9. Euronews (2025). "AI data centres could have a carbon footprint that matches small European country." Dec 20.

  10. Carbon Direct (2025). "Understanding the carbon footprint of AI and how to reduce it."

  11. Online Learning Consortium (2025). "The Real Environmental Footprint of Generative AI." Dec 4.


This article is part of ResponsibleAI Labs' 2026 series on emerging AI ethics and risk. For more, visit responsibleailabs.com.
