Cornell-led researchers uncover the growing environmental impact of AI data centers, comparing their emissions to adding millions of cars on U.S. roads. The study offers a roadmap for substantial reductions in carbon and water usage.
As artificial intelligence rapidly integrates into daily life, the computing infrastructure required to support AI has grown exponentially. This surge has fueled increasing energy demands and environmental concerns, particularly regarding the power consumption and water usage of large data centers.
A new study led by researchers at Cornell University provides a comprehensive state-by-state analysis of the environmental impact of AI data centers. Using advanced data analytics and AI, the team projects that, on its current growth trajectory, AI infrastructure will emit 24 to 44 million metric tons of carbon dioxide annually by 2030, the equivalent of adding 5 to 10 million cars to U.S. roadways.
Additionally, these data centers could consume 731 to 1,125 million cubic meters of water each year — comparable to the annual household water usage of 6 to 10 million Americans.
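For readers who want to see how such equivalences are typically derived, the following is a rough back-of-envelope check, not part of the study itself. It assumes the EPA's commonly cited figure of about 4.6 metric tons of CO2 per passenger car per year and roughly 110 cubic meters of household water per American per year (about 80 gallons per day); both reference values are assumptions introduced here for illustration.

```python
# Rough back-of-envelope check of the article's equivalences.
# Assumed reference values (not from the study):
#   ~4.6 metric tons of CO2 per passenger car per year (EPA estimate)
#   ~110 cubic meters of household water per person per year (~80 gal/day)

CO2_PER_CAR_T = 4.6          # metric tons CO2 per car per year (assumed)
WATER_PER_PERSON_M3 = 110.0  # cubic meters per person per year (assumed)

co2_range_t = (24e6, 44e6)         # projected annual emissions, metric tons
water_range_m3 = (731e6, 1125e6)   # projected annual water use, cubic meters

cars = [t / CO2_PER_CAR_T for t in co2_range_t]
people = [v / WATER_PER_PERSON_M3 for v in water_range_m3]

print(f"Car equivalent: {cars[0]/1e6:.1f} to {cars[1]/1e6:.1f} million cars")
print(f"Water equivalent: {people[0]/1e6:.1f} to {people[1]/1e6:.1f} million people")
# Prints roughly 5.2 to 9.6 million cars and 6.6 to 10.2 million people,
# consistent with the ranges reported above.
```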
The sobering findings, published today in the journal Nature Sustainability, underline the urgency of addressing the environmental footprint of the burgeoning AI sector.
“Artificial intelligence is changing every sector of society, but its rapid growth comes with a real footprint in energy, water and carbon,” Fengqi You, the Roxanne E. and Michael J. Zak Professor in Energy Systems Engineering in Cornell Engineering and the project’s lead, said in a news release. “Our study is built to answer a simple question: Given the magnitude of the AI computing boom, what environmental trajectory will it take? And more importantly, what choices steer it toward sustainability?”
The research team, including first author Tianqi Xiao, a doctoral student in You’s research group, compiled three years’ worth of multi-dimensional data, including financial, marketing and manufacturing records. This data was integrated with location-specific information on power systems and resource consumption to project the environmental impacts of AI infrastructure growth.
“There’s a lot of data, and that’s a huge effort. Sustainability information, like energy, water, climate, tend to be open and public. But industrial data is hard, because not every company is reporting everything,” You said in the news release.
The team employed AI to fill gaps in the reported industrial data, enabling a more comprehensive analysis.
But the study goes beyond identifying problems. It presents an actionable roadmap for significantly reducing the environmental impact of AI data centers. Strategies include smarter siting, faster grid decarbonization and improved operational efficiency. Implementing these measures could cut carbon emissions by approximately 73% and water usage by about 86%.
One critical factor in reducing environmental impact is the location of data centers. Many existing hubs are situated in water-scarce regions such as Nevada and Arizona, exacerbating local resource strain. Relocating facilities to areas with lower water stress, coupled with enhanced cooling efficiencies, could reduce water usage by around 52%.
The optimal regions for both carbon and water efficiency, according to the study, are in the Midwest and so-called “windbelt” states such as Texas, Montana, Nebraska and South Dakota.
“New York state remains a low-carbon, climate-friendly option thanks to its clean electricity mix of nuclear, hydropower and growing renewables, although prioritizing water-efficient cooling and additional clean power is key,” You added.
The study also stresses the need to accelerate grid decarbonization to keep pace with AI demand; without substantial improvements, carbon emissions could rise by roughly 20%.
“Even if each kilowatt-hour gets cleaner, total emissions can rise if AI demand grows faster than the grid decarbonizes,” added You. “The solution is to accelerate the clean-energy transition in the same places where AI computing is expanding.”
Moreover, deploying advanced technologies like liquid cooling and enhanced server utilization could further reduce carbon dioxide emissions by 7% and water usage by 29%.
As industry giants like OpenAI and Google continue to invest in AI data centers, this research highlights a crucial moment for coordinated planning between industry stakeholders, utility providers and regulators. The choices made now will determine whether AI serves as a tool for climate progress or an additional environmental burden.
“This is the build-out moment,” You concluded. “The AI infrastructure choices we make this decade will decide whether AI accelerates climate progress or becomes a new environmental burden.”
Co-authors of the study include researchers from the KTH Royal Institute of Technology in Stockholm, Concordia University in Montreal, and the RFF-CMCC European Institute on Economics and the Environment in Milan.
Source: Cornell University

