Abstract
The rapid global adoption of advanced artificial intelligence (AI) and machine learning (ML) systems is catalyzing a new wave of digital infrastructure expansion. While much attention has focused on the capabilities and societal impacts of AI, comparatively less scrutiny has been given to the material infrastructure—specifically, high-performance data centers—that enables its growth. This article investigates whether increased AI/ML deployment is driving significant increases in energy and water consumption worldwide. Findings indicate that AI-specific workloads now constitute a rapidly growing share of total data center operations, with training and inference of large models contributing disproportionately to electricity demand and cooling requirements. In tandem, water usage for data center cooling—often overlooked—has grown substantially, especially in regions already facing water stress. The paper concludes with implications for theory, engineering practice, and policy, advocating for mandatory resource transparency, efficiency regulation, and lifecycle accountability in AI development. It also identifies research gaps in regional modeling, AI workload differentiation, and cross-sector environmental assessments.
