The rapid expansion of artificial intelligence workloads is driving high demand for electricity, water, and land to build and run data centers [1, 2, 3].

This surge in infrastructure needs threatens to outpace both the capacity of power grids and the availability of natural resources. As AI models grow in complexity, the physical requirements to sustain them create a conflict between technological progress and environmental sustainability.

Training and operating large AI models demands massive compute power, which in turn requires large volumes of cooling water and substantial physical space [3, 4]. According to a CNN analysis, AI's rapid growth is colliding headlong with a finite amount of available energy and computing power [5].

In Australia, the impact is visible through the growth of companies like NextDC. Craig Scroggie spent 15 years building NextDC into one of the largest data-center companies in the country [6]. The scale of these facilities is immense, with some AI data centers drawing up to 200 megawatts of power each [5].
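To put the 200-megawatt figure in perspective, a rough back-of-envelope calculation converts that power draw into annual energy consumption. This sketch assumes, purely for illustration, that a facility runs at its full 200 MW around the clock; real utilization varies.

```python
# Back-of-envelope: annual energy for a facility drawing 200 MW continuously.
# The 200 MW figure comes from the article [5]; continuous full-load draw
# is an illustrative assumption, not a reported operating profile.
POWER_MW = 200
HOURS_PER_YEAR = 24 * 365  # 8,760 hours (ignoring leap years)

annual_energy_mwh = POWER_MW * HOURS_PER_YEAR
print(f"{annual_energy_mwh:,} MWh per year")  # 1,752,000 MWh per year
```

Even at partial utilization, a single such facility consumes energy on the scale of a small city, which is why grid capacity features so prominently in the debate.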

Resource consumption remains a point of contention among industry leaders and observers. A CBS News report said, "Data centers used to store, train and operate AI models use significant amounts of energy and water" [7]. The International Energy Agency reports that a typical AI data center uses about 200 MWh of electricity per year [7].

However, views on the severity of these impacts vary. While some reports argue the environmental cost is becoming severe enough to warrant scaling back AI deployment [8], others propose technological solutions. Josh Pearce said, "Agrivoltaics can cover energy for data centers while still producing food" [9]. This approach suggests that energy production and land use can be integrated to mitigate the footprint of the AI boom.

Despite these potential solutions, the tension between compute needs and resource limits persists. The current boom, spanning 2024 to 2026, has accelerated the timeline for infrastructure upgrades across the globe [5, 8, 6].


The conflict between AI's compute requirements and planetary boundaries indicates a looming ceiling for the current scaling phase of large language models. If energy and water constraints cannot be solved through innovations like agrivoltaics or more efficient cooling, the industry may face a forced slowdown or a shift toward decentralized, low-power computing architectures.