* Google is backing the construction of seven small nuclear reactors in the US.
* Amazon has signed a string of deals for nuclear energy to power its data centres.
* Microsoft has signed a power deal to bring a reactor at Pennsylvania’s Three Mile Island back online (the plant’s other unit was shuttered following a partial meltdown in 1979; the one being revived was closed for economic reasons in 2019).
Why is Big Tech turning to power generation, and what are the lessons for India?
Artificial intelligence looks weightless because it travels by fibre and appears “wirelessly” on screens. In reality, it is an industrial process that converts electricity into prediction (and heat).
Training large AI models such as OpenAI’s ChatGPT and Anthropic’s Claude is energy-intensive, but inference, the serving of billions of queries and searches, is far hungrier in the aggregate.
As AI becomes pervasive in search, business software, call centres and smartphones, its “always on” nature is emerging as the most pressing challenge. As AI scales, the limiting factor may no longer be data, code or chips but something far older: electricity, and the question of how it can be delivered reliably, at scale.
According to the International Energy Agency (IEA), data centres used roughly 415 terawatt-hours of electricity in 2024, about 1.5% of global demand. This could more than double to 945 TWh by 2030. AI is the fastest-growing slice of total data-centre load.
McKinsey estimates that by 2030, an investment of $5.2 trillion will be needed for AI data centres, of which 25%, or $1.3 trillion, will be for electricity generation, transmission, cooling, and electrical equipment.
For the massive, gigawatt-scale data centres that tech giants are now looking to build, reliable energy is not a background cost; it can be a core competitive advantage.
Where electricity is scarce or slow to arrive, national AI ambitions can stall. In Ireland, for instance, data centres already consume more than a fifth of all electricity generated, and grid operators, faced with reliability risks, have effectively paused new data-centre connections around Dublin.
Across Europe, grid congestion and permit delays are quietly redrawing the map of digital infrastructure. In the traditional hubs of Frankfurt, London, Amsterdam and Paris, connecting a large new data centre can now take seven years. The result is a migration of investment toward regions with spare capacity: southern Europe, the Nordics, or anywhere that electrons can be delivered faster than permits.
POWER POINTS
America’s version of the problem is less national and more local.
Northern Virginia, the world’s densest data-centre cluster, now channels a significant share of the state’s electricity to server farms. According to the US Department of Energy, data centres consumed about 4.4% of American electricity in 2023, and that could double or triple by 2028. In practical terms, AI demand is arriving faster than transmission lines and substations can be built.
These bottlenecks expose a mismatch between the digital and physical worlds. Digital infrastructure can be erected in months; electrical infrastructure moves on a decadal clock. Transformers, switchgear and high-voltage lines have to be manufactured, approved and installed on an industrial scale.
This underpins the sudden corporate embrace of nuclear energy.
OIL IS THE NEW OIL
Intermittent renewables are cheap and clean, but on their own they cannot meet the round-the-clock demands of AI workloads. A model-training run cannot pause when solar supply dips; a global inference service cannot flicker with the wind. Hence the return of nuclear power to corporate boardrooms.
Microsoft’s deal at Three Mile Island and Google’s support for small nuclear reactors are hedges against grid risk. Nuclear power is seen as offering what AI demands: steady output, low carbon intensity and predictable costs over decades.
If Ireland is the cautionary tale, the Middle East is the counter-example.
Energy-rich states are discovering that electricity abundance can be parlayed directly into global AI relevance. Saudi Arabia, the UAE and Qatar are using cheap, reliable power, backed by sovereign balance sheets, to position themselves as AI hubs. Saudi Arabia plans to build multiple gigawatts of data-centre capacity by the early 2030s, underwritten by sovereign capital and long-term power contracts, and executed by state-backed AI companies such as Humain. The UAE is pairing hyperscale AI campuses with gas, solar and nuclear supply to ensure continuous power.
LOCKED IN
This logic increasingly resembles old industrial geography. Aluminium smelters went where power was cheap. AI infrastructure will likely do the same.
When electricity prices differ by a factor of two or three between regions, data centres will flock to places with inexpensive power. Over time, this “compute arbitrage” will matter as much as chip access or the availability of talent. Sovereign wealth funds already finance data centres the way they once financed refineries: capital-intensive assets justified by a long-term edge in energy supplies.
India sits at this crossroads. The country’s data-centre market is scaling up from a small base into a full-fledged infrastructure build-out. Installed data-centre capacity currently stands at about 1.4 gigawatts, with projections of more than five-fold growth, to 8 gigawatts, by 2030. This expansion is being driven by the hyperscalers (large cloud-computing companies), domestic conglomerates and a swelling digital economy, plus a new pressure: the national ambition for sovereign AI.
India’s data-centre electricity demand is projected to rise sharply alongside capacity, reaching tens of terawatt-hours by 2030, still modest as a share of national consumption but concentrated in specific states and cities. Announcements over the past two years point to a rapid scaling of data-centre capacity across Mumbai, Chennai, Hyderabad, Bengaluru and the National Capital Region (NCR).
The success of India’s AI push will hinge as much on developing world-class AI models locally as on delivering gigawatts of reliable, affordable power without destabilising the grid.
As the IEA points out, data centres are set to be one of the fastest-growing sources of electricity demand this decade. Leadership in artificial intelligence will accrue not only to those with the best models and chips, but to those who can align grid planning, renewable build-out, nuclear options and cooling standards with data-centre siting and AI roadmaps, and who can finance, build and operate gigawatt-scale infrastructure at acceptable cost.
The pursuit of machine intelligence, it turns out, leads back to one of humanity’s oldest constraints: how to keep the lights on.
(Kashyap Kompella is a tech industry analyst and author of three books on AI)
