Why Do Data Centres Need So Much Power? And How India Is Rising to Meet the Challenge

Feb 26, 2026

The hidden physics behind AI, cloud computing, and the massive energy infrastructure that powers them — explained for everyone

In October 2025, Google announced a $15 billion investment to build Asia’s first gigawatt-scale AI hub in Visakhapatnam, Andhra Pradesh — a landmark moment for India’s digital ambitions. The facility, when operational, will place Vizag alongside the world’s most important AI infrastructure nodes.

But a natural question arises: why does an AI data centre need so much power? Why can't we just use normal office electricity?

The answer traces back to two concepts from computer chip design: Dennard Scaling and the Power Wall. You don’t need an engineering degree to understand them. And once you do, you’ll see why building energy infrastructure for data centres is one of the most important economic challenges of the 2020s — and why India’s proactive approach to meeting it is exactly the right strategy.

• • •

The Golden Era: When Chips Got Faster for Free

From roughly the 1970s to the early 2000s, the computer industry enjoyed an extraordinary free lunch. Every couple of years, engineers figured out how to make the tiny switches on a computer chip (called transistors) smaller. And here’s the magical part: when you made a transistor smaller, it didn’t just fit more of them on a chip — it also used less power per transistor.

This observation was formalised in 1974 by a researcher named Robert Dennard, and it became known as Dennard Scaling:

As transistors shrink, the voltage and current they need shrink proportionally. So even though you pack more transistors onto a chip, the total power consumption stays roughly the same.

Imagine you run a restaurant kitchen. Dennard Scaling is like discovering that every time you hire more cooks (transistors), each cook needs proportionally less food (power) to work. So your grocery bill stays flat even though you’re cooking twice as many dishes. More output, same energy bill.
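For readers who want the equation behind the free lunch, here is the standard textbook sketch (the dynamic-power relation is general CMOS background, not something from this article):

```latex
% Dynamic power of a switching chip:
P_{\text{dyn}} = \alpha \, C \, V^{2} f
% Under Dennard scaling, shrink every dimension and the supply voltage
% by a factor k < 1:
%   C \to kC, \qquad V \to kV, \qquad f \to f/k
P_{\text{dyn}} \to (kC)(kV)^{2}(f/k) = k^{2} \, C V^{2} f
% Transistor density rises by 1/k^{2}, so power per unit area is unchanged:
%   \tfrac{1}{k^{2}} \cdot k^{2} \, C V^{2} f = C V^{2} f
```

Each shrink cut power per transistor by k², exactly offsetting the k² extra transistors that now fit in the same area, which is why total chip power stayed flat.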

Diagram 1 · Dennard Scaling — The Golden Era (1974–2005)
[Infographic: four shrink steps take a chip from 1,000 transistors to 1M+, while total power consumption stays at ~10 W each time. More transistors at the same power meant free performance: Dennard Scaling kept the power bill flat for 30 years.]

For three decades, this was the engine of the computing revolution. Your laptop today is millions of times more powerful than the computers that sent astronauts to the moon, and for most of that journey, each leap in performance came without a proportional leap in electricity bills.

• • •

The Wall: When the Free Lunch Ended

Around 2005–2006, Dennard Scaling broke down. Transistors had become so tiny — just a few dozen atoms wide — that electrical current began to leak through the transistor walls, even when the transistor was supposed to be “off.” Think of it like a water tap you can never fully close: no matter how hard you turn the handle, it still drips.

This leakage meant that shrinking transistors no longer reduced their power consumption. Every additional transistor now added to the power bill. This breakdown is what engineers call the Power Wall.

Diagram 2 · The Power Wall — Clock Speeds Hit a Ceiling
[Chart: clock speed versus year, 1990–2025. Speeds climb steadily through the Dennard Scaling era, hit the Power Wall around 2005, and stay stuck at ~3–5 GHz for the next 20 years.]

Before the Power Wall, chip clock speeds doubled every few years — from 100 MHz to 200, to 500, to 1 GHz, to 3 GHz. Then the climb stopped. Today, twenty years later, most chips still run at 3–5 GHz. The wall held.

• • •

So What Did the Industry Do? It Went Wide.

When you can’t make a single engine faster, you bolt on more engines. Instead of one super-fast processor core, modern chips have thousands of cores running in parallel — this is what GPUs (Graphics Processing Units) do, and they are the workhorses of AI.

This shift solved the performance problem but made the power problem much worse. Since Dennard Scaling no longer helps, every additional chip is an additional power consumer.

Diagram 3 · The Shift: From One Fast Chip to Thousands of Parallel Chips
[Infographic: before the Power Wall, one fast chip (1 core at 3 GHz, ~100 W) was easy to power. After it, thousands of parallel chips (3,571 GPUs at 700 W each, in a 5 MW facility) need dedicated power plants.]
• • •

Inside a 5 MW Data Centre: Where Does All the Power Go?

Let’s make this concrete with a 5 MW AI data centre — a modest facility by global standards. The workhorse GPU chip for AI today is NVIDIA’s H100, drawing about 700 watts each — roughly the power of a large microwave oven running continuously, 24/7/365.

Only about 50% of total facility power goes to compute. The rest supports cooling, storage, networking, and power conversion:

Number of H100 GPUs = 0.5 × 5,000,000 W ÷ 700 W ≈ 3,571

That is 3,571 H100 GPUs, each drawing 700 W around the clock, inside a 5 MW facility.
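This back-of-envelope sizing can be checked in a few lines of Python (a sketch using the article's figures; the variable names are ours):

```python
# Sizing sketch for a 5 MW AI data centre, per the article's assumptions.
FACILITY_POWER_W = 5_000_000   # 5 MW total facility power
COMPUTE_SHARE = 0.50           # ~50% of facility power goes to compute
GPU_POWER_W = 700              # NVIDIA H100 draws roughly 700 W

compute_budget_w = FACILITY_POWER_W * COMPUTE_SHARE
gpu_count = int(compute_budget_w // GPU_POWER_W)

print(f"Compute budget: {compute_budget_w / 1e6:.2f} MW")  # → 2.50 MW
print(f"H100 GPUs supported: {gpu_count}")                 # → 3571
```

The same two-line calculation scales directly: a gigawatt-scale facility like the planned Vizag hub is simply this arithmetic with 200 times the power budget.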
Diagram 4 · Power Consumption Breakdown — 5 MW AI Data Centre
Total: 5 MW, split roughly as follows:

  • Compute (GPUs / CPUs): 2.50 MW (50%), the AI chips doing actual work
  • Cooling systems: 1.25 MW (25%), removing heat from the chips
  • Storage: 0.43 MW (8.5%), hard drives and SSDs
  • Networking: 0.43 MW (8.5%), switches and cables
  • Power conversion + misc: 0.40 MW (8%), voltage conversion, lighting, security

A full quarter of all electricity — 1.25 MW — goes purely to cooling. That’s 1.25 million watts doing nothing computationally useful. It exists solely to keep the chips from overheating. This is the Power Wall made tangible: because Dennard Scaling broke down, every watt of useful computation generates waste heat that demands additional energy to manage.

The PUE Number: Measuring Overhead

Engineers use Power Usage Effectiveness (PUE) to measure data centre overhead. Indian data centres in 2024 range from PUE 1.3 to 1.6. At PUE 1.6, for every watt used by IT equipment, the facility consumes 1.6 watts total — the extra 0.6 watts goes to cooling, power conversion, and support systems.

Diagram 5 · PUE 1.6 — For Every Watt of Compute, You Pay 1.6 Watts Total
[Bar diagram: IT equipment 1.0 W plus overhead 0.6 W, i.e. 62.5% useful work and 37.5% overhead. Total: 1.6 watts consumed for every 1 watt of computation.]
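PUE is just a ratio, and the arithmetic is easy to sketch in Python (the 1.6 figure is the upper end of the 2024 Indian range quoted above; the function name is ours):

```python
# PUE = total facility power / IT equipment power.
# A PUE of exactly 1.0 would mean zero overhead; real facilities are higher.
def pue(total_w: float, it_w: float) -> float:
    """Power Usage Effectiveness of a facility."""
    return total_w / it_w

# At PUE 1.6, a 5 MW facility delivers only 5 / 1.6 MW to IT equipment.
total_mw = 5.0
it_mw = total_mw / 1.6

print(f"IT load: {it_mw:.3f} MW")              # → 3.125 MW
print(f"Overhead: {total_mw - it_mw:.3f} MW")  # → 1.875 MW
print(f"Useful fraction: {1 / 1.6:.1%}")       # → 62.5%
```

Lowering PUE from 1.6 to 1.3 in the same 5 MW facility would free up roughly 0.7 MW for additional compute, which is why operators chase the metric so hard.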
• • •

India’s Proactive Response: Meeting the Power Challenge Head-On

What’s remarkable about India’s approach is that the Government has recognised this challenge early and is building a comprehensive energy ecosystem to support data centre growth — rather than waiting for a crisis to force action.

🔬 Nuclear Energy Mission & Small Modular Reactors

The Union Budget 2025–26 launched the “Nuclear Energy Mission for Viksit Bharat” with an allocation of ₹20,000 crore (~$2.5 billion), targeting at least five indigenously designed Small Modular Reactors (SMRs) operational by 2033 and a long-term goal of 100 GW nuclear capacity by 2047.

India’s Bhabha Atomic Research Centre is developing three SMR variants — the 200 MW Bharat SMR, a 55 MW SMR for remote locations, and a 5 MWt high-temperature gas-cooled reactor. Lead units are planned for Tarapur in Maharashtra and the Vizag campus in Andhra Pradesh. The December 2025 amendment to the Atomic Energy Act now allows private sector participation for the first time — companies like Tata Power and Jindal Group have already expressed interest.

SMRs are particularly relevant for data centres: their modular design means capacity can be added incrementally as demand grows, they provide reliable baseload power regardless of weather, and they can be co-located near data centres — exactly the kind of dedicated, clean power source that gigawatt-scale AI facilities need.

🔋 Battery Energy Storage Systems (BESS)

India’s push for grid-scale battery storage is critical for making renewable energy reliable enough for data centres, which need 24/7 uninterrupted power.

The Government has launched two tranches of Viability Gap Funding: the first for 13.2 GWh with ₹3,760 crore, and a second for 30 GWh with ₹5,400 crore — totalling 43.2 GWh of supported BESS capacity. The National Electricity Plan projects India will need 411 GWh of storage by 2031–32 and 2,380 GWh by 2047.

Lithium-ion battery costs have fallen over 90% since 2010 — from $1,200/kWh to approximately $100/kWh in 2025. This cost reduction, combined with Government VGF support and the new 20% domestic content requirement for BESS projects, is creating a robust domestic ecosystem.

☀️ Renewable Energy at Scale

India’s total installed renewable energy capacity has crossed 250 GW — already exceeding fossil fuel capacity. The Union Budget 2025–26 allocated an additional ₹26,549 crore to the Ministry of New and Renewable Energy, a significant jump from the previous year’s ₹19,100 crore.

For the Visakhapatnam AI hub specifically, Google has committed $2 billion of its total investment to renewable energy development — integrating clean energy directly into the facility’s design. This approach — coupling data centre demand with new renewable generation — means these massive facilities can actually accelerate India’s clean energy transition rather than burden it.

Diagram 7 · India’s Multi-Pronged Energy Strategy for Data Centres
[Infographic: a coordinated national ecosystem feeding data centres 24/7 reliable, clean electricity: the Nuclear Energy Mission with SMRs (₹20,000 Cr; five SMRs by 2033; 100 GW by 2047), renewables (250+ GW installed; ₹26,549 Cr MNRE budget), BESS (43 GWh under VGF; 411 GWh target by 2031–32), Make in India manufacturing (solar PV cells, batteries, turbines), and grid modernisation (Green Energy Corridor and subsea cables). No single technology, but an integrated national strategy.]
• • •

The Causal Chain: Why This All Matters

Diagram 8 · From a 1974 Physics Principle to India’s Energy Opportunity
[Flowchart: Dennard Scaling breaks down (~2005; transistors leak power as they shrink) → the Power Wall (clock speeds stuck at 3–5 GHz for 20 years) → the industry goes parallel (thousands of GPUs instead of one fast CPU) → massive power demand (5 MW to 1 GW facilities; 50% compute, 25% cooling, 25% support systems) → nations must build dedicated, 24/7, power plant-scale supply → India's response: Nuclear Mission (₹20K Cr), SMRs, 250 GW renewables, 43 GWh BESS VGF, grid modernisation, Make in India. Google AI Hub Vizag: $15B investment, $2B in renewables, 188,000 jobs/year, ₹10,518 Cr annual GSDP contribution. The physics creates the challenge; smart policy creates the opportunity.]
• • •

The Bottom Line

When you hear about data centres needing this much power, it's not waste or poor planning. It's physics:

  • Dennard Scaling — the principle that let chips get faster without more power — broke down around 2005.
  • The Power Wall forced the industry to use thousands of parallel chips instead of faster individual ones.
  • A modest 5 MW data centre packs 3,571 GPUs, each drawing 700 watts non-stop. Only half the electricity does computation; a quarter goes to cooling.
  • What matters is how a nation responds to this challenge. India’s integrated strategy — combining a ₹20,000 crore Nuclear Energy Mission with SMR development, 250+ GW of installed renewables, 43 GWh of VGF-supported battery storage, domestic manufacturing through Make in India, and grid modernisation — positions the country not merely to cope with AI’s energy demands but to lead the global transition to sustainable, AI-ready infrastructure.

The Google AI hub in Visakhapatnam — with its $15 billion investment and $2 billion dedicated to renewable energy — is a testament to this approach. It demonstrates that when policy, infrastructure, and private investment align, the immense power needs of modern AI become not a burden but a catalyst for national development.

The next time your AI assistant answers a question in two seconds, remember: somewhere, thousands of chips are running at full speed, backed by power plants, solar farms, battery banks, and soon, small modular reactors — an entire national energy ecosystem working in concert, all because a scaling law named after Robert Dennard stopped working twenty years ago. The physics created the challenge. India is building the answer.

Data centre power benchmarks from Takshashila Institution Report 2025–27.
Government of India policy data from Union Budget 2025–26, PIB, and Ministry of Power announcements.
Google AI Hub details from official Google Cloud press release, October 2025.

Disclaimer: This article was made using Claude AI.

