Estimating the energy consumption of an AI giga factory
News media report that the EU’s AI giga factories can each consume as much electricity as 2 million households, but does that really hold up?
The EU wants to build five so-called “AI giga factories” (AIGFs), each of which will host “over 100,000 advanced AI processors”.
The Danish government wants one of these AIGFs to be placed in Denmark, which has caught the attention of news outlets.
Some of these news articles, e.g. one by the Danish Broadcasting Corporation (DR), quote Danish university professors as saying that such a data center can consume the same amount of energy as 2,000,000 Danish households.
“Come again?” was my first thought when I read that. It seemed like an overestimate.
So I decided to estimate the energy use of the AI servers myself.
I arrived at a number well below 2,000,000 households, and according to an SVP of infrastructure at a company that plans to bid on building a Danish AIGF, my number is in the right ballpark.
Read on to see how I went about it and what I arrived at.
Estimating the energy consumption of an AI giga factory
To estimate the energy use of an AIGF, we need to make some assumptions.
Let’s say an AIGF hosts 100,000 Nvidia B200 chips spread across 12,500 Nvidia DGX B200 servers (each DGX B200 holds 8 B200 GPUs, so 100,000 / 8 = 12,500 servers).
The rated power of an Nvidia DGX B200 server is 14.3 kW. Rated power is the server’s maximum power draw, according to the specs. Across 12,500 servers, that amounts to 178.75 MW of rated IT power.
AI servers normally don’t operate at rated power. I will assume that the AIGF will mainly run AI training jobs, or jobs with a similar power profile, and that such jobs draw an average of 75 % of rated power. That’s the so-called operational power draw, and the assumption is based on empirical measurements.
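In concrete terms, that is 0.75 × 14.3 kW ≈ 10.7 kW per server while a job is running.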
AI servers also normally don’t run jobs all the time; it is natural to have some idle time between jobs. In the 2024 United States Data Center Energy Usage Report – an authoritative report on estimating data center energy consumption – the authors assume that AI servers run 80 % of the time and are idle for the remainder. That’s the so-called average operational time, and I rely on the same assumption. So, to recap: I assume the servers run at 75 % of rated power 80 % of the time, and, like the aforementioned report, that they draw 20 % of rated power when idle. That works out to a time-weighted average draw of 0.8 × 75 % + 0.2 × 20 % = 64 % of rated power.
It’s standard practice to incorporate the power usage effectiveness (PUE) of a data center when modeling energy use. PUE measures the energy efficiency of a data center and is calculated as the ratio of total facility power to the power used by IT equipment. Like the aforementioned report, I assume a PUE of 1.14 for AI-specialized data centers.
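For example, if the IT equipment draws 100 MW and the facility as a whole draws 114 MW, the PUE is 114 / 100 = 1.14.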
So, if 12,500 Nvidia DGX B200 servers run in a data center with a PUE of 1.14 – at 75 % of rated power 80 % of the time and at 20 % of rated power for the remainder – they will use about 1,142,444,160 kWh (roughly 1,142 GWh) in a year.
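The whole calculation fits in a few lines of Python – a minimal sketch in which every constant is one of the assumptions above, not a measured value:

```python
# Annual energy estimate for the AIGF under the stated assumptions.
# Every constant below is an assumption from the text, not a measurement.

N_SERVERS = 12_500        # 100,000 B200 GPUs / 8 GPUs per DGX B200 server
RATED_KW = 14.3           # rated (max) power per DGX B200 server, in kW
ACTIVE_SHARE = 0.80       # fraction of the year the servers run jobs
ACTIVE_DRAW = 0.75        # average draw while running, fraction of rated power
IDLE_DRAW = 0.20          # draw while idle, fraction of rated power
PUE = 1.14                # total facility power / IT equipment power
HOURS_PER_YEAR = 8_760

# Time-weighted average IT power draw in kW: 0.8 * 0.75 + 0.2 * 0.2 = 0.64 of rated
avg_it_kw = N_SERVERS * RATED_KW * (ACTIVE_SHARE * ACTIVE_DRAW
                                    + (1 - ACTIVE_SHARE) * IDLE_DRAW)

annual_kwh = avg_it_kw * PUE * HOURS_PER_YEAR
print(f"{annual_kwh:,.0f} kWh per year")  # 1,142,444,160 kWh ≈ 1,142 GWh
```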
How many Danish households does that equal?
According to a Danish energy company, a Danish household of 2 adults and 2 children uses between 4,500 and 5,000 kWh in a year. Let’s use 4,500.
So the energy consumption of the AIGF would equal that of roughly 254,000 Danish households. Far from 2,000,000.
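Continuing the sketch above, the household conversion is a single division (4,500 kWh is the assumed annual consumption, per the figure just cited):

```python
KWH_PER_HOUSEHOLD = 4_500   # assumed annual use of a Danish family of four (see above)

households = annual_kwh / KWH_PER_HOUSEHOLD
print(f"{households:,.0f} households")  # ~253,876, i.e. roughly 254,000
```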
The actual calculations can be viewed in this spreadsheet.
Validity of assumptions and estimates
So, do these assumptions hold up?
The hardware could run less energy-intensive workloads and/or be used less of the time.
The data center could be stocked with chips that draw less power, such as the H100. But the H100 is less energy efficient per unit of compute, and it delivers fewer FLOPS, i.e. less compute power. And by the time the AIGF is built, the H100 will be outdated, if not discontinued by Nvidia entirely.
The above calculations do not account for external networking devices, firewalls, external storage, etc., nor for power distribution losses or for recovery of heated water for district heating (we don’t know for a fact that heated water will be recovered). They do, however, account for on-server storage and networking equipment, since those are covered by the server’s rated power.
That being said, the numbers are in the right ballpark if we only consider energy consumption by GPU servers, says Ali Syed, SVP of Infrastructure at the Danish Centre for AI Innovation (DCAI) – a company planning to bid on building an AIGF in Denmark. He left a comment on a LinkedIn post of mine where I first aired my skepticism. He wrote:
Great analysis! Denmark’s industrial champions – Danfoss and Grundfos – are already helping us drive PUE toward its practical minimum, and the next generation of hotter, liquid-cooled chips actually plays to our strengths.
When we model our energy budget and PUE calculations we include:
- Redundancy losses
- Power-train inefficiencies (switchgear, transformers, chillers, etc.)
- Non-GPU workloads (storage, networking, comms)
- Energy to export waste heat to district-heating networks
Even after adding every overhead, we’re nowhere near the oft-quoted “2 million households” benchmark – the worst case is about 20 % of that, assuming a realistic mix of inference and training.
We’ll share more projections in the next phase. Your back-of-the-envelope check is spot-on.
My estimates are also in the same ballpark as numbers presented on LinkedIn by Henrik Hansen, CEO of the industry association Danish Data Center Industry. He writes:
One of the new GIGA Factories will, depending on the exact assumptions, require a 110 – 150 MW data center with an annual electricity consumption of approximately 1 – 1.3 TWh – equivalent to around 300,000 households – not 1 or 2 million households.
The data center market is already expected to grow from 335 to 860 MW by 2028, so a GIGA Factory will only add a small part to the growth that is already accounted for in the expansion plans for data centers and renewable energy.
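As a quick sanity check on that quoted range (my own arithmetic, not Hansen’s): a data center drawing a constant 110–150 MW over a year lands at roughly 1–1.3 TWh, and my own estimate implies an average facility draw of about 130 MW, right in the middle of that span.

```python
# Convert a constant facility power draw (MW) into annual consumption (TWh)
for mw in (110, 150):
    twh = mw * 1_000 * 8_760 / 1e9   # MW -> kW, times hours per year, kWh -> TWh
    print(f"{mw} MW -> {twh:.2f} TWh per year")
# 110 MW -> 0.96 TWh per year
# 150 MW -> 1.31 TWh per year
```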
Other perspectives on the AIGFs
Debates about the AIGFs do not center only on their electricity consumption.
Some express concern that the Danish energy system cannot produce enough additional sustainable energy to supply an AIGF.
Others think that the EU should not fund such projects because they believe there is not enough demand for compute at that scale. If the demand existed, they wonder, why haven’t the markets already met it?
Conclusion
It seems safe to say that the AIGF will not have the same energy footprint as 2,000,000 households, although it will still use a substantial amount of energy. I hope this message reaches other journalists who want to cover the AIGF story, and I hope they will do more digging next time: instead of asking energy systems experts about the energy use of data centers, ask data center experts (people like Ali Syed from DCAI). Lastly, I hope experts who are interviewed for stories like this will be cautious about the numbers they share. We should definitely debate whether it’s sensible to build an AIGF in Denmark (and whether the EU should fund them in the first place), but the debate should be grounded in facts.
