ChatGPT’s electricity consumption, pt. II
An estimate of ChatGPT’s operating costs supports my earlier estimate that ChatGPT uses millions of kilowatt-hours per month.

I recently published an article in which I estimated ChatGPT’s electricity consumption in January 2023 to be between 1.1M and 23M KWh.
23M KWh is roughly the same amount of energy that 175,000 Danes use in a month on average.
I’ve since come across an article by Dylan Patel and Afzal Ahmad that estimates the monetary costs of running ChatGPT [1]. To calculate the costs, Patel and Ahmad assume that ChatGPT has 13M daily active users and that each user makes an average of 15 queries per day. Based on this, they estimate that it takes 28,936 Nvidia A100 GPUs to serve ChatGPT.
Let’s now see how my initial estimate of ChatGPT’s energy consumption compares if we use Patel and Ahmad’s numbers.
Let’s start by calculating ChatGPT’s electricity use if it has 13M daily users who each make 15 requests.
This gives a total of 195,000,000 daily requests, or 5.85B requests over 30 days. Let’s assume each request takes 0.00396 KWh to handle – the average energy that a similar language model, BLOOM, was measured to require per request (see [3] or my first article for details). That puts ChatGPT’s monthly electricity consumption at 23,166,000 KWh.
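The requests-based calculation can be sketched in a few lines of Python. The user and query counts are Patel and Ahmad’s assumptions [1], and the per-request energy figure is BLOOM’s measured average [3]:

```python
# Requests-based estimate of ChatGPT's monthly electricity use
daily_users = 13_000_000       # assumed daily active users [1]
queries_per_user = 15          # assumed queries per user per day [1]
kwh_per_request = 0.00396      # BLOOM's measured average energy per request [3]

daily_requests = daily_users * queries_per_user  # 195,000,000
monthly_requests = daily_requests * 30           # 5,850,000,000
monthly_kwh = monthly_requests * kwh_per_request

print(f"{monthly_kwh:,.0f} KWh")  # 23,166,000 KWh
```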
Let’s now calculate ChatGPT’s monthly energy use if we assume that it requires 28,936 GPUs.
The maximum power draw of the Nvidia A100 is 400W [2], i.e. 400/1000 = 0.4 KW. Patel and Ahmad assume the hardware is running at 50% capacity due to idle time, so let’s assume the average power draw is 0.4 * 0.5 = 0.2 KW.
With 28,936 GPUs, the total power draw is 5,787.2 KW. That gives an hourly electricity consumption of 5,787.2 KWh. Over a 30 day period, that would put ChatGPT’s electricity consumption at 4,166,784 KWh.
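The hardware-based calculation follows the same pattern. The GPU count and 50% utilization figure are Patel and Ahmad’s assumptions [1]; the 400W maximum power draw comes from Nvidia’s A100 datasheet [2]:

```python
# Hardware-based estimate of ChatGPT's monthly electricity use
num_gpus = 28_936      # A100 GPUs estimated by Patel and Ahmad [1]
max_power_kw = 0.4     # A100 maximum power draw: 400 W [2]
utilization = 0.5      # assumed 50% average utilization [1]

total_power_kw = num_gpus * max_power_kw * utilization  # 5,787.2 KW
monthly_kwh = total_power_kw * 24 * 30                  # 720 hours in 30 days

print(f"{monthly_kwh:,.0f} KWh")  # 4,166,784 KWh
```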
Recall that I initially estimated ChatGPT’s monthly electricity consumption to be between 1.1M and 23M KWh.
So the two numbers that we arrive at based on Patel and Ahmad’s assumptions fall within that range.
Conclusion
The fact that the two approaches covered here and the approach I used in my initial article yield numbers within the same range tells me three important things:
- We can now have more confidence that ChatGPT’s monthly electricity consumption may in fact at the moment be in the millions of KWh
- BLOOM’s measured energy consumption per request seems to be a reasonable proxy for what we can expect of ChatGPT’s underlying language model, GPT-3 – which is what makes research like the BLOOM paper so valuable.
- The numbers used by Patel and Ahmad are probably in the right ballpark.
That’s it! I hope you enjoyed the story. Let me know what you think!
Follow me for more on AI and sustainability and subscribe to get my stories via email when I publish.
I also sometimes write about time series forecasting.
And feel free to connect on LinkedIn.
References
[1] https://www.semianalysis.com/p/the-inference-cost-of-search-disruption
[2] https://www.nvidia.com/content/dam/en-zz/Solutions/Data-Center/a100/pdf/a100-80gb-datasheet-update-nvidia-us-1521051-r2-web.pdf
[3] https://arxiv.org/abs/2211.02001