ChatGPT’s electricity consumption, pt. II
An estimate of ChatGPT’s costs supports my earlier estimate that ChatGPT uses millions of kilowatt-hours per month.
--
I recently published an article in which I estimated ChatGPT’s electricity consumption in January 2023 to be between 1.1M and 23M kWh.
23M kWh is roughly the same amount of energy that 175,000 Danes use in a month on average.
I’ve since come across an article by Dylan Patel and Afzal Ahmad that estimates the monetary costs of running ChatGPT [1]. To calculate the costs, Patel and Ahmad assume that ChatGPT has 13M daily active users and that each user makes an average of 15 queries per day. Based on this, they estimate that it takes 28,936 Nvidia A100 GPUs to serve ChatGPT.
Let’s now see how my initial estimate of ChatGPT’s energy consumption compares if we use Patel and Ahmad’s numbers.
Let’s start by calculating ChatGPT’s electricity use if it has 13M daily users who each make 15 requests per day.
This gives a total of 195,000,000 daily requests, or 5.85B requests over 30 days. Let’s assume that each request takes 0.00396 kWh to handle. That was the average amount of energy a comparable language model, BLOOM, consumed per request – see [3] or my first article for more details. That puts ChatGPT’s monthly electricity consumption at 195,000,000 * 30 * 0.00396 = 23,166,000 kWh.
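The arithmetic is simple enough to sanity-check in a few lines of Python. This is just a sketch of the calculation above: the user count and queries-per-user figures are Patel and Ahmad’s assumptions from [1], and the per-request energy figure is BLOOM’s average from [3].

```python
# Request-based estimate of ChatGPT's monthly electricity use.
DAILY_USERS = 13_000_000       # assumed daily active users [1]
QUERIES_PER_USER = 15          # assumed queries per user per day [1]
DAYS = 30                      # length of one month
KWH_PER_REQUEST = 0.00396      # BLOOM's average energy per request [3]

daily_requests = DAILY_USERS * QUERIES_PER_USER     # 195,000,000
monthly_requests = daily_requests * DAYS            # 5,850,000,000
monthly_kwh = monthly_requests * KWH_PER_REQUEST    # 23,166,000 kWh

print(f"{monthly_kwh:,.0f} kWh per month")          # 23,166,000 kWh per month
```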
Let’s now calculate ChatGPT’s monthly energy use if we assume that it requires 28,936 GPUs.
The maximum power draw of the Nvidia A100 is 400 W [2], i.e. 400/1000 = 0.4 kW. Patel and Ahmad assume the hardware runs at 50% capacity due to idle time, so let’s assume the average power draw is 0.4 * 0.5 = 0.2 kW.
With 28,936 GPUs, the total power draw is 28,936 * 0.2 = 5,787.2 kW. That gives an hourly electricity consumption of 5,787.2 kWh. Over a 30-day period, i.e. 720 hours, that would put ChatGPT’s electricity consumption at 5,787.2 * 720 = 4,166,784 kWh.
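Again, a quick Python sketch of this calculation: the GPU count comes from [1], the 400 W maximum power draw from [2], and the 50% utilization is Patel and Ahmad’s assumption.

```python
# Hardware-based estimate of ChatGPT's monthly electricity use.
GPUS = 28_936                # A100 GPUs needed to serve ChatGPT [1]
MAX_POWER_KW = 400 / 1000    # A100 maximum power draw: 400 W = 0.4 kW [2]
UTILIZATION = 0.5            # assumed 50% average load due to idle time [1]
HOURS_PER_MONTH = 24 * 30    # 720 hours

avg_power_kw = MAX_POWER_KW * UTILIZATION        # 0.2 kW per GPU
total_power_kw = GPUS * avg_power_kw             # 5,787.2 kW
monthly_kwh = total_power_kw * HOURS_PER_MONTH   # 4,166,784 kWh

print(f"{monthly_kwh:,.0f} kWh per month")       # 4,166,784 kWh per month
```

Both calculations point to a monthly consumption in the millions of kilowatt-hours, consistent with my initial estimate.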