Facebook disclose the carbon footprint of their new LLaMA models

Facebook used 2.6 million kWh of electricity and emitted 1,000 tonnes of CO2 when developing their new LLaMA models.


Photo by Sébastien Goldberg on Unsplash

Facebook recently released four new language models called LLaMA, which outperform GPT-3 — ChatGPT’s underlying language model — on a number of tasks.

Following a recent trend towards transparency about the resources consumed by LLM training, Facebook estimate the electricity consumption and carbon footprint of their models in the paper published on arXiv [1].

Facebook estimate that to train the four different sizes of LLaMA, they used 2048 Nvidia A100-80GB GPUs for a period of approximately five months.

The energy consumption from this is estimated to be 2,638,000 kWh, roughly the same amount of electricity that 1,648 Danes use in a year on average.

Producing that amount of electricity is estimated to have led to the emission of 1,015 tCO2e – roughly the annual carbon footprint of 92 Danes.
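To make the arithmetic concrete, here is a minimal Python sketch of the kind of back-of-envelope accounting these figures involve: electricity estimated as GPU-hours times per-GPU power times PUE, and emissions as electricity times grid carbon intensity. All the constants in the sketch (400 W per A100, a PUE of 1.1, 0.385 kg CO2e per kWh, and roughly 1,600 kWh and 11 tCO2e per Dane per year) are my own illustrative assumptions rather than figures quoted from this article.

```python
# Rough sketch of back-of-envelope accounting for LLM training footprints.
# All constants below are illustrative assumptions, not numbers quoted from
# Facebook's paper.

def training_energy_kwh(gpu_hours: float, gpu_power_kw: float = 0.4, pue: float = 1.1) -> float:
    """Electricity use estimated from GPU-hours, average per-GPU draw (kW) and data-centre PUE."""
    return gpu_hours * gpu_power_kw * pue

def emissions_tco2e(energy_kwh: float, kg_co2e_per_kwh: float = 0.385) -> float:
    """Convert electricity use to tonnes of CO2-equivalent via an assumed grid carbon intensity."""
    return energy_kwh * kg_co2e_per_kwh / 1000

# Naive upper bound: 2048 GPUs running flat out for ~5 months.
naive_kwh = training_energy_kwh(gpu_hours=2048 * 5 * 30 * 24)
print(f"Naive upper bound: {naive_kwh:,.0f} kWh")                     # ~3,244,032 kWh

# Starting instead from the 2,638,000 kWh Facebook report:
reported_kwh = 2_638_000
print(f"Emissions: {emissions_tco2e(reported_kwh):,.0f} tCO2e")        # ~1,016 tCO2e
print(f"Danes (electricity): {reported_kwh / 1_600:,.0f}")             # ~1,649
print(f"Danes (carbon): {emissions_tco2e(reported_kwh) / 11:,.0f}")    # ~92
```

The naive upper bound lands in the same ballpark as, but above, the 2,638 MWh Facebook report, which is to be expected: not every GPU draws its full rated power for the entire period.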

What’s really interesting about Facebook’s numbers is that they seem to account for all the GPU compute they used in the entire model development process – not just the compute from training the final version(s), which seems to otherwise be the norm.

Although the reported energy consumption covers the development of all four LLaMA models, it is still surprising that the figure exceeds the energy consumed to train GPT-3, which is estimated at 1,287 MWh [2].

The explanation for this, I venture, is probably that Facebook report the electricity consumption of the entire development process including…
