Facebook disclose the carbon footprint of their new LLaMA models

Facebook used 2.6 million kWh of electricity and emitted roughly 1,000 tonnes of CO2e when developing their new LLaMA models.

Kasper Groes Albin Ludvigsen
4 min read · Mar 3, 2023
Photo by Sébastien Goldberg on Unsplash

Facebook recently released a family of four language models called LLaMA, which outperform GPT-3 — ChatGPT’s underlying language model — on a number of tasks.

Following a recent trend towards transparency about resource consumption in LLM training, Facebook estimate the electricity consumption and carbon footprint of their models in the paper published on arXiv [1].

Facebook estimate that to train the four different sizes of LLaMA they used 2048 Nvidia A100 80GB GPUs for a period of approximately 5 months.

The energy consumption from this is estimated to be 2,638,000 kWh — roughly the amount of electricity that 1,648 Danes use in a year on average.

Producing that amount of electricity is estimated to have led to the emission of 1,015 tCO2e (tonnes of CO2-equivalent) — roughly the annual carbon footprint of 92 Danes.
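The arithmetic behind these figures can be sketched in a few lines. This is a back-of-the-envelope reproduction, assuming the methodology the paper describes: energy is GPU-hours times GPU power draw times the data center's PUE (power usage effectiveness), and emissions are energy times the grid's carbon intensity. The constants below (400 W per A100, PUE of 1.1, US national average intensity of 0.385 kg CO2e/kWh) are my reading of the paper, not figures stated in this article.

```python
# Assumed constants, taken from the LLaMA paper's stated methodology:
A100_POWER_KW = 0.400       # thermal design power of one A100 GPU, in kW
PUE = 1.1                   # data-center overhead (cooling, networking, ...)
KG_CO2E_PER_KWH = 0.385     # US national average grid carbon intensity

def training_energy_kwh(gpu_hours: float) -> float:
    """Electricity drawn from the grid, including data-center overhead."""
    return gpu_hours * A100_POWER_KW * PUE

def emissions_tco2e(energy_kwh: float) -> float:
    """Tonnes of CO2-equivalent emitted to produce that electricity."""
    return energy_kwh * KG_CO2E_PER_KWH / 1000

# Applying the carbon-intensity factor to the reported 2,638,000 kWh:
print(round(emissions_tco2e(2_638_000)))  # ~1,016 t, in line with the 1,015 tCO2e above
```

Note that the emissions figure depends heavily on the assumed carbon intensity: the same training run on a low-carbon grid would emit a fraction of this.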
