Energy consumption of artificial intelligence (AI) models during training is considerable: both GPT-3, the model behind the original release of OpenAI's popular ChatGPT, and Gopher consumed well over a thousand megawatt-hours of energy for training alone. Since this figure covers only the training run, the energy consumption over the entire lifetime of GPT-3 and other large language models (LLMs) is likely significantly higher. GPT-3, the largest consumer in the comparison, used roughly as much energy as 200 German residents did over the whole of 2022. While not a staggering amount, it is a considerable use of energy.
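As a rough sanity check of that comparison, the figures can be reproduced with a back-of-the-envelope calculation. Both inputs below are assumptions used for illustration: a training energy of about 1,300 MWh (consistent with "well over a thousand" above) and an assumed annual per-capita electricity consumption in Germany of about 6.4 MWh.

    # Back-of-the-envelope check: how many Germans' annual electricity use
    # does one GPT-3 training run correspond to?
    # Both figures below are illustrative assumptions, not measured data.
    gpt3_training_mwh = 1_300        # assumed: "well over a thousand" MWh
    per_capita_germany_mwh = 6.4     # assumed annual per-capita electricity use in Germany

    equivalent_people = gpt3_training_mwh / per_capita_germany_mwh
    print(f"Roughly {equivalent_people:.0f} people")  # ~200, matching the comparison above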
Energy savings through AI
While it is undoubtedly true that training LLMs takes a considerable amount of energy, the potential energy savings are also likely to be substantial. An AI model that improves processes by even small margins might save hours on a shipment, liters of fuel, or dozens of computations, each of which consumes energy as well, and the total energy saved through an LLM might vastly outweigh its energy cost. Mobile phone operators are a good example: a third of them expect AI to reduce their power consumption by ten to fifteen percent. Given that much of the world uses mobile phones, this would amount to considerable energy savings.
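To illustrate why recurring operational savings can outweigh a one-off training cost, here is a minimal sketch with entirely hypothetical numbers: a single operator's network whose annual power draw is reduced by the lower end of the range cited above, compared against an assumed training energy on the order of the figures discussed earlier.

    # Hypothetical comparison of AI-driven savings vs. a one-off training cost.
    # All figures are illustrative assumptions, not measured data.
    training_cost_mwh = 1_300                   # assumed one-off training energy
    network_consumption_mwh_per_year = 500_000  # hypothetical annual consumption of one operator's network
    savings_fraction = 0.10                     # lower end of the 10-15% range cited above

    annual_savings_mwh = network_consumption_mwh_per_year * savings_fraction
    payback_years = training_cost_mwh / annual_savings_mwh
    print(f"Annual savings: {annual_savings_mwh:,.0f} MWh")
    print(f"Training energy recouped in about {payback_years:.3f} years")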
Emissions are considerable
The amount of CO2 emitted when training LLMs is also considerable, with GPT-3 producing nearly 500 tonnes of CO2. This figure could change radically depending on the type of energy generation supplying the training run. Most data center operators, for instance, would prefer nuclear energy, a notably low-emission source, to play a key role.
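How strongly the emissions figure depends on the energy mix can be seen with a simple emission-factor calculation: emissions equal energy consumed multiplied by the carbon intensity of the grid. The intensities below are assumed round numbers for illustration, not official grid factors.

    # Emissions = energy consumed * grid carbon intensity.
    # Intensity values below are assumed round numbers, not official factors.
    training_energy_mwh = 1_300
    intensity_t_per_mwh = {
        "coal-heavy grid": 0.8,        # assumed ~800 kg CO2 per MWh
        "average mixed grid": 0.4,     # assumed ~400 kg CO2 per MWh
        "largely nuclear grid": 0.05,  # assumed ~50 kg CO2 per MWh
    }

    for grid, factor in intensity_t_per_mwh.items():
        print(f"{grid}: about {training_energy_mwh * factor:,.0f} t CO2")

With an average mixed grid, the result lands around 500 tonnes, in line with the figure cited above; a largely nuclear supply would cut it by roughly an order of magnitude.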
Chart: Power consumption when training artificial intelligence (AI) based large language models (LLMs) in 2023 (in megawatt hours)
Source: Cornell University. (2022, November 3). Power consumption when training artificial intelligence (AI) based large language models (LLMs) in 2023 (in megawatt hours) [Graph]. Statista. Retrieved November 22, 2024, from https://www.statista.com/statistics/1384401/energy-use-when-training-llm-models/