AI's Carbon Crisis: The Unseen Cost of Innovation

OpenAI was launched in December 2015, just days after the release of the draft Paris Climate Accord, whose goals imply cutting global emissions by 43 per cent by 2030.

It’s unlikely that anyone involved in the Paris draft understood at the time the dramatic impact that artificial intelligence (AI), and especially generative AI, would have on global energy consumption and emissions.

While the impact of other big carbon emitters, such as internal combustion engine (ICE) vehicles and business travel, was well understood at the time, no one outside a few labs and a few cafes in Silicon Valley would have imagined a world of ubiquitous generative AI, or its impact on energy demand and carbon emissions.

Then in 2019, researchers estimated that training a single large language model would produce the same CO₂ as 125 Beijing-New York return flights. With the rapid proliferation of LLMs in recent years (there are now around 100, with more coming), it’s hard to quantify the full impact AI is having on emissions, but it’s certainly no rounding error. Read on.

An iPhone AI index? 

Before you get too far into the story, think about the last time you charged your smartphone, and how long it took the battery to go from anxious red to that comforting full green bar.  

The most power-hungry AI image generators burn about as much energy per image as recharging your phone in full, and each image spits a few more grams of CO₂ into the atmosphere.

ChatGPT burst into public consciousness in late 2022. It collected 100 million users within a few weeks and has grown to around 180 million users today. For many business leaders, ChatGPT demonstrated the transformative power of artificial intelligence in a way that the dozens of PowerPoint presentations by vendors, consultants, and CIOs they sat through over the last decade never could.

It also lit the flame on a global bonfire of software innovation as application providers rushed to integrate generative AI capabilities into their platforms. 

As a result, your company is already using generative AI across functions as diverse as finance, marketing, customer service, and HR, whether you actively chose to or not. As Microsoft’s Copilot becomes ubiquitous, pretty much every employee in your business will start pumping little puffs of CO₂ into the atmosphere a few times a day, or even a few times an hour, depending on the job they do.

The SaaS Academy, a business advisory service for startups, predicts that AI will be integrated into every new software product and service by 2025. Other studies suggest adoption is already approaching the halfway mark.

All those new artificial intelligence capabilities promise a new wave of efficiency for business, but that wave comes with a giant sustainability sting: a huge increase in energy consumption back in the data centre and, with it, spiking carbon emissions.

Boiling the frog 

Generative AI’s impact on sustainability isn’t an issue receiving much attention inside corporate environments. However, with 2030 carbon emission goals closing in, that is likely to change as the scale of AI’s energy consumption becomes better understood. Gartner, for instance, estimates that AI will be eating through 3.5 per cent of the world’s total electricity by 2030.

Even before the impact of artificial intelligence is factored in, some large companies such as Shell, HSBC and Standard Chartered have made announcements that either explicitly walk back their 2030 emissions targets or put them in doubt.

According to the International Energy Agency, “Electricity consumption from data centres, artificial intelligence (AI), and the cryptocurrency sector could double by 2026. Data centres are significant drivers of growth in electricity demand in many regions. After globally consuming an estimated 460 terawatt-hours (TWh) in 2022, data centres’ total electricity consumption could reach more than 1,000 TWh in 2026. This demand is equivalent to Japan's electricity consumption.” 

Why so hungry? 

The reason artificial intelligence is so energy intensive is that the GPUs (graphics processing units) doing the calculations specialise in parallel processing, packing thousands of cores onto a single chip, each capable of performing a small task concurrently.

This makes them ideal for the task, but it dramatically increases both the demand for energy and the power density inside the data centre rack. That in turn drives a need for advanced cooling solutions, which themselves use more energy. GPUs may be more efficient for AI-related tasks than CPUs, but unfortunately, that efficiency does not offset the raw power draw.
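For a sense of why that density matters, here is a rough back-of-envelope sketch in Python. Every wattage here is an illustrative assumption, broadly in line with published specs for current AI accelerators, not a measurement of any particular data centre.

```python
# Back-of-envelope comparison of rack power density (illustrative figures).
# Assumptions: a modern AI accelerator draws ~700 W at full load, an
# 8-GPU server adds ~2.5 kW of CPU/memory/networking/fan overhead, and
# a traditional CPU server draws ~750 W.

GPU_WATTS = 700            # assumed per-accelerator draw at full load
GPUS_PER_SERVER = 8
SERVER_OVERHEAD_W = 2500   # assumed CPUs, memory, NICs, fans
CPU_SERVER_W = 750         # assumed traditional server draw

ai_server_w = GPU_WATTS * GPUS_PER_SERVER + SERVER_OVERHEAD_W

print(f"AI server: {ai_server_w / 1000:.1f} kW")                  # 8.1 kW
print(f"Rack of 4 AI servers: {ai_server_w * 4 / 1000:.1f} kW")   # 32.4 kW
print(f"Rack of 12 CPU servers: {CPU_SERVER_W * 12 / 1000:.1f} kW")  # 9.0 kW
# A rack in the low tens of kW, versus a traditional rack around 5-15 kW,
# is why AI halls need denser power feeds and far more aggressive cooling.
```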

More LLMs than ever before 

The development of large language models like those behind OpenAI’s ChatGPT and Anthropic’s Claude is at the heart of the swelling generative AI tide. It’s not simply that many more people are using the models; there has also been a dramatic increase in the number of LLMs. According to Schneider Electric, it took 10 years to develop the first 25 large language models, but in the last two years, around 100 have been created.

It turns out LLMs and the generative AI applications they enable are energy hogs driving huge emission increases. 

Early in the decade, with hyperscalers confident that they had the energy tiger by the tail, Google and Amazon crowed that they already had 100 per cent renewable energy contracts in place, and Microsoft said it would have them by mid-decade.

They are now stepping back from that. 

Google acknowledges it is going backward, saying artificial intelligence has contributed in large part to a 48 per cent increase in its greenhouse gas emissions since 2019. Google had used offsets to make carbon neutrality claims as early as 2007, but the company acknowledged in its 2024 environmental report: “Starting in 2023, we’re no longer maintaining operational carbon neutrality.”

Amazon has likewise seen growth in its emissions. Based on 2023 figures, its emissions are up 40 per cent since 2019, the year it first disclosed its carbon footprint.

A single query on ChatGPT uses 10 times as much energy as a Google search, according to Goldman Sachs (some estimates put the multiple as high as 25 times).
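The per-query numbers behind estimates like this are tiny on their own; it’s the volume that stings. A quick sketch, using the commonly cited figures of roughly 0.3 Wh per Google search and roughly 2.9 Wh per ChatGPT query (assumed averages, not measured values):

```python
# Commonly cited per-query energy estimates (assumptions, not measurements):
GOOGLE_WH = 0.3     # ~0.3 Wh per conventional Google search
CHATGPT_WH = 2.9    # ~2.9 Wh per ChatGPT query

print(f"Multiple: {CHATGPT_WH / GOOGLE_WH:.0f}x")   # ~10x

# Scaled to a workforce: 10,000 employees, 20 AI queries each per day
daily_kwh = 10_000 * 20 * CHATGPT_WH / 1000
print(f"Daily draw: {daily_kwh:.0f} kWh")           # ~580 kWh per day
```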

Earlier this year, researchers at Carnegie Mellon University and Hugging Face studied the energy consumption required for 1,000 inferences across a range of models and tasks, using 30 datasets. They published their findings in a paper called “Power Hungry Processing: Watts Driving the Cost of AI Deployment?”.

Unsurprisingly, consumption varied dramatically depending on the task. The study found, for example, that text classification (one of the more common use cases for generative AI) consumed 0.002 kWh per 1,000 inferences, whereas image generation consumed 2.907 kWh; nearly 1,500 times more energy.

It was this report that first brought the iPhone comparisons to light.  
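The arithmetic behind both comparisons is straightforward. A sketch using the study’s published averages, with an assumed phone battery of roughly 13 Wh (in line with a recent iPhone):

```python
# Figures from the "Power Hungry Processing" study, per 1,000 inferences:
TEXT_CLASSIFICATION_KWH = 0.002
IMAGE_GENERATION_KWH = 2.907

ratio = IMAGE_GENERATION_KWH / TEXT_CLASSIFICATION_KWH
print(f"Image generation vs text classification: {ratio:,.0f}x")  # ~1,454x

# Handy coincidence: kWh per 1,000 images equals Wh per single image.
wh_per_image = IMAGE_GENERATION_KWH                  # ~2.9 Wh on average
PHONE_BATTERY_WH = 13                                # assumed iPhone-class battery

share = wh_per_image / PHONE_BATTERY_WH
print(f"Average image: {wh_per_image:.1f} Wh (~{share:.0%} of a phone charge)")
# The study's full-phone-charge comparison applies to its most
# energy-hungry image models, which came in well above this average.
```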

Now, think about that earlier iPhone comparison in the context of the pitch by a company like Adobe, which boasts that its Firefly generative AI tool lets users quickly scale a few hero assets into thousands of renditions.

That’s not to single out Adobe, or popular image generators such as Midjourney and DALL·E, but rather to demonstrate how simple daily tasks for a single employee can balloon into very significant CO₂ output when magnified across all your employees and all their apps.

Data centre impact 

It’s still early days. According to the Uptime Institute, which focuses on improving the performance, efficiency, and reliability of data centres, the initial impact on data centres is so far limited to a few dozen large sites.

But that’s set to change. 

In its journal published in late July, it noted: “For the first quarter of 2024, we estimate the annualized power use by Nvidia systems installed to be around 5.8 TWh. This figure, however, will rise rapidly if Nvidia meets its forecast sales and shipment targets. By the first quarter of 2025, the generative AI infrastructure in place could account for 21.9 TWh of annual energy consumption.”

It says these numbers are indicative but will likely shift up as more information becomes available. 

“To put these numbers into perspective, the total global data centre energy use has been variously estimated at between 200 TWh and 450 TWh per year in the periods from 2020 to 2022. By taking a middle figure of 300 TWh for the annual global data centre power consumption, Uptime Intelligence puts generative AI annualized energy at around 2.3 per cent of the total grid power consumption by data centres in the first quarter of 2024. 

“However, this could reach 7.3 per cent by the first quarter of 2025.” 
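Those projections follow directly from the numbers above, assuming Uptime’s middle figure of 300 TWh for total data centre consumption holds:

```python
# Uptime Intelligence's estimates for generative AI power use (TWh):
GENAI_2024_TWH = 5.8    # annualized, Q1 2024 installed base
GENAI_2025_TWH = 21.9   # projected, Q1 2025
DC_TOTAL_TWH = 300      # assumed middle of the 200-450 TWh range

print(f"Year-on-year growth: {GENAI_2025_TWH / GENAI_2024_TWH:.1f}x")  # ~3.8x
print(f"Q1 2025 share: {GENAI_2025_TWH / DC_TOTAL_TWH:.1%}")           # 7.3%
# A near-quadrupling in one year; the share rises even if total data
# centre consumption grows at the same time.
```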

Compared to other estimates from organisations like Schneider Electric and authors like Alex de Vries of Digiconomist, the Uptime Institute acknowledges its figures are conservative. Given how rapidly organisations are adopting generative AI, its assessment may indeed prove too conservative.

Possible solutions 

If Moore’s Law has taught us anything, it’s that there’s one thing the tech industry does brilliantly: improve efficiency over time. And while there’s a huge rush to stake out turf in the emerging AI landscape, the simple economics of energy consumption, not to mention regulation, mean the tech sector is already working on mitigating the problem it has created.

For instance, data centre location can help lower cooling costs, and small language models (SLMs), which use much less energy, can be deployed within organisations to meet specific needs.
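For the curious, here is a minimal sketch of what an in-house SLM can look like, using the open-source Hugging Face transformers library. The model name and prompt are purely illustrative; a real deployment would pick a compact model sized to the task.

```python
# Minimal sketch of running a small language model locally, assuming the
# Hugging Face transformers library is installed (pip install transformers).
# "distilgpt2" is just an illustrative small model, not a recommendation.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "Draft a short note on our office energy usage:",
    max_new_tokens=40,   # small outputs keep inference cheap
)
print(result[0]["generated_text"])
# A model this size runs on an ordinary laptop CPU, sidestepping the
# data centre (and its cooling overhead) entirely for routine tasks.
```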

Indeed, if more companies follow Apple in planning to put SLMs and image generation models on every phone sold, the cloud component of the problem could be addressed quite quickly.

Likewise, spurred by the growth of EV usage, AI-powered software platforms are being built to manage energy consumption more efficiently as demand grows. And of course, AI itself can be deployed to make organisations more efficient.

Timing and scale 

But the issue is timing and scale. The development and adoption of AI, especially generative AI, is happening at a dramatic pace, with little focus on carbon emissions. A briefing by Schneider Electric last month on the economics of energy and AI left the clear impression that many business AI consumers (and a fair chunk of the software application developers who sell to them) are neither measuring AI’s energy usage nor even aware of it. “You really need to have your hands on the tin in the data centre to understand it,” one executive told us. When was the last time your head of HR, or marketing, or supply chain, or finance visited a data centre?

The title of the 2023 United Nations Emissions Gap report nails the problem succinctly. “Broken Record – Temperatures hit new highs, yet world fails to cut emissions (again).” 

Ask ChatGPT how likely it is that the world will meet the targets in the Paris Climate Agreement and it replies, uncomfortably: “Meeting the targets…is increasingly challenging due to several factors, including insufficient global commitments, slow implementation of policies, and rising emissions from key sectors like energy and transportation…Significant and immediate global action is required to have a realistic chance of meeting these goals.”

Maybe consider that next time you want to generate a quirky image on Midjourney, DALL·E, or Firefly.