Adam Spencer

AI's insatiable power demands

These new platforms are attracting growing concern over their rapacious demand for electricity to power them and water to stop them overheating.


Over the last 18 months we have heard a LOT about AI, especially Large Language Models (LLMs) and Generative Pre-trained Transformers (GPTs). The media, and particularly the business conference circuit, have been dominated by talk of ChatGPT, image and sound generation, and copilots to make work tasks easier and faster.

 

There has also been some discussion about the potential downsides of these new platforms. I’m sure many of you have heard about misinformation, deep fakes, online investment scams fronted by ‘celebrities’, and the like.

 

But one serious concern surrounding these new platforms is their rapacious demand for electricity to power them and water to stop them overheating.

 

These systems run on high-powered, high-tech computer chips called GPUs and TPUs, housed in massive data centres. Whether training a model or running it day to day, these chips constantly perform billions upon billions of mathematical calculations. Training more powerful systems – think how much better GPT-4 is than GPT-3, or Claude 3.5 compared with 3.0 – requires larger data sets and significantly more computational grunt. All of this demands power, and lots of it.

 

Among the many (admittedly fuzzy) estimates I have seen, generating two AI images uses roughly the same amount of power as recharging your phone overnight. Researcher Alex de Vries of the Vrije Universiteit Amsterdam suggests that if every Google search were run through ChatGPT, it would consume as much electricity each year as a country such as Ireland (source).

 

And just as your laptop gets warm after a few hours of processing video and the like, these data centres generate tremendous heat. The cooling systems that keep them operational guzzle fantastic amounts of water.


While the figures are hard to pin down, according to Data Centre Magazine a typical hyperscale data centre can use more than 10 million litres of water in a day! A typical conversation with a GPT uses, give or take, a litre of water – and there are hundreds of millions of those happening every day.
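To get a feel for that scale, here is a quick back-of-envelope sketch. The per-conversation figure is the rough "litre, give or take" quoted above, and the daily conversation count is an assumed midpoint of "hundreds of millions", not a measured value:

```python
# Back-of-envelope estimate of daily AI chat water usage, based on the
# rough figures above (both inputs are assumptions, not measurements).
LITRES_PER_CONVERSATION = 1          # "give or take, a litre of water"
CONVERSATIONS_PER_DAY = 300_000_000  # assumed: "hundreds of millions"

daily_litres = LITRES_PER_CONVERSATION * CONVERSATIONS_PER_DAY

# For scale: an Olympic swimming pool holds roughly 2.5 million litres.
OLYMPIC_POOL_LITRES = 2_500_000
pools_per_day = daily_litres / OLYMPIC_POOL_LITRES

print(f"{daily_litres:,} litres/day, about {pools_per_day:.0f} Olympic pools")
```

Even with these deliberately loose inputs, the answer lands around 300 million litres a day, on the order of a hundred Olympic swimming pools.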

 

And that’s not to mention the e-waste aspect of all this which, if you can still bear to read on, is excellently summarised in this recent Scientific American article.

  

Put simply, these systems require a lot of components, and operators often discard kit when new equipment comes on board, potentially creating millions of metric tonnes of hazardous e-waste in the coming decade.

 

To my mind, none of this is a reason to abandon the GPT revolution (even if that were possible). But the efforts of those working to reduce power usage, create new cooling systems, and transition these platforms to renewable energy as fast as possible are to be applauded.


And in chess news...


The defending champion Ding Liren shocked many onlookers by winning the first game of his world championship match while playing with the black pieces! The 14-game showdown promises to be an absolute cracker.


That’s all from me for now. If you'd like more geeky fun, please check out my other newsletters below, or connect with me on LinkedIn and/or X.


Yours in nerdiness,

Adam
