AI industry could use as much energy as Netherlands



(Web Desk) - Big tech firms have scrambled to add AI-powered services since ChatGPT burst onto the scene last year.

They use far more power than conventional applications, making going online much more energy-intensive.

However, a new study suggests AI's environmental impact could be less than feared if the sector's current growth slows.

Many experts, including the report author, say such research is speculative as tech firms do not disclose enough data for an accurate prediction to be made.

There is no question, though, that AI requires more powerful hardware than traditional computing tasks.

The study, by Alex De Vries, PhD candidate at the VU Amsterdam School of Business and Economics, is based on some parameters remaining unchanged - such as the rate at which AI is growing, the availability of AI chips, and servers continuing to work at full pelt all the time.

De Vries noted that the chip designer Nvidia is estimated to supply about 95% of the AI processors the sector requires.

By looking at the number of these machines Nvidia is expected to deliver by 2027, he approximated that AI could consume 85-134 terawatt-hours (TWh) of electricity each year.

At the top end that is roughly the amount of power used annually by a small country.

"You would be talking about the size of a country like the Netherlands in terms of electricity consumption. You're talking about half a per cent of our total global electricity consumption," he told BBC News.

AI systems such as the large language models that power popular chatbots, like OpenAI's ChatGPT and Google's Bard, require warehouses full of specialist computers - called data centres - to work.

That equipment is more power-hungry than traditional kit and, like it, also needs to be kept cool, often using water-intensive systems.

The research did not include the energy required for cooling. Many of the big tech firms do not quantify this specific energy consumption or their water use, and De Vries is among those calling for the sector to be more transparent about it.

But there is no doubt demand for the computers that power AI is mushrooming - and with it the amount of energy needed to keep those servers cool.

Danny Quinn, boss of the Scottish data centre firm DataVita, said his company has gone from receiving "one or two enquiries a week" at the start of 2023 about using his facility to house AI kit, to receiving hundreds.

He also described the difference in energy use between a rack containing standard servers, and one containing AI processors.

"A standard rack full of normal kit is about 4kWh of power, which is equivalent to a family house. Whereas an AI kit rack would be about 20 times that, so about 8kWh of power. And you could have hundreds, if not thousands, of these within a single data centre."