Voltaire Staff

AI boom may send global energy consumption into a spiral


Image Courtesy: DALL-E 2


As the AI boom continues apace, experts have begun to sound alarm bells over the energy consumption its use will entail if adoption keeps up at the current rate.


People have become hooked on generative artificial intelligence since the advent of OpenAI's ChatGPT, an interactive AI chatbot, which sent top executives at Google and Microsoft rummaging deep in their pockets for native alternatives. The endeavour resulted in Google coming up with Bard, and Microsoft with Bing Chat.


Since last year, several other AI platforms, such as the text-to-image generators DALL-E and Midjourney, have attracted users in droves and fielded millions of queries.


With their use growing more popular and the AI scene still nascent, concerns have begun to surface about what the net electricity consumption of such a swift spike in the number of servers – used to store data and power search engines – would look like.


The rise of AI-powered tools has, over the past year and more, sent demand for servers skyrocketing, with NVIDIA, an industry leader, making a killing and posting revenue of $13.5 billion for the quarter ending July 2023.


According to the International Energy Agency, data centres across the world are responsible for 1 to 1.5 per cent of global electricity use. That number is set to rise sharply.


An analysis by a PhD candidate at the VU Amsterdam School of Business and Economics says that if Google were to incorporate AI into all of its searches, each interaction would consume roughly ten times the electricity of a standard keyword search, or approximately 3 Wh per LLM, or large language model, interaction.
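As a rough illustration of that tenfold claim, the sketch below assumes a conventional search draws about 0.3 Wh – a commonly quoted figure, not one stated in this article – and scales it by ten:

```python
# Back-of-envelope check of the "ten times a standard search" claim.
# Assumption (not from the article): a conventional Google search is often
# quoted at roughly 0.3 Wh; the 10x multiplier comes from the analysis cited above.

standard_search_wh = 0.3   # assumed energy per conventional search, in Wh
llm_multiplier = 10        # an LLM interaction ~10x a standard keyword search

llm_interaction_wh = standard_search_wh * llm_multiplier
print(f"Energy per LLM-backed search: ~{llm_interaction_wh:.1f} Wh")  # ~3.0 Wh
```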


Alex de Vries, also the founder of Digiconomist, cited in his paper research by SemiAnalysis estimating that implementing ChatGPT-like AI in every Google search would require 512,821 of NVIDIA's A100 HGX servers, totalling 4,102,568 GPUs.


At such a rate, the transition would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh – a significant addition to Google's total electricity consumption of 18.3 TWh in 2021, of which AI accounted for just 10 to 15 per cent.
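The arithmetic behind those figures is straightforward to reproduce; a minimal sketch follows, in which the per-server power draw is an inference from the cited numbers rather than a figure from the article:

```python
# Sanity check of the figures cited from the SemiAnalysis estimate.
servers = 512_821
gpus = 4_102_568
daily_gwh = 80.0

gpus_per_server = gpus / servers                 # -> 8 GPUs per A100 HGX server
annual_twh = daily_gwh * 365 / 1_000             # 80 GWh/day -> ~29.2 TWh/year
per_server_kw = daily_gwh * 1e6 / servers / 24   # implied continuous draw per server

print(gpus_per_server)          # 8.0
print(round(annual_twh, 1))     # 29.2
print(round(per_server_kw, 1))  # ~6.5 kW, plausible for an 8-GPU A100 HGX system
```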


However, in an interview with Scientific American, de Vries said that the energy uptick may not be all that certain.


"... it’s not going to happen like that because Google would also have to invest $100 billion in hardware to make that possible. And even if [the company] had the money to invest, the supply chain couldn’t deliver all those servers right away," he told the website.


Cost concerns aside, de Vries says electricity consumption will be hard to predict, given the cooling apparatus that must be deployed to keep these servers running.


"... global data centers, on average, will add 50 percent to the energy cost just to keep the machines cool. There are data centers that perform even worse than that," he said.


