Chowdhury believes the challenge can be met with “clever” solutions at every level, from the physical hardware to the AI software itself.
For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30%.
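The article doesn't detail the lab's algorithm, but the general idea of per-chip power budgeting can be sketched as follows: profile a workload at several GPU power caps and pick the lowest cap that still meets a throughput target. The function name, the measurements, and the 900-samples-per-second target are all illustrative assumptions, not Chowdhury's actual method.

```python
def pick_power_cap(profiles, min_throughput):
    """profiles: list of (watts, samples_per_sec) measured for one chip.
    Return the lowest power cap that still meets the throughput target."""
    feasible = [(w, t) for w, t in profiles if t >= min_throughput]
    if not feasible:
        # No cap meets the target; fall back to full power.
        return max(w for w, _ in profiles)
    return min(w for w, _ in feasible)

# Hypothetical measurements: throughput often degrades only slightly
# as the power cap drops, which is where the 20-30% savings come from.
measurements = [(300, 980), (250, 950), (200, 870), (150, 600)]
cap = pick_power_cap(measurements, min_throughput=900)
print(cap)  # the lowest wattage that still sustains 900 samples/sec
```

In practice a cap chosen this way would be applied through the GPU's power-management interface; the profiling step is what lets the scheduler spend "exactly how much electricity each AI chip needs".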
‘Clever’ solutions
Twenty years ago, operating a data centre - encompassing cooling systems and other infrastructure - required as much energy as running the servers themselves.
Today, operations use just 10% of what the servers consume, says Gareth Williams from consulting firm Arup.
That gain is largely the result of a sustained focus on energy efficiency.
Many data centres now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly.
This allows them to optimise water and electricity use in real-time, according to McKinsey’s Pankaj Sachdeva.
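A minimal sketch of that zone-by-zone approach, with invented setpoints and zone names: each zone draws only the cooling its own sensor reading demands, instead of the whole building being cooled to the hottest zone's needs.

```python
def cooling_demand(zone_temps_c, setpoint_c=27.0, gain=0.2):
    """Return a 0-1 cooling duty per zone, proportional to how far
    each zone's temperature exceeds the setpoint (illustrative numbers)."""
    return {
        zone: min(1.0, max(0.0, gain * (temp - setpoint_c)))
        for zone, temp in zone_temps_c.items()
    }

readings = {"hall-A": 31.5, "hall-B": 26.0, "hall-C": 28.4}
print(cooling_demand(readings))
# Only hall-A and hall-C draw cooling; hall-B's units can idle.
```

Real systems layer prediction and water-use optimisation on top of this, but the core saving comes from the same per-zone proportionality.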
For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers.
“All the big players are looking at it,” Williams said.
This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago.
Amazon’s world-leading cloud computing business, AWS, last week said it had developed its own liquid method to cool down Nvidia GPUs in its servers - avoiding having to rebuild existing data centres.
“There simply wouldn’t be enough liquid-cooling capacity to support our scale,” Dave Brown, vice-president of compute and machine learning services at AWS, said in a YouTube video.
US vs China
For McKinsey’s Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last.
Research by Purdue University’s Yi Ding has shown that AI chips can be made to last longer without losing performance.
“But it’s hard to convince semiconductor companies to make less money” by encouraging customers to keep using the same equipment longer, Ding added.
Yet even though more efficient chips are likely to make AI cheaper, they won’t reduce total energy consumption.
“Energy consumption will keep rising,” Ding predicted, despite all efforts to limit it. “But maybe not as quickly.”
In the US, energy is now seen as key to keeping the country’s competitive edge over China in AI.
In January, Chinese start-up DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips - and by extension, less energy.
DeepSeek’s engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential.
China is also feared to be far ahead of the US in available energy sources, including renewables and nuclear.
-Agence France-Presse