A lot has been said about the likely rise in energy demand as a consequence of the boom in the use of Artificial Intelligence (AI) systems. For example, researchers from BloombergNEF say that surging demand for AI will see data centre power demand double by 2050, whereas the International Energy Agency says it will quadruple by 2030.
However, a new report from Energy Intelligence claims that, while AI will contribute to rising energy consumption, especially in the short term as data centre usage surges, it is unlikely to trigger the dramatic global power demand spike some have predicted. It says it expects AI ‘to only moderately increase global electricity demand to 2050. AI hardware and software efficiencies, and the operational benefits AI will bring to the power sector, will in time offset some of the increased power demand required by AI as it rolls out’.
So who is right? Well, AI and data centre technology is changing fast, so it’s hard to say exactly what its energy use patterns will look like in the years ahead. For example, new energy-saving cooling systems are emerging, with one claiming to cut energy use by up to 48%. AI centres may also be able to flex, varying their energy use in response to energy availability (see the sketch below). And McKinsey consultants say grid/transmission upgrades can help link data centres to spare or new capacity, and that we don’t need small modular reactors, despite that being one common claim. In addition, AI systems themselves may get more efficient: that’s what has been claimed for some new Chinese AI technology, with DeepSeek creating quite a stir.
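By way of illustration, here is a minimal Python sketch of what that kind of demand flexing could look like: deferrable workloads such as model training wait for spare grid capacity, while latency-sensitive work runs regardless. The job names, power figures and capacity threshold are all hypothetical, and this is a toy scheduler, not any operator’s actual system.

```python
# Toy sketch of data-centre demand flexing: deferrable workloads
# (e.g. model training) run only when spare grid capacity allows,
# while latency-sensitive work always runs. All names and numbers
# here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deferrable: bool   # can this job wait for a better grid window?
    power_kw: float    # estimated draw while running

def schedule(jobs, spare_grid_capacity_kw):
    """Run non-deferrable jobs unconditionally; fit deferrable ones
    into whatever spare grid capacity remains."""
    running, deferred = [], []
    for job in jobs:
        if not job.deferrable:
            running.append(job)
            spare_grid_capacity_kw -= job.power_kw
    for job in (j for j in jobs if j.deferrable):
        if job.power_kw <= spare_grid_capacity_kw:
            running.append(job)
            spare_grid_capacity_kw -= job.power_kw
        else:
            deferred.append(job)  # wait for a windier/sunnier hour
    return running, deferred

jobs = [
    Job("chat-inference", deferrable=False, power_kw=500),
    Job("model-training", deferrable=True, power_kw=2000),
    Job("batch-indexing", deferrable=True, power_kw=300),
]
running, deferred = schedule(jobs, spare_grid_capacity_kw=1000)
print([j.name for j in running])   # ['chat-inference', 'batch-indexing']
print([j.name for j in deferred])  # ['model-training']
```

In a real deployment the capacity signal would come from a grid operator or an electricity price feed, but the principle is the same: shift the flexible share of the load to hours when renewable supply is plentiful.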
However, it is all still very speculative, and the discussion opens up many questions, as pioneering US energy analyst Amory Lovins says in a wide-ranging AI impact overview: ‘AI’s potential but mostly speculative and non-comparable energy benefits, such as greater energy efficiency and cheaper renewable power, must also be compared with the indirect impacts of rising demand for energy-intensive AI services, faster hardware turnover (with more embodied energy), and especially accelerating polluting activities like finding and extracting fossil hydrocarbons—one of AI’s biggest uses’.
He goes on to ask whether we actually need it. He says ‘AI does appear valuable, even transformational, in certain technical specialties, from protein folding to weather forecasting, from drug or materials discovery to image or pattern recognition and especially hydrocarbon exploration’. However, he notes that, in its recent AI study looking at key industrial sectors, Bloomberg New Energy Finance (BNEF) saw widespread industrial use of generative AI as ‘unlikely’, and concluded that ‘these sectors have struggled to assess if the benefits are worth the risks’.
On that point he says that, while ‘cheaper and more-accessible AI is meant to fuel a feedback loop of consumption across sectors like logistics, education, and marketing…focusing just on short-term savings in direct emissions ignores how slower behavioral and market responses can erase efficiency gains. Will AI really save more energy than its operations consume and its use to extract more fossil fuel enables? It may not. No one knows, and only a few are asking. Real exploration of AI’s complex net energy and environmental effects has barely begun. But if AI doesn’t save lots of net energy and cost, why build it? Just making prettier pictures isn’t worth a trillion dollars. Making more artful deepfakes is worth less than zero.’
While some see vast AI data centres being powered by PV solar, arguably it would not be good to waste a lot of valuable renewable energy building and running AI systems if they don’t lead to useful green outcomes. That said, Lovins says that ‘vastly more renewables will be built than AI requires, no matter how much that turns out to be, because they’re needed, profitable, & beneficial regardless’.
So what will happen? Will AI continue to boom? As a slightly disparaging side-shot, Lovins notes that, ‘Anthropic, a leading AI company, tells job applicants “please do not use AI” in their applications, so it can assess their own communication skills’, while ‘Fortune reports 74% of hiring managers shun AI-generated applications, from which most are less likely to hire’.
However, leaving quips like this aside, it may be that, as some hope, AI will open up opportunities for new jobs and, on the one hand, for enhanced human creativity and, on the other, for a reduction in boring work, as well as for improvements in efficiency and in system integration and management in many areas. It’s hard to say at present: it could have massive impacts. But they may not all be good. In addition to energy use, and also water use, there are some other reasons to worry about AI: it may lead to wide-scale deskilling and job losses, and ultimately even, some say, to the elimination of humanity!
The AI story and the debate on it continue, with new MIT analysis of energy use data: although MIT says it’s hard to predict the ultimate scale of energy demand, quite rapid growth in AI use looks almost unstoppable. But given that some see AI overall possibly having a catastrophic human impact, perhaps the last word should go to BNEF founder Michael Liebreich, who, at the end of his wide-ranging overview of the prospects for AI, including its likely energy use, said ‘our brains operate on around 20W. Seen like that, we are still beating the machines by around eight orders of magnitude. Humanity is not done yet!’
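Liebreich’s arithmetic is easy to check: eight orders of magnitude above the brain’s roughly 20W is about 2GW, the scale of a large power station. Here is that back-of-envelope comparison as a minimal Python sketch; note that the 2GW machine figure is an assumption implied by his quote, not a measured value.

```python
import math

# Back-of-envelope check of Liebreich's quip: the human brain runs on
# ~20 W; treating ~2 GW as the power draw of a comparable machine
# intelligence (an assumption implied by his "eight orders of
# magnitude", not a measured figure) gives the gap he describes.
brain_power_w = 20
machine_power_w = 2e9  # assumed ~2 GW, roughly a large power station

orders_of_magnitude = math.log10(machine_power_w / brain_power_w)
print(f"Gap: about {orders_of_magnitude:.0f} orders of magnitude")
# -> Gap: about 8 orders of magnitude
```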
*AI systems are sometimes very impressive in their complexity and ingenuity, but, by way of balance, the image below left is a computer-generated artistic rendition of a human cell, reputedly obtained using ‘radiography, nuclear magnetic resonance and cryo-electron microscopy’, but with components composited and colourised. So it’s not a ‘real’ image; in a way, then, it’s the result of AI! But it is stunning. The other one, on the right, which seems more straightforward, is said to show one cubic millimetre of a human temporal cortex, with its web of neurons and synapses. Make of that what you will…
https://theconversation.com/data-centres-in-space-theyre-a-brilliant-idea-but-a-herculean-challenge-246635