Power-hungry AI a watch-out for equity investors
In all the excitement about AI equities, investors may be missing a crucial factor. Pendal equities analyst Elise McKay explains.
Big US tech stocks are soaring on a wave of new, advanced AI applications.
But similar to Bitcoin’s early days, excited AI investors may be overlooking the technology’s extremely high power costs and potential associated sustainability issues, argues Pendal Aussie equities analyst Elise McKay.
Whilst the remarkable progress of AI promises to revolutionise industries, the sheer cost of the electricity needed to train and run the systems puts a question mark over the long-term prospects of adoption.
“There’s three key components of power usage required for running a generative AI model,” says McKay. “First of all, there’s the power needed to simply build the equipment that it runs on. Then there is the enormous power required to train the model.
And then every time you ask it a question it requires new computations — and that means more power.”
Even before generative AI became widely available, demand for data was expected to increase at a compound annual growth rate of 40 per cent. The data centre industry is already estimated to account for about 1 per cent of global energy demand, says McKay.
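To put that growth rate in perspective, here is a rough back-of-envelope calculation (the figures below are illustrative arithmetic, not estimates from the article) showing how quickly a 40 per cent compound annual growth rate snowballs:

```python
# Illustrative only: how a 40 per cent CAGR compounds over time.
def compound_growth(cagr: float, years: int) -> float:
    """Return the growth multiple after `years` at the given CAGR."""
    return (1 + cagr) ** years

# At 40 per cent a year, demand roughly doubles every two years
# and grows almost 29-fold over a decade.
print(f"5 years:  {compound_growth(0.40, 5):.1f}x")   # about 5.4x
print(f"10 years: {compound_growth(0.40, 10):.1f}x")  # about 28.9x
```

At that pace, data demand would be nearly 29 times higher after ten years, before accounting for any extra load from generative AI.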
“Just because it’s on your phone doesn’t mean it’s not in a data centre somewhere — and data centres need electricity. Any new technology just increases demand for power.”
McKay uses the example of bitcoin mining, which rapidly increased its share of global energy consumption from next to nothing to an estimated 0.5 per cent in 2021.
“Emerging technologies like bitcoin mining can see very rapid adoption and dramatic increases in demand for power,” says McKay. “We are now seeing the broad take up of generative AI, which is significantly more power hungry than existing technologies.”
A study by Stanford found that training the popular GPT-3 generative AI system contributed almost 10 times the emissions that the average car produces over its lifetime, says McKay.
Estimates suggest the newer GPT-4 model was eight times as power-intensive again, she says. “And you don’t just do this once, you do it regularly.”
OpenAI — the company behind ChatGPT — says it continuously improves its AI model by “training on the conversations people have with it.”
“And each model can only do one search at a time,” says McKay. “So, if 100,000 people search for something at the exact same time you need 100,000 copies of the model otherwise queries will be queued.
Estimates are that every time you query ChatGPT, it is 300 times more expensive than a Google search.”
High power usage has also raised questions about the technology industry’s carbon footprint, with many providers shifting to renewable energy to reduce their environmental impact.
“The high cost of providing AI will hinder its adoption,” says McKay.
“It may mean that only companies willing to pay a high price will be able to use it. There’s a good use case for companies willing to pay for it because it improves productivity, but will we see broad adoption for low-paying use cases?”