The rise of artificial intelligence has propelled the stock prices of major tech companies to new highs, but at the expense of the sector’s climate aspirations.
Google admitted Tuesday that the technology was threatening its environmental goals after revealing that data centers, a key part of AI infrastructure, had contributed to a 48% increase in its greenhouse gas emissions since 2019. The company said "significant uncertainty" around reaching its target of net-zero emissions by 2030 (reducing the overall amount of CO2 it is responsible for to zero) included "uncertainty around the future environmental impact of AI, which is complex and difficult to predict."
Will technology be able to reduce the environmental cost of AI, or will the industry press on regardless, because the prize of supremacy is so great?
Why AI poses a threat to tech companies’ green goals
Data centers are a critical part of training and operating AI models like Google's Gemini or OpenAI's GPT-4. They house the sophisticated computing equipment, or servers, that crunch the vast amounts of data AI systems rely on. Running them requires large amounts of electricity, which generates CO2 depending on the power source, as well as "embedded" CO2 from the manufacture and transport of the necessary equipment.
According to the International Energy Agency, total electricity consumption by data centers could double from 2022 levels to 1,000 TWh (terawatt hours) by 2026, roughly the annual electricity demand of Japan, while research firm SemiAnalysis calculates that AI will drive data centers to use 4.5% of global energy generation by 2030. Water consumption is also significant, with one study estimating that AI could account for up to 6.6 billion cubic meters of water use by 2027, nearly two-thirds of England's annual consumption.
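Those comparisons can be sanity-checked with simple arithmetic. The sketch below is illustrative only: the ~500 TWh 2022 baseline and England's ~10 billion cubic meters of annual water use are back-calculated from the article's own "double" and "two-thirds" comparisons, not taken from the underlying reports.

```python
# Back-of-envelope check of the projections quoted above.
# Assumptions: a ~500 TWh baseline for 2022 is implied by "double ...
# to 1,000 TWh"; England's ~10bn m3 annual water use is implied by
# the "nearly two-thirds" comparison. Neither figure is sourced here.

baseline_2022_twh = 500        # implied 2022 data-center consumption
projected_2026_twh = 1_000     # IEA projection cited above

# Compound annual growth rate needed to double in four years
years = 2026 - 2022
cagr = (projected_2026_twh / baseline_2022_twh) ** (1 / years) - 1
print(f"Implied growth in data-center demand: {cagr:.1%}/year")  # ~18.9%

ai_water_2027_bn_m3 = 6.6      # upper estimate from the cited study
england_annual_bn_m3 = 10.0    # assumed from the two-thirds comparison
share = ai_water_2027_bn_m3 / england_annual_bn_m3
print(f"AI water use vs England's annual consumption: {share:.0%}")  # 66%
```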
What do experts say about the environmental impact?
A recent report on AI safety, backed by the UK government, says the carbon intensity of the energy source used by tech companies is “a key variable” in determining the environmental cost of the technology. However, it adds that a “significant proportion” of AI model training still relies on energy generated from fossil fuels.
In response, tech companies are scooping up renewable energy contracts to try to meet their environmental goals. Amazon, for example, is the world's largest corporate buyer of renewable energy. But some experts say that is pushing other energy consumers toward fossil fuels because there isn't enough clean energy to go around.
“Not only is energy consumption increasing, but Google is also struggling to meet this increased demand for sustainable energy sources,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.
Is there enough renewable energy for everyone?
Governments around the world are planning to triple the world's renewable energy capacity by the end of the decade to cut fossil fuel consumption in line with climate goals. But this ambitious pledge, agreed at the COP28 climate talks last year, is already in doubt, and experts fear that a surge in energy demand from AI data centers could push it further out of reach.
The IEA, the global energy watchdog, has warned that even though global renewable energy capacity grew at its fastest rate in 20 years in 2023, the world is on course only to double, not triple, its renewable capacity by 2030 under current government plans.
The answer to AI’s energy appetite could be for tech companies to invest more in building new renewable energy projects to meet their growing demand for electricity.
How soon will we be able to build new renewable energy projects?
Onshore renewable energy projects such as wind and solar farms are relatively quick to build: they can take less than six months to develop. However, slow planning rules in many developed countries, along with a global backlog in connecting new projects to the electricity grid, can add years to the process. Offshore wind farms and hydropower projects face similar hurdles on top of longer construction times of two to five years.
This raises concerns about the ability of renewables to keep pace with the expansion of AI. According to the Wall Street Journal, big tech companies have already approached the owners of a third of U.S. nuclear power plants about supplying low-carbon electricity to their data centers. But without investment in new generation, such deals would divert low-carbon electricity from other users, leading to more fossil fuels being burned to meet overall demand.
Will AI’s electricity demand increase indefinitely?
Normal rules of supply and demand suggest that as AI consumes more electricity, the cost of energy rises and the industry is pushed to economize. But the unusual nature of this sector means the world's largest companies may instead decide to ride out spikes in electricity costs, spending billions of dollars rather than risk falling behind.
The largest and most expensive data centers in the AI industry are those used to train “state-of-the-art” AI, systems like GPT-4o and Claude 3.5, which are more powerful and capable than any others. The leader in the field has changed over the years, but OpenAI is generally near the top, battling for position with Anthropic, the creator of Claude, and Google’s Gemini.
Competition at the "frontier" is already considered winner-takes-all, and there is little to stop customers from flocking to the latest leader. That means if a company spends $100 million training a new AI system, its competitors have to decide whether to spend even more or drop out of the race altogether.
Worse, the race for "AGI" (AI systems that can do everything a human can) means it might be worth spending hundreds of billions of dollars on a single training run, if doing so would let your company monopolize a technology that could, as OpenAI puts it, "elevate humanity."
Won’t AI companies learn to use less electricity?
Every month, new advances in AI technology are enabling companies to do more with less. In March 2022, for example, a DeepMind project called Chinchilla showed researchers how to train cutting-edge AI models using radically less computing power by changing the ratio between the amount of training data and the size of the resulting model.
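As a rough illustration of what Chinchilla changed, the sketch below uses two widely cited approximations from the 2022 paper: training compute of about 6 × parameters × tokens FLOPs, and a compute-optimal ratio of roughly 20 training tokens per parameter. The numbers are indicative, not a reproduction of DeepMind's method.

```python
# Rough Chinchilla-style arithmetic. Assumptions: training compute
# C ~= 6 * N * D FLOPs and a compute-optimal ratio of ~20 tokens per
# parameter, both widely cited approximations from the 2022 paper.

def training_flops(params: float, tokens: float) -> float:
    """Approximate training compute in floating-point operations."""
    return 6 * params * tokens

def optimal_tokens(params: float) -> float:
    """Chinchilla rule of thumb: ~20 training tokens per parameter."""
    return 20 * params

# Chinchilla itself: 70bn parameters trained on ~1.4tn tokens...
chinchilla = training_flops(70e9, optimal_tokens(70e9))

# ...versus Gopher, an earlier 280bn-parameter model trained on 300bn
# tokens. Similar compute budgets; the smaller, data-heavy model won.
gopher = training_flops(280e9, 300e9)

print(f"Chinchilla: {chinchilla:.2e} FLOPs")  # ~5.9e23
print(f"Gopher:     {gopher:.2e} FLOPs")      # ~5.0e23
```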
But Chinchilla's recipe didn't mean the same AI systems used less electricity; instead, it allowed the same amount of electricity to produce even better AI systems. In economics, this phenomenon is known as the "Jevons paradox," named after the economist who observed that James Watt's improvements to the steam engine, which let far less coal do the same work, led to a huge increase in the amount of coal burned in England. As the price of steam power fell after Watt's improvements, new uses were found for it that wouldn't have been economical when power was expensive.