Generative artificial intelligence is now impossible to ignore online. Every time I do a Google search, an AI-generated summary might randomly appear at the top of the results. Or I might be prompted to try Meta’s AI tools while browsing Facebook. And the ever-present sparkle emoji keeps appearing in my dreams.
This push to add AI to as many online interactions as possible dates back to OpenAI’s boundary-pushing release of ChatGPT in late 2022. Silicon Valley quickly fell in love with generative AI, and nearly two years later, AI tools powered by large language models are permeating online user experiences.
One unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are far more resource-intensive than what came before, ushering in an era of internet hyper-consumption: one defined by new kinds of computing that demand huge amounts of electricity and water to build and operate.
“On the back end, these algorithms that generative AI models have to run are fundamentally very different from a traditional Google search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, they’re very lightweight in terms of the amount of data that needs to be moved back and forth between processors.” By comparison, Moazeni estimates that generative AI applications are somewhere between 100 and 1,000 times more computationally intensive.
The energy required to train and deploy the technology is no longer generative AI’s dirty secret, as expert after expert last year predicted a surge in energy demand at the data centers where companies work on AI applications. As if on cue, Google recently stopped considering itself carbon neutral, and Microsoft may be slipping away from its sustainability goals in the ongoing race to build the biggest and best AI tools.
“These data centers are basically powered in proportion to the amount of computation they perform, so their carbon footprint and energy consumption are proportional as well,” says Juncheng Zhang, a network systems researcher at the University of Chicago. The larger the AI model, the more computation it typically requires, and these cutting-edge models are becoming extremely large.
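The scaling the researchers describe can be sketched with some back-of-envelope arithmetic. The per-query energy figure and query volume below are illustrative assumptions, not numbers from the article; only the 100x multiplier comes from the low end of Moazeni’s estimate.

```python
# Illustrative sketch: if energy use is proportional to computation,
# a 100x compute multiplier implies a 100x energy multiplier.
# The per-query figure and query volume are assumed, not reported values.

SEARCH_WH_PER_QUERY = 0.3    # assumed energy for one traditional search, in watt-hours
AI_COMPUTE_MULTIPLIER = 100  # low end of Moazeni's 100-1,000x estimate

def daily_energy_kwh(queries_per_day: int, wh_per_query: float) -> float:
    """Total daily energy in kilowatt-hours for a given query volume."""
    return queries_per_day * wh_per_query / 1000

queries_per_day = 1_000_000  # hypothetical traffic for a single service
search_kwh = daily_energy_kwh(queries_per_day, SEARCH_WH_PER_QUERY)
ai_kwh = daily_energy_kwh(queries_per_day,
                          SEARCH_WH_PER_QUERY * AI_COMPUTE_MULTIPLIER)

print(f"Traditional search: {search_kwh:,.0f} kWh/day")   # 300 kWh/day
print(f"Generative AI (100x): {ai_kwh:,.0f} kWh/day")     # 30,000 kWh/day
```

Even at the conservative end of the estimate, a million AI-assisted queries would consume as much energy as a hundred million plain searches under these assumptions.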
Google’s total energy consumption doubled from 2019 to 2023, but company spokeswoman Corinna Standiford said it’s not fair to say Google’s energy consumption spiked in the midst of the AI race. “It’s extremely hard to reduce emissions from our suppliers, which make up 75% of our footprint,” she said in an email. The suppliers Google points to include makers of servers, networking equipment, and other technical infrastructure for data centers: the energy-intensive manufacturing needed to produce the physical hardware that cutting-edge AI models run on.