AI's Growth Drives Energy Demand, Sparking Concerns and Innovations
Oct 01, 2023 | Posted by Eric Bell
The rise of artificial intelligence (AI) is heralded as one of the most transformative technological advancements, comparable to the emergence of the internet. That sentiment is echoed on Wall Street: the Nasdaq Composite (^IXIC) has surged 26% this year, a gain largely attributed to enthusiasm surrounding AI-centric stocks.
However, the proliferation of AI technologies comes with an environmental caveat: a significant increase in energy consumption. A study conducted by the University of Washington highlighted the energy demands of OpenAI's chatbot, ChatGPT. The research found that the chatbot's operations can consume around 1 gigawatt-hour per day, roughly the amount of electricity 33,000 US households use in the same period. Professor Sajjad Moazeni of the University of Washington remarked, "The energy consumption of something like ChatGPT inquiry compared to some inquiry on your email, for example, is going to be probably 10 to 100 times more power hungry."
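To put that figure in context, a quick back-of-envelope calculation reproduces the household comparison. This is a minimal sketch, assuming an average US household uses roughly 10,800 kWh of electricity per year (about 30 kWh per day); the exact average varies by source and year.

```python
# Back-of-envelope check of the "1 GWh/day vs. 33,000 households" comparison.
# Assumes an average US household uses ~10,800 kWh per year; this figure is an
# illustrative assumption, not a value from the cited study.

CHATGPT_DAILY_KWH = 1_000_000      # 1 gigawatt-hour expressed in kilowatt-hours
HOUSEHOLD_ANNUAL_KWH = 10_800      # assumed average US household consumption per year

household_daily_kwh = HOUSEHOLD_ANNUAL_KWH / 365
equivalent_households = CHATGPT_DAILY_KWH / household_daily_kwh

print(f"One household uses ~{household_daily_kwh:.1f} kWh/day")
print(f"1 GWh/day covers ~{equivalent_households:,.0f} households")
# Yields roughly 34,000 households, in line with the figure cited in the study.
```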
Industry insiders believe that the current energy demands are just the tip of the iceberg. Arijit Sengupta, founder and CEO of Aible, an enterprise AI solution company, stated, “We’re maybe at 1% of where the AI adoption will be in the next two to three years. The world is actually headed for a really bad energy crisis because of AI unless we fix a few things.”
Data centers, the backbone of advanced computing, have been undergoing a transformation. These facilities, managed predominantly by tech giants like Google, Microsoft, and Amazon, are transitioning from using simpler central processing units (CPUs) to more energy-intensive graphics processing units (GPUs). Patrick Ward, vice president of marketing for Formula Monks, an AI technology consulting firm, emphasized the shift, stating, "For the next decade, GPUs are going to be the core of AI infrastructure. And GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive.”
Research by professors Benjamin C. Lee of the University of Pennsylvania and David Brooks of Harvard revealed that between 2015 and 2021, data center energy usage grew by 25% annually. This surge occurred before AI technologies like ChatGPT became widely recognized. In contrast, the US Energy Information Administration reported a 7% annual growth rate in renewable energy deployment during the same period.
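A simple compounding exercise shows the gap those two rates imply. This is a rough sketch assuming each reported rate held steady over the six annual steps from 2015 to 2021.

```python
# Compare how 25% and 7% annual growth compound over 2015-2021 (six annual steps),
# assuming the reported rates hold constant each year.

DATACENTER_GROWTH = 0.25   # 25% annual growth in data center energy use (Lee & Brooks)
RENEWABLE_GROWTH = 0.07    # 7% annual growth in renewable deployment (EIA)
YEARS = 6                  # 2015 through 2021

datacenter_multiple = (1 + DATACENTER_GROWTH) ** YEARS
renewable_multiple = (1 + RENEWABLE_GROWTH) ** YEARS

print(f"Data center energy use: ~{datacenter_multiple:.1f}x the 2015 level")
print(f"Renewable deployment:  ~{renewable_multiple:.1f}x the 2015 level")
# Roughly 3.8x versus 1.5x, illustrating how quickly demand can outpace clean supply.
```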
Major cloud providers, including Google Cloud, Microsoft Azure, and Amazon Web Services, are taking steps to address the energy challenge. They are investing heavily in renewable energy sources to match their annual electricity consumption and have made net-zero pledges. Microsoft, for instance, has been carbon neutral since 2012 and aims to be carbon negative by 2030. Similarly, Amazon plans to power its operations with 100% renewable energy by 2025, targeting net-zero carbon emissions by 2040. Google has set a goal to achieve net-zero emissions across all operations by 2030.
However, Benjamin C. Lee cautioned, "Net zero doesn’t mean you’re carbon-free potentially. There will be hours of the day where you don’t have enough sun or enough wind, but you’re still going to be drawing energy straight from the grid."
The focus on energy efficiency and cost reduction is expected to drive the industry towards sustainable solutions. As Angelo Zino of CFRA Research noted, the winners in this space will likely be data center operators, with more companies opting to rent cloud space rather than building their own energy-intensive data centers.