Michigan: Can AI training energy consumption be lowered?

Apr 21, 2023 | Posted by MadalineDunn

These days, A.I. is seemingly never out of the headlines, with concerns ranging from a lack of regulation around its development and unchecked growth to its energy and water consumption. With the world in the grips of a climate crisis and A.I. training consuming a growing amount of resources, the latter concern is becoming increasingly important to address. Now, researchers at the University of Michigan say they have created a new way to optimize the training of deep learning models, which could drastically reduce A.I.'s energy demands.

Mosharaf Chowdhury, an associate professor of electrical engineering and computer science, and his team at the university have developed Zeus, an energy optimization framework. Using Zeus, the team says, energy usage could be reduced by up to 75% without any new hardware.

Speaking about the sheer scale of energy consumption during A.I. training, Chowdhury said: "At extreme scales, training the GPT-3 model just once consumes 1,287 MWh, which is enough to supply an average U.S. household for 120 years."
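For context, the U.S. Energy Information Administration puts average household electricity consumption at roughly 10,700 kWh per year, so 1,287 MWh ÷ ~10.7 MWh/year works out to about 120 years, consistent with the quoted figure.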

Likewise, Jae-Won Chung, a doctoral student in computer science and engineering and co-first author of the study, pointed out that while most current research focuses on optimizing the speed of deep learning training, its impact on energy efficiency is rarely considered. "We discovered that the energy we're pouring into GPUs is giving diminishing returns, which allows us to reduce energy consumption significantly, with relatively little slowdown," said Chung.
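Those diminishing returns are something you can observe directly. The sketch below (not Zeus itself) sweeps GPU power limits via NVIDIA's NVML bindings and times a fixed chunk of training work at each setting; `run_training_step` is a hypothetical stand-in for one step of your own training loop, and changing power limits typically requires root privileges.

```python
# A minimal sketch (not Zeus itself) of measuring the energy/time
# trade-off: sweep GPU power limits and profile a fixed training
# workload at each one. Requires an NVIDIA driver with NVML support;
# nvmlDeviceGetTotalEnergyConsumption needs a Volta-or-newer GPU.
import time
import pynvml

def profile_power_limits(run_training_step, steps=50):
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

    results = []
    # Sweep from the maximum power limit down in 25 W steps.
    for limit_mw in range(max_mw, min_mw - 1, -25_000):
        pynvml.nvmlDeviceSetPowerManagementLimit(gpu, limit_mw)

        t0 = time.monotonic()
        e0 = pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu)  # millijoules
        for _ in range(steps):
            run_training_step()  # hypothetical stand-in for one batch
        elapsed = time.monotonic() - t0
        energy_j = (pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu) - e0) / 1000

        results.append((limit_mw / 1000, elapsed, energy_j))
        print(f"{limit_mw / 1000:.0f} W: {elapsed:.1f} s, {energy_j:.0f} J")
    pynvml.nvmlShutdown()
    return results
```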

Explaining how Zeus works, the team described two "software knobs": one is the GPU power limit, and the other is the deep learning model's batch size parameter. By tuning these two settings in real time, Zeus is able to minimize energy usage with little impact on training time, as in the sketch below.
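One simple way to picture the trade-off (a simplified sketch, not the actual Zeus implementation): given measured time and energy for each candidate (power limit, batch size) configuration, pick the one that minimizes a weighted cost of energy and runtime. The `eta` weight, the `MAX_POWER_W` constant, and the measurement numbers here are all illustrative assumptions.

```python
# Simplified sketch of the two-knob selection, not the real Zeus code.
# eta in [0, 1] expresses how much you value energy savings over speed.
from typing import Dict, Tuple

MAX_POWER_W = 300  # assumed GPU board power cap, for illustration

def best_config(
    measurements: Dict[Tuple[int, int], Tuple[float, float]],
    eta: float = 0.5,
) -> Tuple[int, int]:
    # measurements maps (power_limit_W, batch_size) -> (time_s, energy_J)
    def cost(cfg):
        time_s, energy_j = measurements[cfg]
        return eta * energy_j + (1 - eta) * MAX_POWER_W * time_s
    return min(measurements, key=cost)

# Hypothetical numbers from profiling runs like the sweep above:
measured = {
    (300, 1024): (100.0, 27_000.0),
    (250, 1024): (104.0, 23_500.0),  # small slowdown, big energy saving
    (200, 512):  (115.0, 21_000.0),
}
print(best_config(measured, eta=0.7))
```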

Further, Jie You, a recent doctoral graduate in computer science and engineering and co-lead author of the study, explained: "Fortunately, companies train the same DNN over and over again on newer data, as often as every hour. We can learn about how the DNN behaves by observing across those recurrences."
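Learning "by observing across those recurrences" can be sketched as an online exploration problem. The toy class below uses a simple epsilon-greedy strategy as a stand-in for Zeus's actual exploration method: each time the job recurs, it picks a batch size, observes the resulting cost (for example, the weighted energy/time cost above), and refines its running estimate for that choice.

```python
# Toy sketch of learning across recurring training jobs, assuming an
# epsilon-greedy strategy (an illustrative stand-in, not Zeus's method).
import random
from collections import defaultdict

class BatchSizeExplorer:
    def __init__(self, batch_sizes, epsilon=0.1):
        self.batch_sizes = batch_sizes
        self.epsilon = epsilon
        self.totals = defaultdict(float)  # summed observed cost per batch size
        self.counts = defaultdict(int)    # number of observations per batch size

    def choose(self):
        untried = [b for b in self.batch_sizes if self.counts[b] == 0]
        if untried:
            return untried[0]  # try every batch size at least once
        if random.random() < self.epsilon:
            return random.choice(self.batch_sizes)  # occasionally explore
        # Otherwise exploit the best average cost seen so far.
        return min(self.batch_sizes,
                   key=lambda b: self.totals[b] / self.counts[b])

    def observe(self, batch_size, cost):
        self.totals[batch_size] += cost
        self.counts[batch_size] += 1
```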
