Two Students Develop Software to Combat CO2 Caused by AI - Unite.AI

Experts warn that, on its current development path, artificial intelligence (AI) will become one of the leading contributors of CO2 emissions, despite its use to combat that very problem. This has prompted various parts of the industry to focus on remedying the situation, with one of the most recent developments coming from two students at the University of Copenhagen.

Advanced AI methods such as deep learning are advancing at an astonishing rate, but at the cost of massive energy consumption. If no action is taken to alter the current path, AI technologies and methods, especially deep learning, will likely become significant contributors to climate change.

From 2012 to 2018, the computational power required for deep learning increased by 300,000%. One of the industry's significant problems is that the energy consumption and carbon footprint of developing algorithms are rarely measured, even as a growing number of studies detail the issue and call for action.

Carbontracker

In seeking to address this issue, Lasse F. Wolff Anthony and Benjamin Kanding at the University of Copenhagen's Department of Computer Science, along with Assistant Professor Raghavendra Selvan, have developed a new software program called Carbontracker. This software can accurately calculate and predict how much energy consumption and CO2 emissions come from training deep learning models. 

“Developments in this field are going insanely fast and deep learning models are constantly becoming larger in scale and more advanced,” Lasse F. Wolff Anthony said. “Right now, there is exponential growth. And that means an increasing energy consumption that most people seem not to think about.”

Deep learning models continue to grow and tackle ever more complex problems, which requires significantly more energy.

“As datasets grow larger by the day, the problems that algorithms need to solve become more and more complex,” says Benjamin Kanding. 

Carbontracker has been released as open-source software.

GPT-3

One of the best examples of this is the advanced language model GPT-3, one of the largest and most complex deep learning models developed to date. Its scale comes at a cost: a single training session consumes as much energy as 126 Danish homes use in a year and releases CO2 equivalent to driving 700,000 kilometers.
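To get a feel for those comparisons, here is a back-of-the-envelope sketch in Python. The household and driving figures it assumes (roughly 1,500 kWh of electricity per Danish home per year, and about 0.12 kg of CO2 per kilometer driven) are illustrative assumptions, not numbers from the article.

```python
# Rough sanity check of the GPT-3 comparisons above.
# Both constants are illustrative assumptions, not figures from the article.
HOUSEHOLD_KWH_PER_YEAR = 1_500   # assumed average Danish household electricity use
CO2_KG_PER_KM = 0.12             # assumed average passenger-car emissions

# "As much energy as 126 Danish homes use in a year":
training_energy_kwh = 126 * HOUSEHOLD_KWH_PER_YEAR

# "CO2 equivalent to driving 700,000 kilometers", converted to tonnes:
driving_co2_tonnes = 700_000 * CO2_KG_PER_KM / 1_000

print(f"Estimated training energy: {training_energy_kwh:,} kWh")
print(f"CO2 from 700,000 km of driving: {driving_co2_tonnes:.0f} tonnes")
```

Under these assumptions, one training run works out to roughly 189,000 kWh and on the order of 84 tonnes of CO2, which shows why a single model's training footprint is worth measuring at all.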

According to Lasse F. Wolff Anthony, “Within a few years, there will probably be several models that are many times larger.”

“Should the trend continue, artificial intelligence could end up being a significant contributor to climate change. Jamming the brakes on technological development is not the point. These developments offer fantastic opportunities for helping our climate. Instead, it is about becoming aware of the problem and thinking: How might we improve?” says Benjamin Kanding. 

Carbontracker monitors the carbon intensity of electricity generation in the regions where deep learning training takes place, which allows it to convert a model's energy consumption into predicted CO2 emissions.
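The core of that conversion can be sketched in a few lines: multiply the energy a training run consumes by the carbon intensity of the local grid. The function and the intensity values below are illustrative assumptions for this article, not Carbontracker's actual code or real-time grid data.

```python
# Minimal sketch of converting training energy into CO2 emissions,
# as Carbontracker does using regional carbon-intensity data.

def co2_emissions_kg(energy_kwh: float, grid_intensity_g_per_kwh: float) -> float:
    """Convert energy consumption (kWh) into CO2 emissions (kg),
    given the grid's carbon intensity in grams of CO2 per kWh."""
    return energy_kwh * grid_intensity_g_per_kwh / 1_000

# The same hypothetical 1,000 kWh training run on two hypothetical grids:
print(co2_emissions_kg(1_000, 650))  # coal-heavy grid  -> 650.0 kg of CO2
print(co2_emissions_kg(1_000, 30))   # hydro-heavy grid -> 30.0 kg of CO2
```

The two example grids differ by more than a factor of 20 for an identical workload, which is the effect the students describe: where (and when) a model is trained can matter as much as how it is trained.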

According to the students, deep learning practitioners should pay attention to the hardware and algorithms they use and to when model training takes place, since some regions and times offer greener energy supplies.

“It is possible to reduce the climate impact significantly. For example, it is relevant if one opts to train their model in Estonia or Sweden, where the carbon footprint of a model training can be reduced by more than 60 times thanks to greener energy supplies. Algorithms also vary greatly in their energy efficiency. Some require less compute, and thereby less energy, to achieve similar results. If one can tune these types of parameters, things can change considerably,” says Lasse F. Wolff Anthony.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.