Is there a more environmentally friendly way to train AI?

Machine learning is changing the world, and it’s changing it fast. In the last few years it has brought us virtual assistants that understand natural language, autonomous vehicles, new drug discoveries, AI-based triage for medical scans, handwriting recognition and more.

One thing that machine learning shouldn’t change is the climate.

The problem lies in how machine learning works. In order for a machine learning (or deep learning) model to make accurate decisions and predictions, it first needs to be “trained”.

Imagine an online shoe sales marketplace that has a problem with people trying to sell other things on the site – bicycles, cats, and theater tickets. The marketplace owners decide to limit the website to shoes only by creating an AI that recognizes photos of shoes and declines any listing whose pictures contain none.

The company collects tens of thousands of photos of shoes and a similar number of photos without shoes. It hires data scientists to design a complicated mathematical model and turn it into code. And then they start training their machine learning model for shoe detection.

This is the crucial part: the computer model looks at all of the images of shoes and tries to figure out what makes them “shoey”. What do they have that the non-shoe pictures don’t have? Without getting lost in technical details, this process takes up a lot of computing resources and time. Training accurate machine learning models means running multiple chips like GPUs at full power for weeks or months while the models are trained, optimized, and refined.
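To make “training” concrete, here is a deliberately tiny sketch of the core idea: a model repeatedly looks at labeled examples and nudges its parameters to reduce its mistakes. This toy uses logistic regression on synthetic “shoe” vs. “non-shoe” feature vectors (invented for illustration); a real system like the one described would use deep networks on raw image pixels, far more data, and vastly more compute – which is exactly where the energy cost comes from.

```python
# Toy training loop: logistic regression on synthetic feature vectors.
# The two clusters stand in for "shoe" and "non-shoe" image features.
import numpy as np

rng = np.random.default_rng(0)

shoes = rng.normal(loc=1.0, scale=0.5, size=(500, 8))       # label 1
non_shoes = rng.normal(loc=-1.0, scale=0.5, size=(500, 8))  # label 0
X = np.vstack([shoes, non_shoes])
y = np.array([1] * 500 + [0] * 500)

w = np.zeros(8)   # model parameters, adjusted during training
b = 0.0
lr = 0.1          # learning rate: size of each corrective nudge

for epoch in range(200):                   # the "training loop"
    p = 1 / (1 + np.exp(-(X @ w + b)))     # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)        # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                       # gradient-descent update
    b -= lr * grad_b

accuracy = np.mean((p > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

Each pass over the data is cheap here, but a modern image model repeats a far heavier version of this loop billions of times – that repetition, across many GPUs for weeks, is what drives the energy bill.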

In addition to time and money, AI training also consumes a lot of energy. Modern computer chips use minimal power when idle, but at full capacity they burn through electricity and generate a lot of waste heat (which then has to be pumped out by cooling systems that use yet more energy).

Any major energy consumption has an impact on climate change, as most of our electricity is still generated from fossil fuels, which release carbon dioxide when burned. One recent study from the University of Massachusetts estimated that training a single advanced language-processing AI produced 626,000 pounds of CO2 – the same amount five cars would produce over their lifetimes!

To help quantify this, a team at Canada’s Montreal Institute for Learning Algorithms (MILA) published a Machine Learning Emissions Calculator in December last year to help AI researchers estimate how much carbon is generated by training their machine learning models.
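The basic arithmetic behind such an estimate is simple: multiply hardware power draw by training time to get energy, add a data-center overhead factor, and multiply by the carbon intensity of the local grid. The sketch below is a back-of-the-envelope version of that idea; all figures (GPU wattage, PUE, grid intensity) are illustrative assumptions, not measurements, and the real MILA calculator accounts for more detail such as specific cloud regions.

```python
# Rough CO2 estimate for a training run: energy used x grid carbon intensity.
def training_co2_kg(gpu_power_watts, num_gpus, hours,
                    pue=1.5, grid_kg_co2_per_kwh=0.475):
    """Estimate kg of CO2 emitted by a training run.

    pue: power usage effectiveness, the data center's cooling/overhead
         multiplier (1.0 would mean zero overhead).
    grid_kg_co2_per_kwh: carbon intensity of the local electricity grid.
    """
    energy_kwh = gpu_power_watts * num_gpus * hours / 1000
    return energy_kwh * pue * grid_kg_co2_per_kwh

# Example: 8 GPUs at 300 W each, running flat out for two weeks.
co2 = training_co2_kg(gpu_power_watts=300, num_gpus=8, hours=14 * 24)
print(f"~{co2:.0f} kg CO2")
```

Under these assumed numbers the two-week run comes out to roughly 575 kg of CO2 – and large research models run on far more hardware for far longer, which is how figures climb into the hundreds of thousands of pounds.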

This problem is only getting worse as data scientists and engineers tackle more complicated AI problems by throwing more computing power at them – using bigger, more expensive machines rather than focusing on efficiency.

GPT-3, the AI-based language model recently published by OpenAI, was trained on 45 terabytes of text data (the entire English Wikipedia, at around 6 million articles, makes up only 0.6 percent of its training data), with the environmental cost of this extremely powerful machine learning technology still unknown.

To be fair, other computer processes are also on a worrying path. One study by ICT specialist Anders Andrae stated that by 2030, under his most optimistic projections, the ICT industry – which provides internet, video, voice and other cloud services – would be responsible for 8% of total global energy demand, while his realistic forecast put that figure at 21%, with data centers using more than a third of it.

One of the key recommendations of the University of Massachusetts research for reducing the footprint of AI training was “a concerted effort by industry and science to encourage research into more computationally efficient algorithms and hardware that requires less energy”.

Software can also be used to increase hardware efficiency, reducing the computing power AI models require. But the greatest impact may come from powering the data centers themselves with renewable energy. Facebook’s data center in Odense, Denmark is set to run entirely on renewable energy, and Google has its own energy-efficient data centers, like the one in Hamina, Finland.

In the long run, as the developed world moves away from fossil fuels, the link between computational load and carbon emissions may be broken, and all machine learning may become carbon neutral. Even longer term, deep learning applied to weather and climate patterns could help humankind better understand how climate change can be combated – and even reversed.

Until then, responsible companies should consider the carbon impact of their new technologies, including machine learning: measure the carbon cost of their model development, and reduce it by improving their development processes and their software and hardware efficiency.

Published on February 24, 2021 – 19:30 UTC