Training a single artificial intelligence model can emit as much carbon dioxide as the manufacture and lifetime use of five cars, MIT Technology Review reports.
Researchers at the University of Massachusetts Amherst calculated the carbon footprint of training four natural language processing models: the Transformer, ELMo, BERT, and GPT-2. From each model's training time they estimated the energy consumed, then converted that figure into pounds of carbon dioxide using the mix of energy sources that cloud computing providers such as Amazon's AWS rely on. By this estimate, training a single model, depending on the model and the parameters used, can consume energy equivalent to a flight across the US, or, at the high end, the lifetime use of multiple cars.
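The estimation approach described above can be sketched as a simple calculation: multiply training time by hardware power draw and a data-center overhead factor to get energy, then by the grid's carbon intensity to get emissions. The function and all numbers below are illustrative placeholders, not the paper's actual figures.

```python
# Illustrative sketch of the training-time -> energy -> CO2 estimate.
# All parameter values below are hypothetical, not the study's figures.

def training_co2_lbs(hours, avg_power_kw, pue, lbs_co2_per_kwh):
    """Estimate pounds of CO2 emitted by one training run.

    hours            -- total training wall-clock time
    avg_power_kw     -- average draw of the training hardware (kW)
    pue              -- power usage effectiveness (data-center overhead factor)
    lbs_co2_per_kwh  -- carbon intensity of the electricity mix
    """
    energy_kwh = hours * avg_power_kw * pue  # energy billed at the wall
    return energy_kwh * lbs_co2_per_kwh      # convert energy to emissions

# Hypothetical example: a week-long run on hardware drawing 1.5 kW,
# a PUE of 1.58, and a US-average mix of about 0.954 lbs CO2 per kWh.
print(round(training_co2_lbs(168, 1.5, 1.58, 0.954)))  # -> 380
```

The point of the sketch is that emissions scale linearly with training time and power draw, which is why long runs on large models, or repeated runs during hyperparameter search, dominate the totals.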
"While probably many of us have thought of this in an abstract, vague level, the figures really show the magnitude of the problem," Carlos Gómez-Rodríguez, a computer scientist at the University of A Coruña who was not involved in the study, tells Tech Review. "Neither I nor other researchers I've discussed them with thought the environmental impact was that substantial."
In their paper, the researchers urge both industry and academia to seek more efficient algorithms.