I was surprised to read that “researchers” have started speaking about carbon emissions; they are now talking about carbon credits. One reason these “articles” appear in the mainstream press is the high stakes in capturing “funds” and gaining visibility in a crowded space.
Researchers at the University of Massachusetts Amherst estimated the energy cost of developing AI language models by measuring the power consumption of common hardware used during training. They found that training BERT once has the carbon footprint of a passenger flying a round trip between New York and San Francisco. However, by searching using different structures—that is, by training the algorithm multiple times on the data with slightly different numbers of neurons, connections and other parameters—the cost became the equivalent of 315 passengers, or an entire 747 jet.
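The methodology behind such estimates is roughly: measure the average power draw of the training hardware, multiply by training time, a datacenter overhead factor (PUE), and the grid’s carbon intensity. A minimal sketch of that arithmetic follows; all the numbers are illustrative assumptions of mine, not figures from the study.

```python
# Sketch of a training-emissions estimate. Every constant below is an
# assumed, illustrative value -- not taken from the UMass Amherst paper.

def training_co2_kg(avg_power_kw: float,
                    hours: float,
                    pue: float = 1.58,           # assumed datacenter overhead
                    grid_kg_per_kwh: float = 0.43) -> float:
    """Estimate kg of CO2 emitted by one training run."""
    energy_kwh = avg_power_kw * hours * pue      # wall energy drawn
    return energy_kwh * grid_kg_per_kwh          # convert to emissions

# Hypothetical multi-GPU node drawing 1.5 kW for 80 hours of training
print(round(training_co2_kg(1.5, 80), 1))        # prints 81.5
```

Note how the “747 jet” figure arises: a hyperparameter search repeats this cost once per configuration tried, so the multiplier is the number of training runs, not anything about the model itself.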
I am not sure about these claims. If you are reading this blog post, it is being served from some specific data center. If you are reading it in your email, it was sent from some server farm.
Quantified “estimates” like the one above are clear examples of dumbing down the debate. I am not sure we need these elaborate statistical methodologies in the first place; it seems like marketing hype to push newer hardware (5G) and IoT into the enterprise space. Legions of whitepapers pushed out by consultants, and reams of newsprint, have promoted ideas like “bionic interfaces” (or whatever fancy new term sticks with the crowd). These “debates” happen in thin air, where companies find it profitable to manufacture “counter-culture” arguments.
It is all kosher when the author mentions her lab pushing newer ways of “shape shifting”. Research will “prove anything”.
I think these paid endorsements are meant to sway the funding agencies more than the lay public.
We should stay wary of such claims.