Tuesday, 28 March 2023

AI Is Booming… But Also Burning Carbon (Fast)

NAB

AI is going to be ubiquitous in just about everything we do — but at what cost to the planet?

article here

While some commentators continue to raise red flags about the Cyberdyne Systems’ Skynet we are building, a more frightening and near-term concern is surely the impact that the computer processing behind artificial intelligence is having on climate change.

According to a Bloomberg report, AI uses more energy than other forms of computing, and training a single model can use more electricity than 100 US homes use in an entire year.

Google researchers found that AI made up 10% to 15% of the company’s total electricity consumption in 2021, which was 18.3 terawatt hours.

“That would mean that Google’s AI burns around 2.3 terawatt hours annually, about as much electricity each year as all the homes in a city the size of Atlanta,” Bloomberg’s Josh Saul and Dina Bass report.
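As a quick sanity check, the arithmetic holds up. A minimal sketch in Python, using only the figures quoted above:

```python
# Back-of-envelope check of the quoted figures.
google_total_twh = 18.3                    # Google's total 2021 electricity use
ai_share_low, ai_share_high = 0.10, 0.15   # researchers' estimated AI share

low = google_total_twh * ai_share_low      # ~1.8 TWh
high = google_total_twh * ai_share_high    # ~2.7 TWh
print(f"AI electricity: {low:.1f} to {high:.1f} TWh/year")
# The quoted ~2.3 TWh sits near the middle of that range.
```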

Yet the sector is growing so fast — and has such limited transparency — that no one knows exactly how much total electricity use and carbon emissions can be attributed to AI.

AI developers, including OpenAI, whose latest ChatGPT model has just hit the market, use cloud computing that relies on thousands of chips inside servers in massive data centers to train AI algorithms and analyze data, helping the models “learn” to perform tasks.

Emissions vary, of course, depending on what type of power is used to run those data centers. A data center that draws its electricity from a coal- or natural gas-fired plant will be responsible for much higher emissions than one that runs on solar, wind or hydro.

The point is that no one really knows — and the major cloud providers are not playing ball. The problem is not unique to AI: data centers are a black box compared with the more transparent carbon footprint accounting now being reported by the rest of the Media & Entertainment industry.

According to Bloomberg, while researchers have tallied the emissions from the creation of a single model, and some companies have provided data about their energy use, they don’t have an overall estimate for the total amount of power the technology uses.

What limited information is available has been used by researchers to estimate the CO2 emissions attributable to AI — and the results are alarming.

Training OpenAI’s GPT-3 took 1.287 gigawatt hours, according to a research paper published in 2021, or about as much electricity as 120 US homes consume in a year. That training generated 502 tons of carbon emissions, according to the same paper, or about as much CO2 as 110 US cars emit in a year.
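Those comparisons are easy to verify. A quick sketch, where the training figures come from the article but the per-home and per-car baselines are assumed typical US averages, not numbers the paper supplies:

```python
# Checking the paper's comparisons. Training figures are from the article;
# the household and per-car baselines are assumed typical US averages.
training_gwh = 1.287       # GPT-3 training energy
training_co2_tons = 502    # GPT-3 training emissions (metric tons)

mwh_per_home = training_gwh * 1000 / 120   # ~10.7 MWh/year per home
tons_per_car = training_co2_tons / 110     # ~4.6 tons CO2/year per car

print(f"Implied household use: {mwh_per_home:.1f} MWh/year")    # vs ~10.6 MWh US average
print(f"Implied per-car emissions: {tons_per_car:.1f} t/year")  # vs ~4.6 t for a typical US car
```

Both implied baselines match the usual US averages, so the comparisons check out.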

While training a model has a huge upfront power cost, researchers found that in some cases it amounts to only about 40% of the power burned by actually using the model, as billions of requests pour in for popular programs.

Plus, the models are getting bigger. OpenAI’s GPT-3 uses 175 billion parameters, or variables, set through its training and retraining. Its predecessor, GPT-2, used just 1.5 billion, a more than hundredfold jump, and GPT-4 is expected to be many times bigger again, with a knock-on cost in compute power.

The situation is analogous to the early days of cryptocurrency, when Bitcoin in particular was hammered for the huge carbon cost of mining.

That negative publicity led to changes in crypto mining operations – and the same pressure could be applied to AI developers and the cloud providers that service them.

We may also conclude that using large AI models for “researching cancer cures or preserving indigenous languages is worth the electricity and emissions, but writing rejected Seinfeld scripts or finding Waldo is not,” Bloomberg suggests.

But we don’t have the information to judge this.

So where does this sit with the net-zero carbon pledges of the major cloud providers like Microsoft, Amazon and Google?

Responding to Bloomberg’s inquiry, an OpenAI spokesperson said: “We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to make the best use of our computing power. OpenAI runs on Azure, and we work closely with Microsoft’s team to improve efficiency and our footprint to run large language models.”

It’s bland rhetoric, with no detail on what the costs to the earth are now or exactly what efforts the company is taking to reduce them.

Google’s response was similar and Microsoft highlighted its investment into research “to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application.”

Ben Hertz-Shargel of energy consultancy Wood Mackenzie suggests that developers and data centers could schedule AI training for times when power is cheaper or in surplus, making their operations greener.
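That idea is usually called carbon-aware scheduling. Here is a minimal sketch of the logic, assuming a hypothetical hourly grid carbon-intensity forecast; the numbers are invented for illustration, and a real scheduler would pull its forecast from a grid-data service:

```python
# Minimal sketch of carbon-aware scheduling: pick the contiguous window with
# the lowest total forecast grid carbon intensity for a training job.
# The forecast values below are made up for illustration.

def best_start_hour(forecast_g_per_kwh, job_hours):
    """Return the start index of the job_hours-long window with the
    lowest total forecast carbon intensity (gCO2/kWh)."""
    windows = [
        sum(forecast_g_per_kwh[i:i + job_hours])
        for i in range(len(forecast_g_per_kwh) - job_hours + 1)
    ]
    return min(range(len(windows)), key=windows.__getitem__)

# Hypothetical 24-hour forecast: intensity dips overnight and at the solar peak.
forecast = [450, 430, 400, 380, 360, 350, 340, 360, 390, 320, 260, 220,
            200, 210, 240, 300, 380, 460, 500, 520, 510, 490, 470, 460]

start = best_start_hour(forecast, job_hours=6)
print(f"Start the 6-hour training run at hour {start}")  # hour 10, the solar peak
```

A production system would also weigh deadlines, electricity prices and data-center location, but the core of Hertz-Shargel’s suggestion is this simple minimization.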

The article identifies the computing chips used in AI as “one of the bigger mysteries” in completing the carbon counting puzzle. NVIDIA, the biggest manufacturer of GPUs, defended its record to Bloomberg.

“Using GPUs to accelerate AI is dramatically faster and more efficient than CPUs — typically 20x more energy efficient for certain AI workloads, and up to 300x more efficient for the large language models that are essential for generative AI,” the company said in a statement.

While NVIDIA has disclosed its direct emissions and the indirect ones related to energy, according to this report it hasn’t revealed all of the emissions it is indirectly responsible for. NVIDIA is not alone in failing to account for Scope 3 greenhouse gas emissions, which cover all the other indirect emissions that occur in an organization’s upstream and downstream activities.

When NVIDIA does share that information, researchers expect it will show that its GPUs burn as much power as a small country.

 

