Is artificial intelligence sustainable? Five ways to reduce your carbon footprint

Artificial intelligence has been tackling difficult environmental problems for years. Its data-processing superpowers make it ideal for everything from ocean monitoring to climate change prediction. But training AI models requires huge amounts of energy, so do the benefits outweigh the environmental cost? In short, is AI sustainable?

Sustainable AI: Fact or Fiction?

It is no secret that the world needs to take swift and decisive action on greenhouse gas (GHG) emissions if we are to avoid catastrophic climate change. And it’s easy to find research praising the advantages of AI in achieving this. The business consultancy BCG, for example, estimates that AI could reduce emissions by 5% to 10% by 2030.

But it’s also easy to find articles comparing the carbon footprint of training an AI model to 125 round trips between New York and Beijing or, as one 2019 paper put it, to the lifetime carbon footprint of five cars. So what is the truth? Is artificial intelligence a hero or a villain?

While such polarizing narratives make headlines, the reality, as usual, is more nuanced. Artificial intelligence can deliver environmental benefits, but the outcome is a balance between energy used and energy saved. So what can be done to maximize the benefits of AI without inflating the environmental costs?

Choose renewable energy

According to research published by nature.com, “Using renewable grids to train neural networks is the single biggest change you can make. It can make emissions vary by a factor of 40 between an all-renewable grid and an all-coal grid.”
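The factor-of-40 claim follows from simple arithmetic: training emissions scale linearly with the carbon intensity of the grid supplying the power. Here is a minimal sketch of that calculation; the energy figure and the grid intensity values are illustrative assumptions (rough lifecycle estimates), not measurements from the paper.

```python
# Sketch: training emissions = energy consumed x grid carbon intensity.
# All numbers below are illustrative assumptions, not measured values.

def training_emissions_kg(energy_kwh: float, grid_g_co2_per_kwh: float) -> float:
    """Estimate CO2-equivalent emissions (kg) for a training run."""
    return energy_kwh * grid_g_co2_per_kwh / 1000.0

ENERGY_KWH = 100_000  # hypothetical training run

coal = training_emissions_kg(ENERGY_KWH, 820.0)  # coal-heavy grid (~820 gCO2/kWh)
wind = training_emissions_kg(ENERGY_KWH, 20.0)   # mostly-renewable grid (~20 gCO2/kWh)

print(f"coal grid: {coal:,.0f} kg CO2e")        # 82,000 kg
print(f"renewable grid: {wind:,.0f} kg CO2e")   # 2,000 kg
print(f"ratio: {coal / wind:.0f}x")             # 41x, in line with the factor-of-40 claim
```

The same run emits roughly 40 times more CO2 on a coal-heavy grid, which is why grid choice is described as the single biggest lever.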

Renewable energy is one of the world’s primary decarbonization strategies, but whether it is available to you depends largely on where you live and which suppliers you can choose from. The fact remains that most low-carbon sources of electricity – such as solar or wind power – are variable: grid operators cannot turn them on and off as needed.

Grid digitization can help with load balancing and demand management, while energy storage can smooth short-term changes in energy availability. AI itself can help maximize distribution efficiency and drive predictive maintenance to avoid downtime. Ultimately, however, a significant increase in storage capacity is needed if renewable energy is to become available to all.

Distribute workloads effectively

Is it better to run AI in the cloud or at the endpoint? Surprise, surprise… the only correct answer is: it depends. Moving workloads from the cloud to the endpoint can reduce the cost of moving data, but for some workloads the cloud is a must. The good news is that companies like Cloudflare are working to reduce the carbon footprint of cloud computing.

Cloudflare’s mission is to build an internet that is secure, efficient, reliable, and consumes less power. More than 25 million websites run on its global network, which stretches across more than 250 cities in more than 100 countries. Its eleventh-generation servers, powered by Arm Neoverse-based CPUs, process 57% more internet requests per watt than previous-generation servers based on traditional CPU architectures.
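It is worth unpacking what a performance-per-watt gain like that means in energy terms. The quick arithmetic below shows that 57% more requests per watt implies roughly 36% less energy per request at equal load; the baseline value is a stand-in, since only the ratio matters.

```python
# 57% more requests per watt => each request costs 1/1.57 of the old energy.
# The baseline of 1.0 is an arbitrary stand-in; only the ratio matters.
old_req_per_watt = 1.0
new_req_per_watt = old_req_per_watt * 1.57

energy_per_request_ratio = old_req_per_watt / new_req_per_watt  # ~0.637

print(f"energy per request vs previous gen: {energy_per_request_ratio:.1%}")
# -> energy per request vs previous gen: 63.7%
```

In other words, efficiency gains at the server level translate directly into emissions cuts for every workload the network serves.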

Consider embedded emissions

Embedded emissions are simply the greenhouse gases generated in the production of an asset. The embedded carbon of AI can be traced all the way from the hardware to the algorithm, but in the case of Arm, it means the engineering workflows required to develop our intellectual property (IP).

These workflows consume billions of computing hours annually and, of course, require a great deal of energy to run. The challenge is to increase workflow efficiency, reducing time and energy consumption while achieving results of similar or higher quality.

Here’s an interesting thing: we can use AI to reduce AI’s embedded carbon by streamlining processes and spending computing hours more efficiently. How? Well, engineers may choose to use a “good enough” calculation: reducing workloads to just enough cycles to get the job done accurately, without wasting energy and resources. By running complete test suites at milestones, for example, while reducing the number of tests run between these points, it is possible to cut computing hours and conserve energy without compromising accuracy or quality.

Maximize performance per watt

As AI becomes more ubiquitous, a continued focus on efficiency will become essential to reduce its environmental impact. Performance per watt will become the new measure of success.

But to stop climate change in its tracks, keeping power and energy numbers stable is not enough. We need to take a carbon-first approach, treating carbon as a vital statistic alongside power, performance, and area.

By actively looking for new ways to reduce energy consumption, we can help AI stay on the right side of history as part of the solution to climate change and a more sustainable future.

Think! Do you need artificial intelligence?

Perhaps one of the most important questions to consider is: do you actually need AI? Sure, it’s nice to have your coffee machine recognize your face and brew your morning cup of joe accordingly. But if we really want to avoid dangerous levels of global warming, we’re going to have to take a long, hard look at what we consider essential, and work to reduce or eliminate unnecessary workloads. If you can just as easily tap through to your coffee order and save some energy, why complicate things?

Of course, there are higher-stakes workloads than AI coffee machines, but the principle applies across the board. We can no longer afford to waste our resources; we need to make sure that the benefits outweigh the costs. And if that means saying goodbye to AI-brewed coffee, then so be it.
