Forget about adding bunny ears to your selfie; AI has long since moved on and is now tackling complex environmental problems. Its massive data-aggregation capabilities make it ideal for everything from ocean monitoring to climate change prediction. But training AI models requires a lot of energy, so do the benefits outweigh the environmental costs? In short, is AI sustainable?
Sustainable AI: fact or fiction?
It’s no secret that the world needs to take immediate and decisive action on greenhouse gas (GHG) emissions if it is to avoid catastrophic climate change. And it’s easy to find research extolling the virtues of AI in achieving that. Business consultants BCG, for example, estimate that AI could reduce emissions by 5% to 10% by 2030.
But it’s also easy to find dozens of articles comparing the carbon footprint of training AI models to 125 New York-Beijing round-trip flights or, to quote one 2019 research paper, the carbon footprint of five cars over their lifetimes. So, what is the truth? Is AI a hero or a villain?
While such polarized narratives make big headlines, as with many things, the reality is more nuanced. AI can have environmental benefits, but it’s a balancing act between energy used and energy saved. So, what can be done to maximize the benefits of AI without increasing the environmental costs?
Choose renewable energy
According to a study published in Nature, training neural networks on renewable energy grids is one big change that can be made: emissions can differ by a factor of 40 between a fully renewable grid and a fully coal-powered grid.
Renewable energy is one of the world’s most important strategies to reduce carbon emissions, but whether or not it’s available to you depends largely on where you live and which suppliers you choose. And the fact remains that many low-carbon energy sources – such as solar or wind power – are variable. Grid operators cannot turn them on and off as needed.
Digitization of the grid can help with load balancing and demand management, while energy storage can deal with temporary variations in energy availability. And AI itself can help increase distribution efficiency and drive predictive maintenance to avoid downtime. But ultimately, a large increase in storage is needed if renewable energy is to be available to all.
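To make that factor-of-40 difference concrete, here is a minimal sketch of the arithmetic. The training energy figure and grid carbon intensities below are illustrative round numbers, not measurements of any particular model or grid:

```python
# Illustrative sketch: emissions of one training run under different grids.
# The energy figure and intensities are assumed round numbers, not data
# from any specific model, paper, or grid operator.

TRAINING_ENERGY_KWH = 100_000  # hypothetical energy for one training run

# Approximate grid carbon intensities in kg CO2e per kWh (assumed values)
GRID_INTENSITY = {
    "coal-heavy": 1.0,
    "mixed": 0.4,
    "mostly-renewable": 0.025,  # ~40x cleaner than the coal-heavy grid
}

def training_emissions_kg(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """CO2e (kg) emitted by a run consuming energy_kwh on a given grid."""
    return energy_kwh * intensity_kg_per_kwh

for grid, intensity in GRID_INTENSITY.items():
    tonnes = training_emissions_kg(TRAINING_ENERGY_KWH, intensity) / 1000
    print(f"{grid:>16}: {tonnes:.1f} t CO2e")
```

The point of the sketch is that the code you run matters far less to emissions than the grid you run it on: the same workload, unchanged, produces a fortieth of the carbon on the cleanest grid in the table.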
Assign workloads effectively
Is it better to do AI in the cloud or on-premises? Surprise, surprise… the situation is dynamic and the only correct answer is: it depends. Shifting workloads from the cloud to on-premises can reduce data transfer costs, but for some workloads the cloud is essential. The good news is, however, that work is being done to reduce the carbon footprint of cloud computing by companies like Cloudflare.
Cloudflare’s mission is to help build a safer, more reliable, and more energy-efficient Internet. Its global network spans more than 250 cities in over 100 countries and serves more than 25 million Internet properties. Its 11th-generation servers, powered by Arm Neoverse-based CPUs, process an impressive 57% more Internet requests per watt than previous-generation servers based on conventional CPU architectures.
Consider embedded emissions
Embedded emissions simply refer to the GHGs produced in making a product. The embedded carbon footprint of AI can be traced right down the chain from device to algorithm, but in Arm’s case, it means the engineering workflow required to develop our intellectual property (IP).
This workflow consumes billions of compute hours per year and, of course, requires a significant amount of energy to run. The challenge is to increase operational efficiency – reducing the time and energy used while achieving results of the same, or higher, quality.
And here’s the interesting thing: we can use AI to reduce the embedded carbon footprint of AI – streamlining processes and using compute hours more efficiently. How? One approach is ‘good enough’ computing: right-sizing workloads to use just enough compute cycles to get the job done accurately, without wasting energy and resources. By running complete test suites at important milestones, for example, but reducing the number of tests performed between those points, it is possible to cut compute hours and save energy without compromising accuracy or quality.
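The milestone-based testing idea above can be sketched in a few lines. This is a hypothetical illustration of the principle, not Arm’s actual workflow; the suite sizes and milestone interval are invented:

```python
# Hypothetical sketch of "good enough" computing applied to regression
# testing: run the full suite only at milestone commits, and a small
# smoke-test subset everywhere else. Numbers are illustrative.

FULL_SUITE = [f"test_{i}" for i in range(1000)]  # complete test suite
SMOKE_SUITE = FULL_SUITE[:50]                    # small representative subset

def select_tests(commit_number: int, milestone_every: int = 100) -> list:
    """Full suite at milestone commits, smoke suite in between."""
    if commit_number % milestone_every == 0:
        return FULL_SUITE
    return SMOKE_SUITE

# Compute hours saved over 1,000 commits, assuming each test costs the same.
total = sum(len(select_tests(c)) for c in range(1, 1001))
naive = 1000 * len(FULL_SUITE)
print(f"tests run: {total} vs {naive} ({100 * (1 - total / naive):.0f}% fewer)")
```

Under these assumed numbers, roughly 94% of test executions are avoided; the trade-off, of course, is that a regression introduced between milestones may only surface at the next full run.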
Maximize performance per watt
As AI becomes ubiquitous, a constant focus on efficiency will be critical to reducing its environmental impact. Performance per watt will be the new measure of success.
But to stop climate change in its tracks, keeping performance-per-watt numbers strong is not enough. We need to take a carbon-first approach, treating carbon as a key metric alongside power, performance and area.
By actively investigating new ways to tighten the power envelope, we can help AI stay on the right side of history as part of the climate solution and a sustainable future.
Think! Does it need AI?
Perhaps one of the most important questions to ask is: does it really need AI? Of course, it’s great that your coffee machine recognizes your face and makes your morning cup of joe just right. But if we really want to avoid dangerous levels of global warming, we’ll need to take a long-term view, take a hard look at what we consider important – and work to reduce or eliminate non-essential workloads. If you can just as easily tap in your coffee order and save some energy, why complicate things?
Of course, there are harder calls to make than AI coffee machines, but the principle applies across the board. We can no longer squander our resources; we need to ensure that the benefits outweigh the costs. And if that means bye-bye AI coffee, so be it.