The environmental responsibility of AI

6th November 2024

By Ben Selier, Vice President, Secure Power, Anglophone Africa at Schneider Electric

The growth and proliferation of AI show no signs of slowing down. According to IDC, worldwide spending on AI systems will approach $98-billion in 2024. It's well known that AI is data-driven and highly dependent on computing power, and the complexity of machine learning (ML) or deep learning models requires substantial computational resources.

Indeed, given the significant energy requirements of modern hardware, this translates into extremely high power consumption.

Most AI research today focuses on achieving the highest levels of accuracy, with little attention to computational or energy efficiency. Leaderboards in the AI community track which system performs best on tasks like image recognition or language comprehension, prioritising accuracy above all else. 

Deep learning, based on neural networks with billions of parameters, is inherently compute-intensive. The more complex the network, the greater the need for high-performance computational power and extended training times. 

Canadian researchers Victor Schmidt et al. report that state-of-the-art neural architectures are often trained on multiple GPUs for weeks or months at a time to surpass previous achievements.

The cost of AI

AI is costly; research by OpenAI researchers Dario Amodei and Danny Hernandez shows that since 2012, the computing power used for deep learning research has doubled every 3.4 months. This equates to a 300,000-fold increase from 2012 to 2018, far exceeding Moore’s Law, which states that processing power doubles every two years. 
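To see how those two rates compare, the short Python sketch below simply compounds each doubling period over the same window. The window of roughly 62 months is an assumption chosen here because it reproduces the quoted 300,000-fold figure; the study's exact start and end dates may differ.

# Compare the quoted growth rates over an assumed window of about 62 months.
months = 62
deep_learning_doubling = 3.4   # months per doubling (Amodei and Hernandez)
moores_law_doubling = 24.0     # months per doubling (Moore's Law)

dl_growth = 2 ** (months / deep_learning_doubling)
moore_growth = 2 ** (months / moores_law_doubling)

print(f"Deep-learning compute growth: ~{dl_growth:,.0f}x")              # roughly 300,000x
print(f"Moore's Law growth over the same period: ~{moore_growth:.0f}x")  # roughly 6x

Over the same window, Moore's Law alone would account for only about a six-fold increase, which is the gap the researchers highlight.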

As AI usage grows, especially with consumer applications like ChatGPT, energy consumption escalates further.

There is good news, however: as the world focuses on climate change, AI researchers are also beginning to recognise the technology's carbon cost. A study by Roy Schwartz et al. at the Allen Institute for AI questions whether efficiency, along with accuracy, should become a priority. AI models require vast amounts of computational power for training, data processing and experimentation, which drives up carbon emissions.

Similarly, researchers at the University of Massachusetts (Strubell et al., 2019) highlighted the environmental impact of AI, analysing the computational demands of neural architecture search for machine translation.

That research, published five years ago already, put the carbon cost of training such models at 626,155 lbs of CO₂, equivalent to 125 round-trip flights from New York to Beijing. As AI's energy demands continue to grow, it's vital to consider sustainability alongside utility.
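The flight comparison is straightforward division: using the figures quoted above, the sketch below implies roughly 5,000 lbs of CO₂ per round trip, broadly consistent with per-passenger estimates for a New York to Beijing flight.

# Illustrative check of the flight equivalence quoted above.
training_co2_lbs = 626_155   # estimated CO2 for the training runs analysed in the study
round_trips = 125            # New York to Beijing round-trip flights
print(f"~{training_co2_lbs / round_trips:,.0f} lbs of CO2 per round trip")  # ~5,009 lbs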

AI – the good news

Fortunately, AI can assist in our global quest to drive down greenhouse gas (GHG) emissions. A 2019 study by Microsoft and PwC predicted that responsible use of AI could reduce global GHG emissions by 4% (2.4 gigatonnes) by 2030.

AI is already being used to optimise energy consumption in industrial and residential sectors, forecast supply and demand, manage autonomous transportation, and reduce carbon footprints. For example, Google has improved the energy efficiency of its data centres by 35% using ML technology developed by DeepMind.

AI is also helping to minimise waste in green energy production, predicting the output of solar, wind, and hydro energy, and optimising water usage in residential, agricultural, and manufacturing areas. 

Furthermore, algorithms have improved agricultural processes, such as precision farming, ensuring that crops are picked at the right time and water is used efficiently.

AI’s environmental responsibility 

According to the Shift Project, the ICT sector accounts for around 4% of global carbon emissions, a contribution to GHG emissions that surpasses that of the aviation industry by 60%.

Furthermore, as more businesses adopt AI to drive innovation, the demand for cloud-optimised data centre facilities will rise. By 2025, data centres will account for 33% of global ICT electricity consumption.

To minimise their carbon footprint, companies must ensure their data centres are equipped to handle high-density compute demands efficiently. Unfortunately, up to 61% of systems run by corporate data centres operate at low efficiency, according to research published on ScienceDirect.

Additionally, it's crucial that data centres are powered by renewable energy. If AI workloads are housed in fossil-fuel-powered facilities, energy-efficiency gains can be negated, which is why it's important that companies verify their cloud provider's green credentials.

Location is another factor in ensuring sustainable AI. Cooling data centres is expensive, especially in warmer climates, and more than 80% of hardware does not need to be near the end user in terms of latency. 

Tech giants like Google, for example, are investing in data centres in Nordic countries for better energy efficiency. In countries such as Iceland, natural cooling reduces energy usage, while renewable geothermal and hydroelectric power ensures cleaner operations.

The future

The future of AI must focus on sustainability. The World Economic Forum (WEF) suggests a four-step process to balance AI's benefits with its environmental impact:

1. Select the right use case: Not all AI optimisations lead to significant carbon reductions. Organisations should prioritise processes that can be meaningfully optimised by AI, especially for sustainability use cases.

2. Choose the right algorithm: The energy consumption of an AI system depends largely on the algorithm used. By selecting the most efficient algorithm, organisations can significantly reduce training time and energy usage.

3. Predict and track carbon outcomes: Good intentions alone aren’t enough. AI implementers must include carbon footprint estimates in cost-benefit analyses and use sustainability as a key performance indicator for AI projects (a simple estimation sketch follows this list).

4. Offset the footprint with renewable energy: Organisations must utilise green energy sources to power AI models. Google, for example, has committed to powering its data centres with renewable energy, matching 100% of its electricity consumption with renewable purchases since 2017.
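As a concrete illustration of step 3, the Python sketch below turns a training run's energy use into a carbon estimate that can sit in a cost-benefit analysis. The formula used here (IT energy scaled by the facility's power usage effectiveness and the grid's carbon intensity) is a common estimation approach rather than part of the WEF guidance, and every input value is an assumed placeholder.

def training_emissions_kg(gpu_count, avg_gpu_power_kw, hours, pue, grid_kg_co2_per_kwh):
    """Estimate emissions as IT energy x facility overhead (PUE) x grid carbon intensity."""
    it_energy_kwh = gpu_count * avg_gpu_power_kw * hours
    facility_energy_kwh = it_energy_kwh * pue
    return facility_energy_kwh * grid_kg_co2_per_kwh

# Example with assumed values: 8 GPUs drawing 0.3 kW each for two weeks,
# in a facility with a PUE of 1.5, on a grid emitting 0.5 kg CO2 per kWh.
estimate = training_emissions_kg(gpu_count=8,
                                 avg_gpu_power_kw=0.3,
                                 hours=24 * 14,
                                 pue=1.5,
                                 grid_kg_co2_per_kwh=0.5)
print(f"Estimated training footprint: ~{estimate:,.0f} kg of CO2")

In practice, an organisation would substitute metered energy data and its provider's reported PUE and emission factors, and track the resulting figure as the sustainability key performance indicator suggested in step 3.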

 

Edited by Creamer Media Reporter
