The rise of AI technologies holds enormous potential, but we must not overlook the environmental footprint associated with their energy consumption. From an ESG (environmental, social, governance) perspective, growing commitments to sustainability make it increasingly important to examine issues such as the energy consumed by AI solutions, how systems are operated and optimised, and the emerging trend of ‘Green AI’, which aims to reduce the energy requirements of artificial intelligence.

Overview

The best-known AI applications and areas include: training large machine learning models; inference, such as chatbots, image recognition and recommendation systems; edge and IoT devices; natural language processing (NLP) and translation tools; autonomous vehicles and robotics; healthcare; finance and trading; and research and development. The energy consumption of AI applications is a growing concern due to the exponential increase in the scale of AI models and the prevalence of AI in daily life. Large-scale AI models are the most energy-intensive, but even smaller applications can cumulatively contribute to significant energy use. Industry leaders are increasingly focusing on making AI more energy-efficient by improving hardware, algorithms and the use of renewable energy sources to mitigate environmental impacts.

Understanding the energy consumption of AI

The energy consumption profile of AI solutions is complex and multifaceted, spanning different points in the service chain. It also requires a good deal of estimation, as AI companies have become much more secretive over the years, partly due to competition, but possibly also as an attempt to deflect criticism. Let’s start with model training, as this is one of the largest parts of AI energy consumption: ballooning model sizes (usually measured in the number of parameters) result in increased computational power requirements, and therefore energy consumption.
For example, OpenAI’s GPT-3 – with more than 175 billion parameters – consumed an estimated 1,287 megawatt hours (MWh) of energy during training, while training GPT-4 – with an estimated 1.76 trillion parameters, a figure supported by multiple external sources – may have consumed up to 62,000 MWh, enough electricity to power 1,000 average US homes for 5 to 6 years. Energy costs also start to add up with each inference (where a query is passed through the model’s parameters to produce an output) once a model is rolled out to consumers. Depending on, among other things, how long the model is in use and how many queries it receives per day, the bulk of an AI model’s energy usage can come not from training but from use. Some estimates assume 2.89 watt-hours per ChatGPT query – about 10 times more energy than a traditional Google search query. If every one of Google’s 9 billion daily search requests were run through ChatGPT at this average energy consumption, inference would exceed the estimated energy used to train GPT-4 in just under 2.5 days. For another perspective, we can look at cloud computing services such as Amazon Web Services (AWS) and Microsoft Azure, which play a role in both the training and the inference of AI models. Amazon, whose AWS accounts for about half of the world’s public-cloud infrastructure business, has always been cagey about its energy consumption, but the International Energy Agency estimated that it consumed 30.9 TWh across its operations in 2021. The AI push is also putting pressure on Microsoft’s emissions targets: its electricity consumption more than doubled (from 11 TWh to 24 TWh) in just four years, while total carbon emissions rose by a comparatively smaller 42%, indicating a growing share of renewable energy sources. These trends coincide with Microsoft Azure’s use to train and run AI models, OpenAI’s ChatGPT being the most prominent of these.
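A quick back-of-envelope calculation makes the inference-versus-training comparison concrete. The sketch below uses the article’s own estimates (2.89 Wh per query, 9 billion daily searches, the 62,000 MWh upper estimate for GPT-4 training); these are published estimates, not measurements.

```python
# All values are the article's estimates, not measured data.
WH_PER_QUERY = 2.89          # estimated Wh per ChatGPT query
QUERIES_PER_DAY = 9e9        # Google's approximate daily search volume
GPT4_TRAINING_MWH = 62_000   # upper estimate for GPT-4 training energy

# Convert daily inference energy from Wh to MWh (1 MWh = 1e6 Wh)
daily_inference_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6

# How many days of inference would match the training estimate?
days_to_match_training = GPT4_TRAINING_MWH / daily_inference_mwh

print(f"Inference energy per day: {daily_inference_mwh:,.0f} MWh")
print(f"Days to match GPT-4 training estimate: {days_to_match_training:.2f}")
```

Running this yields roughly 26,000 MWh of inference energy per day, so the training estimate is indeed overtaken in just under 2.5 days.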
The increasing amount of data being generated and processed globally also requires data transmission systems. It should come as no surprise that AI-powered services generate huge amounts of data traffic, which is then sent across networks and to user devices, adding to the environmental footprint. Overall, data centres and data transmission networks each account for 1-1.5% of global electricity use, resulting in 330 Mt CO2e emissions in 2020. (This excludes energy used for cryptocurrency mining, which was estimated to account for 0.4% of annual global electricity demand in 2022.)

Green AI: The path to sustainable artificial intelligence

The emerging Green AI movement focuses on reducing the energy requirements and overall environmental footprint of AI systems. Some research and development efforts have turned towards streamlining AI models to reduce their computational requirements – and thus their operational costs – without significantly compromising their performance, with models like OpenAI’s GPT-4o mini, Google’s Gemini 1.5 Flash-8B and Anthropic’s Claude 3 Haiku already on the market. Besides a more efficient model architecture, better processors and a greener data centre can also help reduce the carbon footprint – by 100 to 1,000 times, according to a study by Google. By 2020, Google had reached the milestone of purchasing enough renewable energy to match 100% of the electricity used by its global operations, including its data centres and offices, and it has set a target to run on carbon-free energy (CFE) 24/7 on every power grid by 2030. It has also introduced several versions of its Tensor Processing Unit (TPU), which have been able to reduce energy consumption in AI training by 30-50% compared to high-end NVIDIA GPUs, kicking off a race to develop more energy-efficient chips tailored for AI development.
Meta has also launched its own initiatives, looking at optimising cooling systems in its data centres and fine-tuning LLM datasets to use less energy for training. These ‘Green in AI’ initiatives are not the only form of Green AI. ‘Green by AI’ initiatives focus on developing AI solutions for eco-friendly practices in other areas, such as mobility or agriculture. Tools for accurately measuring and optimising energy consumption are also part of the movement.

How can businesses assess their AI energy demand?

To comply with ESG guidelines, companies need to take a conscious approach to managing the energy consumption of all their digital systems, including but not limited to AI. The first step is to measure and assess the amount of energy being used. Various tools are already available to help track and monitor the carbon emissions of digital systems. One of these is the Carbon.Crane solution, which helps companies measure and optimise the carbon footprint of their digital infrastructure, starting with their websites and email campaigns – which, for companies with significant online customer traffic and mass email communications, can add up to hundreds of tonnes of CO2e per year. Optimisation can reduce this by up to 50-80%, sparing the environment a significant carbon footprint and saving energy costs for digitally sustainable businesses. Once organisations understand their energy consumption, they can move on to the next step: optimising the use of their AI systems. Few companies are large enough to train their own AI models, but every company can express a commitment by choosing more energy-efficient architectures and models that still suit their needs.
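As a minimal illustration of the measurement step described above, the sketch below converts a workload’s average power draw into an estimated carbon footprint. The server wattage and grid-intensity figures are illustrative assumptions only; a real assessment would use metered data and regional emission factors.

```python
def estimate_co2e_kg(avg_power_watts: float, hours: float,
                     grid_intensity_g_per_kwh: float) -> float:
    """Convert a workload's average power draw into estimated kg CO2e.

    grid_intensity is in grams CO2e per kWh; it varies widely by region,
    from roughly 20 g on hydro-heavy grids to 700+ g on coal-heavy ones.
    """
    kwh = avg_power_watts * hours / 1000       # W * h -> kWh
    return kwh * grid_intensity_g_per_kwh / 1000  # grams -> kg

# Illustrative numbers only: a 400 W inference server running for a
# 30-day month on a 300 g CO2e/kWh grid.
monthly_kg = estimate_co2e_kg(400, 24 * 30, 300)
print(f"{monthly_kg:.1f} kg CO2e per month")
```

Even this toy calculation shows why mass-scale digital services add up: multiply one server’s footprint by a fleet and the totals quickly reach tonnes per year.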
Selecting the ‘greenest’ possible cloud services, looking for edge computing solutions that allow AI processing to be done closer to the source of data, processing only the data necessary for your business objectives, and scheduling tasks to run at times when renewable energy is more available are also good steps to take.

What can companies do to supply their AI applications with renewable energy?

Companies looking to supply their AI applications with renewable energy can adopt several interconnected strategies that significantly reduce their environmental impact. One effective approach is to generate renewable energy on-site by installing solar panels or wind turbines. These systems can provide a direct and reliable source of clean energy for powering AI-driven operations, such as data centres or edge devices. To ensure a consistent energy supply, companies can pair these installations with energy storage systems, allowing them to store excess power during peak solar or wind periods and use it later when renewable generation is low. Another important strategy is to collaborate with green data centres. Many AI applications are hosted in cloud-based or third-party data centres, and companies can choose providers like AWS, Google Cloud or Microsoft Azure, which are increasingly powered by renewable energy sources such as wind, solar or hydropower. For example, Google Cloud has committed to running its data centres on 100% carbon-free energy by 2030, ensuring that AI workloads are sustainably powered. In addition to using renewable energy, optimising AI workloads for energy efficiency can also reduce the overall electricity needed. Techniques such as model compression, pruning and transfer learning can make AI models more efficient, cutting down energy consumption during both training and deployment.
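To make one of the compression techniques mentioned above tangible, here is a toy sketch of symmetric 8-bit quantization in pure Python: storing 4-byte floats as 1-byte integers plus a single scale factor cuts storage by roughly 75% at a small accuracy cost. Real deployments would use framework tooling rather than hand-rolled code; this is only a sketch of the idea.

```python
import random

def quantize_int8(weights):
    """Map float weights to int8 values plus a scale factor
    (symmetric quantization: q = round(w / scale), scale = max|w| / 127)."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(1000)]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding error is bounded by half the quantization step (scale / 2).
max_err = max(abs(w - r) for w, r in zip(weights, restored))
print(f"max quantization error: {max_err:.4f}")
```

Smaller stored weights mean less memory traffic per inference, which is one of the ways compression translates into lower energy use.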
Companies can also schedule energy-intensive tasks, such as AI training, during periods when renewable energy supply is highest, such as when solar generation peaks during the day.

Strategic considerations to mitigate environmental risks

To mitigate AI energy consumption, various strategies can be adopted, leveraging both hardware and software innovations, as well as broader infrastructural and strategic measures. InnoEnergy, along with its portfolio members and strategic partners, plays a pivotal role in reducing the business risks of applying AI sustainably, contributing to both energy efficiency and sustainability in AI deployment. InnoEnergy’s approach revolves around its strong network of strategic partnerships, investments in cutting-edge technology and commitment to sustainability. Hardware optimisations, algorithmic efficiency improvements, the transition to green data centres and the promotion of transfer learning are key strategies that can reduce the energy consumption of AI applications. By leveraging its investment portfolio and collaborative ecosystem, InnoEnergy not only reduces the carbon footprint of AI but also helps businesses manage the financial and operational risks associated with sustainable AI adoption. This comprehensive approach ensures that AI can be applied in a way that is both innovative and environmentally responsible.

Final thoughts

The energy requirements of AI are becoming an increasingly important issue in the tech sector, especially when examined through the lens of sustainability and ESG expectations. As AI becomes more integrated into both everyday life and business practices, it is important to focus on energy-efficient solutions, Green AI initiatives and sustainable operations. Research and development in this area is already underway, but companies need to be actively involved in measuring and optimising AI energy use to ensure that technological progress can be achieved in an environmentally responsible way.
About the authors

Ákos Dervalics is the Managing Partner of Green Brother, Head of EIT InnoEnergy HUB Hungary for the past 8 years, and an angel investor in innovative businesses with environmental benefits. He co-wrote the article, which originally appeared on Portfolio.hu, with József Bodnár, our co-founder and CEO.