Saturday, February 22, 2025

Enterprises Must Enhance Decision-Making to Lower GenAI Emissions

Generative artificial intelligence (GenAI) is shaping our world, but it’s also leaving a heavy environmental footprint. According to a report from Capgemini Research Institute, the energy used to train the latest generative pre-trained transformer (GPT) models is comparable to the annual energy needs of 5,000 homes in the U.S. And that’s just the start. Once these models are deployed, running them in a business setting consumes as much energy again, if not more.

To put it into perspective, just one query to a large language model (LLM) uses roughly ten times the electricity of a simple Google search. That’s a staggering thought when you consider that adoption of GenAI has skyrocketed among organizations, from 6% to 24% in just a year. By 2026, GenAI could account for nearly 5% of an organization’s total greenhouse gas emissions, up from just 2.6% today.

Furthermore, every time an LLM processes 20 to 50 queries, it consumes about 500 ml of water. The problem doesn’t stop there: GenAI could generate between 1.2 and 5 million metric tons of e-waste by 2030, roughly 1,000 times current levels. Vincent Charpiot from Capgemini sounded the alarm: energy consumption from GenAI is on track to nearly double its share of corporate carbon footprints within just two years. Companies need to think bigger about sustainability in their AI strategies. By opting for smaller models, using renewable energy, and working with transparent vendors, businesses can lessen their environmental impact while still innovating.
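To make these per-query figures concrete, here is a minimal back-of-envelope sketch in Python. It uses the ratios quoted above (about ten times the electricity of a web search, and roughly 500 ml of water per 20 to 50 queries) together with an assumed 0.3 Wh per conventional web search, a commonly cited estimate that is not part of the Capgemini report. Every constant is illustrative rather than measured.

```python
# Back-of-envelope estimate of GenAI inference footprint, using the
# ratios quoted above. All constants are illustrative assumptions.

WH_PER_WEB_SEARCH = 0.3        # assumed Wh for a conventional web search
LLM_QUERY_MULTIPLIER = 10      # article: ~10x the electricity of a search
QUERIES_PER_500ML_WATER = 35   # article: ~500 ml per 20-50 queries (midpoint)

def estimate_footprint(queries_per_day: int, days: int = 365) -> dict:
    """Rough annual energy (kWh) and water (litres) for an LLM workload."""
    total_queries = queries_per_day * days
    energy_kwh = total_queries * WH_PER_WEB_SEARCH * LLM_QUERY_MULTIPLIER / 1000
    water_litres = total_queries / QUERIES_PER_500ML_WATER * 0.5
    return {"queries": total_queries,
            "energy_kwh": round(energy_kwh, 1),
            "water_litres": round(water_litres, 1)}

if __name__ == "__main__":
    # Example: a help desk handling 10,000 LLM queries per day.
    print(estimate_footprint(queries_per_day=10_000))
```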

The environmental toll comes from all angles: producing graphics processing units requires mining rare earth metals, training models demands enormous data centers, and so on. For many organizations, the emissions from GenAI fall under Scope 3, the indirect greenhouse gas emissions that occur in a company’s value chain rather than from its own operations. Capgemini emphasizes that the right choices can significantly mitigate this impact. It’s not always necessary to reach for heavy-duty GenAI when more efficient alternatives can get the job done.

Vishal Singhvi from Microsoft pointed out that traditional AI often meets many needs without the high energy burden. For narrower, well-defined tasks, small language models (SLMs) trained on targeted datasets can greatly reduce both emissions and costs. According to Arthur Mensch from Mistral AI, these smaller models allow for more calls at a fraction of the cost, enhancing applications without excessive energy use, as the routing sketch below illustrates.
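One practical way to act on that advice is to route requests to the smallest model that can handle them and escalate only genuinely complex prompts to a large model. The sketch below is a hypothetical illustration: the model names, the complexity heuristic, and the call_model function are assumptions, not any vendor’s API.

```python
# Hypothetical model-routing sketch: send simple requests to a small
# language model and escalate only complex ones to a large model.

SMALL_MODEL = "slm-targeted-v1"   # assumed name of a task-specific SLM
LARGE_MODEL = "llm-general-v1"    # assumed name of a general-purpose LLM

def looks_complex(prompt: str) -> bool:
    """Crude heuristic: long prompts or multi-step asks go to the LLM."""
    return len(prompt.split()) > 200 or "step by step" in prompt.lower()

def route(prompt: str) -> str:
    """Pick the cheapest model expected to handle the prompt."""
    return LARGE_MODEL if looks_complex(prompt) else SMALL_MODEL

def call_model(model: str, prompt: str) -> str:
    """Placeholder for whatever inference client an organization uses."""
    return f"[{model}] response to: {prompt[:40]}..."

if __name__ == "__main__":
    for p in ["Summarize this ticket in one sentence.",
              "Explain, step by step, how to migrate our data warehouse."]:
        print(call_model(route(p), p))
```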

Awareness of the environmental impact is crucial. Mauli Tikkiwal from Orchard Hill College emphasizes that recognizing these effects is the first step toward reducing them. Surprisingly, only 14% of Capgemini’s survey respondents say their companies track their GenAI emissions. A major hurdle is the lack of transparency from suppliers, an obstacle cited by three-quarters of the executives surveyed. Many expect tech companies to take the lead on this issue.
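Tracking does not have to wait for suppliers. For workloads a company runs itself, open-source tools such as CodeCarbon can log estimated emissions per job. The sketch below is a minimal example under that assumption: run_inference stands in for an organization’s own GenAI code, and the reported figure is an estimate derived from power draw and regional grid intensity, not a measured value.

```python
# Minimal sketch of per-job emissions logging with the open-source
# CodeCarbon library (pip install codecarbon).
from codecarbon import EmissionsTracker

def run_inference(prompts):
    """Placeholder for an organization's own GenAI inference code."""
    return [p.upper() for p in prompts]

if __name__ == "__main__":
    tracker = EmissionsTracker(project_name="genai-inference")
    tracker.start()
    run_inference(["hello", "world"])
    kg_co2e = tracker.stop()  # estimated emissions for the tracked block, in kg CO2eq
    print(f"Estimated emissions: {kg_co2e:.6f} kg CO2eq")
```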

However, some tech firms are already taking sustainable steps. Nvidia says its latest GPUs are roughly 30 times more energy efficient than older models. The startup Liquid AI is working on adaptive algorithms designed to consume less energy. Microsoft has rolled out energy-monitoring features for its LLMs, while Meta recently signed a deal for geothermal energy to power its U.S. data centers. Google also offers its Carbon Sense Suite to help businesses accurately track and reduce their emissions.

At the AI Summit London in June 2024, experts noted the dual role of GenAI. It can indeed help firms better manage their environmental impact, particularly with Scope 3 emissions. However, the urgency of addressing its current negative effects on the planet cannot be overlooked. Tracking and managing emissions is complex, as every organization has its own way of handling data.