

Given its significant environmental impact, the AI and data center industry is actively seeking innovative strategies to reduce its footprint, with a focus on energy efficiency, water conservation and resource optimization. These solutions address the challenges described in the previous section.
One example is rethinking traditional approaches to cooling. A Google study suggests that excessive cooling of components, particularly hard disk drives (HDDs), can be counterproductive: temperatures that are too low can cause mechanical and electrical problems, which paradoxically increases failure rates.
These findings support the trend of raising operating temperatures in data centres. Google, for example, runs some of its data centres at temperatures up to 27 °C, which improves energy efficiency without compromising equipment reliability.
Taking advantage of natural climatic conditions - so-called free cooling, where cold outside air or water is used to cool the facility - offers significant potential for reducing energy consumption.
In addition to efficient cooling, attention is also turning to the smart use of waste heat from data centres, for example by feeding it into district heating networks to warm nearby buildings.
Wider corporate commitments and investments also play an important role:
Water positivity targets: Technology companies such as Microsoft and Google have committed to becoming "water positive" by 2030 - that is, they will return more water to the environment than they consume.
Integration of renewable energy: Data centres are increasingly powered by renewable sources, chiefly solar and wind, to reduce the carbon footprint of the electricity they consume.
In addition to hardware and infrastructure, optimizing the AI models themselves plays an important role and can significantly reduce resource consumption:
Quantization: Reducing the numerical precision of model computations (e.g. from 32-bit to 8-bit numbers), which lowers the required computational power and energy consumption without significantly affecting the quality of the results.
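As a purely illustrative sketch of what the 32-bit to 8-bit step means, the snippet below quantizes a synthetic NumPy weight matrix; no real model is involved, and the sizes and values are made up:

```python
import numpy as np

# Toy illustration of 8-bit quantization of a float32 weight matrix.
rng = np.random.default_rng(0)
weights_fp32 = rng.normal(0.0, 0.02, size=(1024, 1024)).astype(np.float32)

# Symmetric quantization: map the float range onto the int8 range [-127, 127].
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# Dequantize to measure how much information the lower precision loses.
weights_restored = weights_int8.astype(np.float32) * scale
max_error = np.abs(weights_fp32 - weights_restored).max()

print(f"memory fp32: {weights_fp32.nbytes / 1e6:.1f} MB")  # ~4.2 MB
print(f"memory int8: {weights_int8.nbytes / 1e6:.1f} MB")  # ~1.0 MB, a 4x reduction
print(f"max round-trip error: {max_error:.6f}")
```

The same idea, applied to billions of parameters, shrinks memory traffic and lets cheaper integer arithmetic replace floating-point operations.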
Knowledge distillation: Training smaller, efficient "student" models to imitate the behavior of larger "teacher" models. The result is a model with high accuracy and significantly lower resource requirements.
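A minimal PyTorch sketch of the distillation loss, with tiny linear layers standing in for the real teacher and student models (the temperature and layer sizes are illustrative assumptions):

```python
import torch
import torch.nn.functional as F

teacher = torch.nn.Linear(16, 4)   # stands in for a large, pre-trained model
student = torch.nn.Linear(16, 4)   # the much smaller model we want to deploy
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0

x = torch.randn(32, 16)            # a batch of synthetic inputs

with torch.no_grad():              # the teacher is frozen during distillation
    teacher_logits = teacher(x)
student_logits = student(x)

# Classic distillation loss: KL divergence between the softened distributions.
loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

loss.backward()
optimizer.step()
```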
Sparse activation (mixture-of-experts): Only selected parts of the model are activated for a given input or task, which reduces the number of computations and the energy consumed.
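A toy sketch of top-1 routing in a mixture-of-experts layer (sizes are made up and the forward pass is inference-only for simplicity):

```python
import torch

num_experts, d_model = 4, 32
experts = torch.nn.ModuleList(torch.nn.Linear(d_model, d_model) for _ in range(num_experts))
router = torch.nn.Linear(d_model, num_experts)

@torch.no_grad()
def moe_forward(x: torch.Tensor) -> torch.Tensor:
    # The router picks one expert per input row, so each example touches only
    # a quarter of the layer's parameters.
    expert_idx = router(x).argmax(dim=-1)
    out = torch.empty_like(x)
    for i, expert in enumerate(experts):
        mask = expert_idx == i
        if mask.any():
            out[mask] = expert(x[mask])  # only the selected rows reach this expert
    return out

print(moe_forward(torch.randn(8, d_model)).shape)  # torch.Size([8, 32])
```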
Prompt caching: Storing and reusing frequently repeated parts of a prompt, which significantly reduces latency and computation costs. OpenAI, for example, has implemented prompt caching in its APIs, which lowers costs and speeds up prompt processing.
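As a simplified, application-level sketch of the idea (the `call_model` function is a hypothetical stand-in for any LLM API call; provider-side caching such as OpenAI's operates on shared prompt prefixes inside the serving stack and is not reproduced here):

```python
import hashlib

_cache: dict[str, str] = {}

def call_model(prompt: str) -> str:
    # Placeholder for a real, expensive LLM API call.
    return f"model answer for: {prompt[:40]}..."

def cached_completion(system_prompt: str, user_input: str) -> str:
    # Identical prompts map to the same key, so repeated questions are served
    # from memory instead of triggering a new, energy-hungry inference run.
    key = hashlib.sha256(f"{system_prompt}\n{user_input}".encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(f"{system_prompt}\n{user_input}")
    return _cache[key]

print(cached_completion("You are a helpful assistant.", "What is free cooling?"))
print(cached_completion("You are a helpful assistant.", "What is free cooling?"))  # cache hit
```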
Pruning: Removing less important neurons or connections from the model, resulting in a smaller model and lower computational requirements. Studies show that careful pruning can substantially reduce model size, and therefore energy consumption, with minimal loss of model performance.
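A toy sketch of magnitude pruning on a single PyTorch layer is below; realizing the energy savings in practice also requires sparse-aware kernels or structured pruning, which this illustration omits:

```python
import torch

layer = torch.nn.Linear(512, 512)
amount = 0.5  # remove half of the connections

with torch.no_grad():
    weight = layer.weight
    # Threshold chosen so that the `amount` fraction of smallest weights is cut.
    k = int(amount * weight.numel())
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = weight.abs() > threshold
    weight *= mask  # pruned connections now contribute nothing to the forward pass

sparsity = 1.0 - mask.float().mean().item()
print(f"sparsity after pruning: {sparsity:.0%}")  # roughly 50% of weights are zero
```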
Speculative decoding: A smaller, faster "draft" model proposes several tokens ahead of time, which are then verified and, where necessary, corrected by the larger "verification" model. Because the drafted tokens are checked in parallel, inference is significantly accelerated without the need to retrain the model.
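A schematic, greedy-only sketch of the draft-and-verify loop follows; `draft_next` and `target_next` are hypothetical stand-ins for the two models, and real implementations verify all drafted positions in a single parallel forward pass and accept or reject tokens by comparing probabilities rather than exact matches:

```python
from typing import Callable

def speculative_step(
    context: list[int],
    draft_next: Callable[[list[int]], int],
    target_next: Callable[[list[int]], int],
    k: int = 4,
) -> list[int]:
    # 1) The cheap draft model proposes k tokens autoregressively.
    proposal, draft_ctx = [], list(context)
    for _ in range(k):
        tok = draft_next(draft_ctx)
        proposal.append(tok)
        draft_ctx.append(tok)

    # 2) The expensive target model checks the proposals and keeps the longest
    #    agreeing prefix, replacing the first token on which it disagrees.
    accepted, verify_ctx = [], list(context)
    for tok in proposal:
        expected = target_next(verify_ctx)
        if expected != tok:
            accepted.append(expected)
            break
        accepted.append(tok)
        verify_ctx.append(tok)
    return context + accepted

# Tiny demo with a toy "model" that just increments the last token.
toy_model = lambda ctx: (ctx[-1] + 1) % 50
print(speculative_step([0], toy_model, toy_model))  # [0, 1, 2, 3, 4]: all drafts accepted
```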
vLLM: An open-source library that optimizes inference of large language models using the PagedAttention algorithm, which manages memory efficiently by splitting attention keys and values into smaller blocks. In this way it achieves substantially higher throughput than traditional serving libraries without any change to the model architecture.
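A minimal usage sketch, following the pattern from the library's quickstart documentation (the model name and sampling parameters are illustrative, and exact arguments may differ between vllm versions):

```python
from vllm import LLM, SamplingParams

prompts = [
    "How can data centres reuse their waste heat?",
    "Why does quantization save energy?",
]
sampling_params = SamplingParams(temperature=0.7, max_tokens=64)

# PagedAttention memory management is applied automatically inside the engine;
# no change to the model itself is needed.
llm = LLM(model="facebook/opt-125m")
for output in llm.generate(prompts, sampling_params):
    print(output.prompt, "->", output.outputs[0].text)
```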
While systemic change and technological innovation are key to reducing the environmental impact of AI, we as individuals and as a society also have an important role to play. Each of us can contribute to a more sustainable digital ecosystem through the way we use these technologies.
These efforts, both at an industry-wide level and in our individual actions, show that the environmental impacts of AI are being taken seriously and that the field is moving towards more responsible and sustainable technologies and practices. Sustainability in AI is not just a technical task, it is a shared responsibility - and every thoughtful step counts.
What other innovative approaches would you see as key to reducing the ecological footprint of AI in the future?