Toward Environmentally Sustainable Artificial Intelligence - Our Positions
Governments should create an open marketplace for underutilized IT infrastructure to support AI training, promote transparency in environmental impacts, and ensure local value creation. They must also facilitate shared resources, support local providers, and invest in sustainable practices to mitigate the environmental impact of AI technologies.
Definitions
Digital resources: A recurring term throughout this document for the output of ICT equipment when powered with electricity, namely the capacity to process, store, and transfer data. We use it as shorthand for 'compute', 'data storage', and 'data transfer'.
AI Model: A neural network or similar structure that has been trained on data, either real or synthetic, to perform specific tasks (e.g., a large language model is trained to perform language-related tasks).
AI Training: A one-time, digital resource-intensive process in which large amounts of data are processed by a neural network in order to train that network, forming the foundation of an AI model that can then be interacted with.
AI Inference: The act of querying or prompting an AI model deployed on a server to obtain a response or decision.
Governments
Resource: Policy brief on a European marketplace
Resource: Model for assessing local impact of providers
Governments Should Support Local Providers Investing in AI Infrastructure and Monitor Environmental Impact
For local providers to participate in fulfilling the digital resource demand imposed by AI, they need to invest in specialized ICT equipment. Most local providers lack access to capital or financial instruments such as debt financing. Governments should provide guarantees and debt-based instruments to enable local and national providers to:
Purchase ICT equipment required to produce specialized digital resources for AI training and inference
Invest in operational changes to improve sustainability and reduce environmental impact, for example, energy storage, heat recovery, or on-site/nearby renewable energy generation
Banks often shy away from financing ICT equipment for local providers because such providers are classified as higher risk and have fewer assets to secure the financing. This creates an uneven playing field: large-scale, vertically integrated digital product companies like Google or Amazon can leverage their cash-generating core businesses (e.g., advertising and e-commerce) to fund infrastructure investment without requiring outside financing.
Resource: Financing structure for SMEs
Regulation Should Ensure That the Environmental Impact and Local Value Creation of AI Models Are Disclosed
Without reliable facts, it is becoming difficult to create further policies that shape the future of AI. Therefore, the first step should be to implement policies that require any producer of AI models to disclose the environmental costs incurred during training, as well as the local value creation generated by the infrastructure used for training. This can take the form of a label, but it is critical that absolute numbers for energy use, emissions, and materials use are fully disclosed to the public.
Resource: list of environmental impact & local impact metrics to be disclosed
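To make the idea of fully disclosed absolute numbers concrete, the sketch below shows one possible machine-readable form of such a label. The field names, units, and values are illustrative assumptions, not a proposed standard.

```python
# Illustrative only: field names, units, and values are assumptions, not a published standard.
from dataclasses import dataclass, asdict
import json


@dataclass
class TrainingDisclosureLabel:
    """Hypothetical machine-readable label disclosing absolute training impacts."""
    model_name: str
    training_energy_kwh: float          # total electricity consumed during training
    training_emissions_kg_co2e: float   # absolute greenhouse gas emissions
    hardware_materials_kg: float        # mass of ICT equipment attributed to the run
    training_locations: list[str]       # regions where the infrastructure is located
    local_value_notes: str              # e.g. taxes paid, jobs created, heat reused


label = TrainingDisclosureLabel(
    model_name="example-model-v1",
    training_energy_kwh=1_200_000.0,
    training_emissions_kg_co2e=350_000.0,
    hardware_materials_kg=4_500.0,
    training_locations=["SE-North", "NO-West"],
    local_value_notes="Municipal tax paid; waste heat fed into district heating.",
)

print(json.dumps(asdict(label), indent=2))  # publish as open, absolute numbers
```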
Form Strategic Partnerships to Establish and Grow the Digital Resource Marketplace Across Different Geographical Regions
Form strategic partnerships with countries that have abundant energy, land resources, or developed digital infrastructure. Integration with these nations through the marketplace can help:
Reduce pressure on local land and energy infrastructure
Unlock cost-efficient digital resources
Partnerships with countries that have renewable energy assets should be prioritized to minimize the environmental impact of AI training and inference.
Furthermore, these partnerships can create mutual value. For example, a country may export digital resources to the marketplace and in return gain market access, enabling it to address peak over- or underutilization of its digital infrastructure.
Resource: blueprint for strategic partnerships, list of good matches
Facilitate or Create Buyers Groups for AI-Related ICT Equipment
Governments should facilitate the formation of domestic and international buyers groups to:
Reduce ICT equipment costs (e.g., GPUs) for individual IT infrastructure providers
Enable volume-based price negotiations for AI training and inference hardware with vendors
Streamline market entry for new equipment manufacturers by centralizing buyer relationships
Create economies of scale through coordinated procurement across multiple providers
This approach can help democratize access to essential AI infrastructure while fostering a more competitive hardware marketplace.
Resource: example or blueprint for buyers group
Governments Should Invest in Best Practice Development and Research on Reducing Environmental Impacts of AI Models and Maximizing Societal Benefits
Model capabilities continue to increase, but this increase is mainly driven by using more digital resources to provide enhanced functionality. This translates directly into greater materials and energy consumption for the necessary equipment and digital resources. This trajectory is not sustainable.
Governments should systematically invest in:
Improved algorithm development by advanced research facilities
Best practices that enable greater model capabilities while reducing digital resource demand
Methods to minimize energy and material usage
These initiatives can support AI advancement while enabling more environmentally sustainable pathways.
Resource: research agenda
Governments Should Develop Reporting Standards on the Environmental Impact of AI Models
The rise of AI creates two distinct risks. The first is that the unprecedented energy use and environmental costs of this emerging digital technology remain unconstrained and unmanaged. The second is that AI development will further exacerbate the concentration of wealth.
This wealth extraction stems from AI models using local, socialized energy infrastructure, land, and natural resources to generate value that is not returned to society through taxes, local job creation, or other local value creation.
Therefore, governments should create reporting standards that can be enforced on all AI model providers through regulation. These standards should establish key environmental impact metrics and require transparent reporting on the local impact created from AI training and inference by the infrastructure where it is located.
Resource: list of environmental impact metrics, list of local value creation aspects
Governments Should Invest in Developing Certifications for AI Models and AI Services Based on Their Environmental Costs and Value Contribution to the Local Economy
Governments can enable customer choice in the AI market by investing in the development of third-party verified certifications for AI models based on their training and operational locations. These certifications can:
Create trust and differentiation in the market for developers of AI models and operators of AI services
Help enterprises and small and medium-sized enterprises choose models and services that align with their sustainability and local impact strategies
Resource: Outline of a certification scheme
Government Tenders Should Require Transparency on Environmental Impact and Local Value Creation
Governments remain large purchasers of IT services. When purchasing digital products with AI-enhanced functionality, AI services, or custom model development, they should:
Require vendors to disclose the total environmental impact of training and usage
Provide environmental impact measurements for each AI inference
Request proof that the infrastructure used to train and operate the model creates local value in the communities where it operates
This approach uses government purchasing power to shape the market, especially as providers seek to apply AI technology and products in government digitalization.
Resource: effect analysis, procurement criteria that can be used
Governments Should Facilitate Digital Resource Sharing Across AI Market Participants
Innovation in AI model development should be driven by algorithm development, reduced resource demands, and sustainability – not by privileged access to digital resources. Governments should facilitate this by:
Regulating the sharing of digital resources
Ensuring equal access for model developers and service providers to digital resources
This approach prevents individual actors from using resource access to hinder market competition and innovation. One potential solution is creating an open marketplace for digital resources, as mentioned in this paper. Another approach is to offer incentives or financial guarantees when actors co-invest in digital infrastructure and make it conditional that the infrastructure be accessible to all market actors, similar to supercomputing infrastructure in academia.
Resource: Policy options
Governments Should Facilitate Environmental Impact Measurement of AI Models
For an AI model developer or service provider to determine the environmental impact of their models during training and operation, they require transparent information from IT infrastructure and data center providers. Governments should require these providers to make that data available based on existing standards. The information should be accessible through technical interfaces such as APIs so that AI model developers and operators can read the information in real time during training and inference.
Resource: Standard, list of indicators, NADIKI reference
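As an illustration of the kind of technical interface described above, the sketch below polls a data-center transparency API at regular intervals during a training run. The endpoint, response fields, and sampling interval are hypothetical placeholders, not an existing standard.

```python
# Sketch of reading facility-reported environmental data during training.
# The URL and response fields are placeholders; a real deployment would follow
# whichever reporting standard is mandated.
import time
import requests

API_URL = "https://datacenter.example.com/api/v1/environment"  # placeholder endpoint


def fetch_environmental_snapshot() -> dict:
    """Read the facility's current energy and emissions figures."""
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. {"power_kw": ..., "grid_co2e_g_per_kwh": ...}


def log_impact(duration_s: int = 3600, interval_s: int = 300) -> list[dict]:
    """Sample facility-reported impact data at regular intervals alongside a training run."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(fetch_environmental_snapshot())
        time.sleep(interval_s)
    return samples
```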
Dataset Providers
To train an AI model, large datasets are required. This need has led to the emergence of specialized service providers that offer either synthetic datasets or datasets scraped from the Internet and other sources.
Dataset providers should use shared infrastructure for generating synthetic data and for crawling and collecting other data.
Creating a large dataset requires vast amounts of digital resources even before an AI model is trained. These digital resources should come from shared infrastructure such as the marketplace outlined in this paper. Dataset providers should not build new, dedicated infrastructure for the collection or generation of data, as it would sit underutilized whenever it is not in use.
Resource: Technical architecture
Dataset providers should demand transparency from their supply chain, including the infrastructure and digital resources they use, as well as any technical interfaces such as APIs.
Dataset providers often rely on third-party infrastructure tooling as well as application programming interfaces (APIs) to create the datasets they provide. For each of those datasets, it is critical to determine the total environmental impact as well as the local value creation. To determine this, providers require transparent data from across the supply chain, which the dataset providers should demand.
Resource: Letter to suppliers
Dataset providers should label each of the datasets they offer with the total environmental impact and local value creation
As competition in datasets increases, providers should differentiate their datasets not only by quantity and quality but also by the environmental impact created during collection or generation.
Each dataset offered to AI model developers should clearly display energy usage, emissions, and material consumption during its creation.
Resource: List of indicators, reporting format
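One way such a label could be assembled is by aggregating the impact reported for each stage of dataset creation, as sketched below. The stage names, fields, and figures are illustrative assumptions only.

```python
# Minimal sketch: aggregate per-stage impacts into a single dataset label.
# Stage names, fields, units, and figures are illustrative, not a standard.

def aggregate_dataset_label(stages: list[dict]) -> dict:
    """Sum energy, emissions, and material use reported for each creation stage."""
    return {
        "energy_kwh": sum(s["energy_kwh"] for s in stages),
        "emissions_kg_co2e": sum(s["emissions_kg_co2e"] for s in stages),
        "materials_kg": sum(s["materials_kg"] for s in stages),
        "stages": [s["name"] for s in stages],
    }


label = aggregate_dataset_label([
    {"name": "web-crawling", "energy_kwh": 8_000, "emissions_kg_co2e": 2_100, "materials_kg": 12},
    {"name": "cleaning-and-dedup", "energy_kwh": 3_500, "emissions_kg_co2e": 900, "materials_kg": 5},
    {"name": "synthetic-augmentation", "energy_kwh": 6_200, "emissions_kg_co2e": 1_600, "materials_kg": 9},
])
print(label)
```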
AI Model Developers
AI model developers are companies that use datasets to train AI models based on public or proprietary algorithms. They create open-source or proprietary AI models that others can use to develop AI-based features or offer AI services.
AI model developers should use shared infrastructure for training to promote fair competition and innovation
Training AI models requires significant digital resources. These resources should come from shared infrastructure such as the marketplace outlined in this paper. AI model developers should not create new and dedicated infrastructure for model training, as this creates unfair competition through vertical integration and prevents other developers from accessing necessary digital resources. Using dedicated infrastructure also leads to inefficient resource utilization during periods of low usage.
Resource: Technical architecture
AI model developers should demand transparency from their supply chain, including the infrastructure, digital resources, and APIs they use
AI model developers often rely on third-party infrastructure, computing resources, and application programming interfaces for training and deploying their models. To accurately assess each model's environmental footprint and local economic impact, developers need transparent data from across their supply chain. This includes detailed information about energy usage, emissions, and resource consumption from infrastructure providers, data centers, and API services. Further, they should demand information about local value creation from their suppliers.
Resource: Letter to suppliers
AI model developers should recognize that their training not only consumes vast amounts of digital resources but also presents an opportunity to share value creation with the communities where the infrastructure is located
Local communities provide the energy, land, and other natural resources that power the infrastructure producing the digital resources that enable AI model developers to train their models. Since communities often pay for this land, energy, infrastructure, and natural resources, it should be a priority to return value to them. This value sharing includes ensuring that the infrastructure pays local taxes, creates local job opportunities, and collaborates with educational institutions in the region to facilitate training and create more economic opportunities for the community.
With this in mind, AI model training should not only create value through productivity increases in usage but should also contribute positively to society at large. This represents a technology that is not purely extractive but returns value to the communities that enable it through their resources and infrastructure.
Resource: Framework to assess local impact
AI model developers should demand transparent environmental impact and local value creation data from dataset providers
The supply chain for AI model development consists not only of digital resources but also of datasets supplied by third parties. These dataset providers are part of the supply chain of an AI model, and determining the total environmental impact and local value creation requires data from dataset suppliers.
Therefore, model developers should demand transparent information on the environmental costs of creating datasets from their providers and include that information in the model's impact assessment.
Resource: Letter to suppliers
AI model developers should schedule training during periods of abundant renewable energy
Energy remains a scarce and expensive resource. When running model training during peak times – for example, in the late afternoon – fossil fuel-based power generation is likely activated to meet demand. Training should be designed to be interruptible and run outside peak energy windows, such as at night or during sunny days when renewable energy is abundant.
An alternative approach is to utilize infrastructure that is either close to renewable power generation or within countries that have an overproduction of green electricity. Using the shared infrastructure outlined in this paper, it is also possible to train a model around the clock by shifting it across regions depending on renewable energy availability.
Information about the current energy mix at the location of the training infrastructure should be obtained from the infrastructure supplier (unless the developer operates its own infrastructure).
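As a minimal illustration of interruptible, carbon-aware scheduling, the sketch below pauses training whenever the reported grid carbon intensity exceeds a threshold. The intensity source, threshold, and training step are placeholders; a real run would read the supplier's data feed described above and rely on the training framework's own checkpointing.

```python
# Sketch of carbon-aware, interruptible training: pause when the grid's carbon
# intensity exceeds a threshold and resume when it drops. All values below are
# placeholders for illustration only.
import time


def grid_carbon_intensity() -> float:
    """Placeholder: return current grid intensity in gCO2e/kWh from the supplier's feed."""
    return 180.0


def run_checkpointed_step(step: int) -> None:
    """Placeholder for one resumable unit of training work."""
    print(f"training step {step}")


def carbon_aware_training(total_steps: int, threshold_g_per_kwh: float = 200.0):
    step = 0
    while step < total_steps:
        if grid_carbon_intensity() > threshold_g_per_kwh:
            time.sleep(600)  # wait for a cleaner grid mix before continuing
            continue
        run_checkpointed_step(step)
        step += 1


carbon_aware_training(total_steps=3)
```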
AI model developers should work toward using generalized computing architectures rather than specialized ones.
The current demand for GPUs and even more specialized accelerators such as FPGAs and custom ASICs, with many AI model developers announcing plans to design their own chips, is not sustainable. Ever more specialized chips become obsolete when a new generation of models is trained. This has happened before with blockchain mining, where each generation of mining equipment quickly became obsolete. Because these chips are so specialized, there is likely no secondary use for them, so they become electronic waste within a short time span. There is also an economic consideration: the chip market is already highly concentrated, and demand for specialized chips will push prices up further, making training increasingly expensive for AI model developers.
To avert this, model developers should focus on creating algorithms and training approaches that use large-scale parallelization on commodity hardware (traditional CPUs). This lets AI model developers tap into the vast, existing, and lower-cost market for general compute and allows underutilized digital resources to be reallocated for training. It also avoids premature depreciation of the hardware, as the equipment can be reused for other workloads when it is not needed for model training.
This might slow training, which should be largely offset by parallelization and distributed computing paradigms, as sketched below.
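As an illustration of that direction, the sketch below uses PyTorch's DistributedDataParallel with the CPU-oriented "gloo" backend. The model, data, and process count are placeholders, and the example makes no claim about matching specialized-hardware performance – it only shows that the same distributed paradigm runs on general-purpose hardware.

```python
# Sketch of data-parallel training on commodity CPUs with PyTorch's "gloo"
# backend. Model, data, and scale are toy placeholders.
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP


def worker(rank: int, world_size: int):
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(64, 1))            # toy model standing in for a real network
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    for _ in range(10):                            # each CPU process trains on its own data shard
        x, y = torch.randn(32, 64), torch.randn(32, 1)
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()            # gradients are averaged across processes
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 4                                 # e.g. four CPU-only processes or machines
    mp.spawn(worker, args=(world_size,), nprocs=world_size)
```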