A couple of weeks ago, we discussed the environmental consequences of training AI models, and, while the costs are real, it’s just as important to ask: what can we do about it?
The reality is uncomfortable: our digital lives come with a rising environmental footprint. Every streamed video, email, or notification consumes electricity. The boom in AI-powered tools adds a new layer, embedding energy-hungry processes into daily routines and placing more strain on the infrastructure that powers them.
But here is some good news, too. When used wisely, digital technology can reduce environmental impact. Cloud-based systems are often more energy-efficient than aging on-site servers. Online collaboration tools can cut unnecessary travel. Smart systems for heating, cooling and lighting have helped institutions measurably reduce energy waste. One study estimated that by 2035, AI applications in power, transport, and food could reduce annual global emissions by up to 5.4 billion tons of carbon dioxide equivalent.
University’s Role
The companies running today’s data centers are not transparent about the true costs of training large AI models. Understanding the actual footprint is the first step, but while pushing for clarity is important, institutions can’t wait for the giants to disclose everything. We need to ask: what can we do to make AI more sustainable?
Universities and research organizations have a crucial role in leading efforts to make AI more sustainable; they can:
- Conduct carbon footprint assessments of AI workloads to better understand and mitigate the energy impact of AI technologies.
- Embed sustainability in strategic research plans and policy recommendations, nudging the industry towards greener solutions and influencing regulatory decisions.
- Secure funding from agencies like the U.S. National Science Foundation (NSF), the Department of Energy (DOE), and the Defense Advanced Research Projects Agency (DARPA), which increasingly prioritize energy-efficient computing.
- Foster interdisciplinary collaborations among computer scientists, environmental researchers, and policymakers to develop holistic solutions that balance AI progress with environmental responsibility.
- Develop educational programs, workshops, and public forums to spread knowledge of AI sustainability, raise awareness, and encourage the adoption of energy-efficient practices within the AI research community.
This leadership matters because training AI models is exceptionally resource-intensive. Each large model involves billions of parameters, requires weeks or months of computation, and consumes massive amounts of electricity. Frequent retraining, hardware inefficiencies, and cooling demands only amplify the cost. By taking a proactive role, research institutions can drive meaningful change and help shape a more sustainable future for AI.
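To make that resource intensity concrete, here is a rough back-of-the-envelope sketch in Python. Every number in it is an illustrative assumption rather than a measurement from any real training run, but the structure, hardware energy, facility overhead, and grid carbon intensity, is the same one a formal carbon footprint assessment would use.

```python
# Back-of-the-envelope estimate of training energy and carbon.
# All values are illustrative assumptions, not figures for any real model.

NUM_GPUS = 1_000            # accelerators used for the run (assumed)
GPU_POWER_KW = 0.7          # average draw per accelerator, kW (assumed)
TRAINING_DAYS = 30          # wall-clock duration of the run (assumed)
PUE = 1.2                   # facility overhead: power usage effectiveness (assumed)
KG_CO2_PER_KWH = 0.4        # grid carbon intensity, kg CO2e per kWh (assumed)

hours = TRAINING_DAYS * 24
it_energy_kwh = NUM_GPUS * GPU_POWER_KW * hours   # energy drawn by the hardware
facility_energy_kwh = it_energy_kwh * PUE         # add cooling and other overhead
emissions_tonnes = facility_energy_kwh * KG_CO2_PER_KWH / 1000

print(f"Facility energy: {facility_energy_kwh:,.0f} kWh")
print(f"Estimated emissions: {emissions_tonnes:,.0f} t CO2e")
```

Open-source trackers such as CodeCarbon automate this kind of accounting for real workloads by reading hardware power draw and regional grid data.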
Energy-efficient processing is the single most important step: it reduces not just electricity use but also the dependence on water-intensive cooling and the need for additional renewable energy production (which itself has costs). Encouragingly, AI efficiency is improving! Google claims that the carbon emissions per Gemini query have dropped 44-fold over the last 12 months. New techniques, such as model merging or selectively activating only small parts of a model for each request, make it possible to cut power use without sacrificing performance.
In early 2025, for example, DeepSeek claimed to have trained its R1 “reasoning model” for a fraction of the cost and computing power Meta needed for its latest system, a sign that efficiency breakthroughs are possible.
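To see why selective activation saves energy, here is a toy NumPy sketch of the idea: a router scores a pool of “experts” and only the top few are actually computed for a given input, so the work done scales with the active subset rather than the full parameter count. It is a minimal illustration of the routing pattern, not the implementation used by DeepSeek, Google, or anyone else.

```python
import numpy as np

# Toy sketch of selective activation (mixture-of-experts style routing).
# Illustrative only: each "expert" here is just a small weight matrix.

rng = np.random.default_rng(0)
NUM_EXPERTS, TOP_K, DIM = 8, 2, 16

experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS))

def forward(x):
    """Run only the TOP_K highest-scoring experts for input x."""
    scores = x @ router                   # router decides which experts fit x
    top = np.argsort(scores)[-TOP_K:]     # indices of the chosen experts
    weights = np.exp(scores[top])
    weights /= weights.sum()              # normalize the routing weights
    # Only TOP_K of NUM_EXPERTS matrices are multiplied, so compute
    # (and energy) scales with the active subset, not the full model.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

y = forward(rng.standard_normal(DIM))
print(y.shape)  # (16,)
```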
What You Can Do As an Individual
Even if you don’t run a data center, your daily choices matter. AI queries may seem tiny in isolation, but billions of them add up. Here are practical steps to shrink your footprint while still benefiting from AI:
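A quick scale check makes the point; both inputs below are assumptions chosen for illustration, not figures reported by any provider.

```python
# Why "tiny" per-query costs still add up. Both inputs are assumed.
WH_PER_QUERY = 0.3        # assumed energy per AI query, in watt-hours
QUERIES_PER_DAY = 1e9     # assumed global query volume per day

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000
households = daily_kwh / 30   # assuming ~30 kWh/day for a typical household

print(f"{daily_kwh:,.0f} kWh per day")                        # 300,000 kWh/day
print(f"≈ the daily electricity of {households:,.0f} households")
```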
- Be intentional with AI use.
Before prompting, ask: Do I really need AI for this? For quick facts, recent news, or simple calculations, a search engine or calculator is often faster, more accurate, and less energy-intensive. Save AI for tasks where it adds real value, like writing help, coding support, analysis, or brainstorming.
- Write smarter prompts.
A vague prompt like “summarize this” can generate long, wasteful outputs. Clear prompts that specify a persona, task, context, and format lead to better results in fewer tries. Instead of “Can you summarize this paper?”, try: “You are a research assistant (persona). Summarize the paper’s key arguments (task). The paper is about minimum wage policy (context). Return two sections: argument and empirical strategy (format).” A minimal template following this pattern is sketched in the code example after this list.
- Limit excess.
- If you switch topics, start a new session to avoid dragging along unnecessary context.
- Upload only what the model needs.
- Avoid generating endless variations, images, or videos “just because.”
- Choose greener providers.
Some AI services already run on renewable energy and are starting to share data about their impact. Supporting these providers sends a signal to the market and keeps you aligned with companies actively investing in sustainability.
- Google has disclosed the per-query energy and water costs of Gemini and has committed to matching 100% of its annual energy use with renewables, while piloting projects in nuclear, geothermal, and heat recovery. See their AI energy score.
- OpenAI, which runs on Microsoft Azure, benefits from Microsoft’s pledge to be fully renewable by 2025 and has its own target of carbon neutrality by 2030.
- Amazon Web Services (AWS) is the world’s largest corporate buyer of renewable energy and aims to power all of its data centers with clean energy by 2025.
- Advocate for transparency.
Ask AI providers (especially the ones you use regularly) to disclose their energy and water usage. Companies pay attention when users demand better practices.
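To close the loop on the prompting advice above, here is the minimal sketch mentioned earlier: a tiny helper that assembles a persona / task / context / format prompt. The function and its example values are hypothetical, and the same structure works just as well typed directly into a chat window.

```python
# Hypothetical helper illustrating the persona / task / context / format pattern.
def build_prompt(persona: str, task: str, context: str, fmt: str) -> str:
    """Assemble a structured prompt so the model can answer well in one try."""
    return (
        f"You are {persona}. "
        f"{task} "
        f"Context: {context} "
        f"Return the answer as {fmt}."
    )

prompt = build_prompt(
    persona="a research assistant",
    task="Summarize the paper's key arguments.",
    context="The paper is about minimum wage policy.",
    fmt="two sections, 'Argument' and 'Empirical strategy'",
)
print(prompt)
```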
Choosing Wisely, Acting Together
AI’s environmental impact is not just a problem for tech companies but a collective responsibility. Research institutions can set the tone for sustainability, users can make mindful choices, and policymakers can push for transparency and accountability.
Being conscious about AI’s footprint is part of being a responsible digital citizen. Every query, every model, every system design carries a cost, but with thoughtful action, the grass can be greener on this side. After all, the grass isn’t greener on the other side; it’s greener where you water it.