It’s hard to imagine you haven’t noticed tech’s proposed next big thing: generative AI for the masses.
Commercials for ChatGPT show a detailed plan scrolling over a group of friends on a run, explaining how to keep up the motivation for your New Year’s resolution. Google’s Gemini will create a weekly meal plan with step-by-step directions based on a list of ingredients, so you have one less thing to think about. Microsoft Copilot will even turn your paper into an entire presentation without your lifting a finger. I could’ve used one of these platforms to write this blog for me. Sounds nice, right?
It’s understandably easy to assume these asks don’t come with consequences. It’s all out there on a cloud, right? Unfortunately, the consequences are real, and for some, they land in their own backyards.
Residents across the South, as close as Memphis, are facing alarming rates of air pollution and water shortages due to the enormous amounts of energy it takes to power these high-computing centers and the water needed to cool their processor chips. These data centers are popping up at an increasingly high rate, often in already disadvantaged, low-income neighborhoods. This scenario is playing out, to varying degrees, across the country, though Southeastern states are taking the brunt. In the world of environmental justice, this is a tale as old as time.
So what is generative AI, why is AI of concern to people and planet, and are there ways to incorporate AI sustainably?
Environment:
Let’s talk about natural resources and a few statistics to set the stage.
Generative AI refers to deep-learning models that can generate high-quality text, images, and other content based on the data they were trained on. Generative models have been used for years in statistics to analyze numerical data. The rise of deep learning, however, made it possible to extend them to images, speech, and other complex data types. (Source)
A 1-megawatt data center can use up to 25.5 million liters of water annually just for cooling, equivalent to one day’s water consumption for approximately 300,000 people. (Source) If running at full capacity 90 percent of the time for a year, a 5-gigawatt data center would consume as much energy as 3.65 million homes. (Source)
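The arithmetic behind these equivalences is easy to check. A minimal sketch, assuming roughly 85 liters of household water use per person per day and about 10,800 kWh of electricity per US home per year (typical averages we supply for illustration, not figures from the sources above):

```python
# Back-of-the-envelope check of the figures above.
# ASSUMPTIONS (not from the cited sources): ~85 liters per person per
# day of household water use, and ~10,800 kWh per US home per year.

LITERS_PER_PERSON_PER_DAY = 85       # assumed average household use
KWH_PER_HOME_PER_YEAR = 10_800       # assumed US average

# 25.5 million liters of annual cooling water, expressed as one day
# of water use for a population:
people_equivalent = 25_500_000 / LITERS_PER_PERSON_PER_DAY
print(f"{people_equivalent:,.0f} people")   # ~300,000

# A 5 GW data center running at full capacity 90% of the year:
kwh_per_year = 5_000_000 * 0.9 * 8_760      # 5 GW in kW * hours/year
homes_equivalent = kwh_per_year / KWH_PER_HOME_PER_YEAR
print(f"{homes_equivalent:,.0f} homes")     # ~3.65 million
```

Under those assumptions, the math lands almost exactly on the figures quoted above, which suggests the sources used similar per-person and per-home averages.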
An AI tool like ChatGPT, for example, requires substantial amounts of energy for operation and training, which makes it far less energy efficient than a typical search engine. The greater the energy use, the higher the carbon emissions.
According to MIT Technology Review, OpenAI has said that ChatGPT receives about 2.5 billion queries each day. In a single day, 2.5 billion.
Over the course of a year, those daily queries could emit the same amount of carbon as driving more than 280 million miles in a gas-powered vehicle.
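For a sense of scale, here is a rough sketch of what those figures imply, assuming an average gas-powered car emits about 400 g of CO2 per mile (the EPA’s commonly cited figure for a typical passenger vehicle; an assumption on our part, not a number from the article):

```python
# Rough scale check of the driving comparison above.
# ASSUMPTION: ~400 g CO2 per mile for an average gas car (EPA's
# widely cited figure for a typical passenger vehicle).

GRAMS_CO2_PER_MILE = 400

queries_per_year = 2.5e9 * 365            # ~912.5 billion queries
total_grams = 280e6 * GRAMS_CO2_PER_MILE  # 280 million miles driven
total_metric_tons = total_grams / 1e6

print(f"{queries_per_year:.3e} queries/year")
print(f"{total_metric_tons:,.0f} metric tons of CO2")   # ~112,000
print(f"{total_grams / queries_per_year:.2f} g per query")  # ~0.12
```

The striking part is the scale: each individual query is tiny under these assumptions, on the order of a tenth of a gram of CO2, but multiplied by hundreds of billions of queries a year the total runs to roughly a hundred thousand metric tons.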
Health and Financial Impacts:
The near-constant water and energy demands of a data center are detrimental not just to the environment but to the communities living nearby, harming both their wallets and their physical health.
Studies show that power plants are most likely to be constructed in Black neighborhoods and worsen the risks of cancer and respiratory disease. (Source) Direct risks stem from the release of harmful air pollutants by on-site backup generators, while indirect risks arise from reliance on highly polluting electricity sources. Pollutants can travel hundreds of miles on the wind, affecting communities far from the source.
The historically Black neighborhood of Boxtown, Memphis is the building site of xAI founder Elon Musk’s “Colossus” facility. This area has long shouldered the brunt of the city’s industrial pollution. One could call it coincidence, but residents of Boxtown have the highest rate of asthma-related hospital visits in Tennessee and cancer rates four times the national average.
“Southern Environmental Law Center, an environmental nonprofit, has stated that the facility operated for months without air pollution permits. Dozens of large methane gas turbines were keeping the supercomputer running as they spewed toxic fumes into the already smoggy local air.” (Source)
With fossil fuels powering this work, carbon emissions continue to soar. To prevent residents from subsidizing the industry through higher electric bills, utilities across the country have proposed new electricity rates that would make data centers and other large energy users pay their fair share.
The increasing demand for AI heightens our need to ramp up clean energy production. The strain on our grids should be met with efforts to counterbalance it. “Without adding new clean resources, electric bills will actually go up, and you’ve also increased your reliance on the dirty stuff,” said Jackson Morris, director of state power sector policy at NRDC.
Is there hope for AI?
Choices like the placement of AI data centers, the sourcing of their electricity, and the distribution of AI workloads across centers will determine whether AI heightens or dampens these health and financial burdens.
In the words of Jackson Morris, “..with the right policies, we can harness this growth to develop a cleaner, more reliable and affordable power system. Every challenge is also an opportunity, and states should take this opportunity to help build the power system we need for decades to come.”
These tools are still so new that we have limited information and certainly no long-term data. Based on what we do know and can infer, here are some recommended actions.
Recommendations:
- Ecosia: A search engine and browser dedicated to climate action. Ecosia states that they “..dedicate 100% of our profits to the planet. That means you can plant trees, protect endangered animals, and uplift communities around the world simply by browsing.”
Ecosia has an AI feature powered by renewables and states that it uses smaller, faster models to save energy. There are several features you can toggle on to conserve battery power and manage your privacy. Ecosia is also a Certified B Corporation.
- Google search vs AI search
Take a look at this comparison chart via kanoppi (Source). It breaks down the energy required for a basic Google search versus a query to ChatGPT. We recommend reaching for a standard web search instead of an AI model whenever a simple search will do.
| Metric | Google Search | ChatGPT | Difference |
|---|---|---|---|
| Energy per Query | 0.0003 kWh | 0.0029 kWh | ~10x higher for AI |
| CO2 Emissions/Query | 0.2g | 68g | ~340x higher for AI |
| Daily Energy Use | ~10.8 MWh | ~621.4 MWh | ~58x higher for AI |
| Annual Home Equivalent | ~2,000 homes | ~21,600 homes | ~11x higher for AI |
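The “Difference” column can be reproduced from the table’s own numbers; a quick sketch:

```python
# Recompute the "Difference" column using only the chart's figures.
google = {"kwh": 0.0003, "g_co2": 0.2, "daily_mwh": 10.8, "homes": 2_000}
chatgpt = {"kwh": 0.0029, "g_co2": 68, "daily_mwh": 621.4, "homes": 21_600}

for metric in google:
    ratio = chatgpt[metric] / google[metric]
    print(f"{metric}: ~{ratio:.0f}x higher for AI")
```

The ratios round to the ~10x, ~340x, ~58x, and ~11x shown in the chart, so the “Difference” column is internally consistent with the other two.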
TIP: Each time you run a Google search, you’ll notice an AI response at the top of the page. Type whatever you are searching for, such as “Urban Green Lab”, followed by “-ai”, and the AI response will be left out of your results.
- For regulators and legislatures:
Require data centers to pay their fair share; ensure that data centers don’t increase emissions; improve forecasting, planning, and interconnection processes; and require (or incentivize) best operational practices. Read more about these recommendations in full here.
Recommended Reading:
At the Crossroads: A Better Path to Managing Data Center Load Growth

Bethany serves as the Director of Best Practices at Urban Green Lab. She runs the Nashville Sustainability Roundtable, the Sustainable Nashville Directory, and B Tennessee. Off the clock, you’ll find Bethany outside working in her garden, digging through an estate sale, reading, and traveling as often as she can.
