In the toxic online discourse surrounding the use of generative AI in creative work, many now declare that AI's environmental impact is catastrophic. Headlines scream about massive data centers boiling the oceans while algorithms plot the demise of polar bears. Meanwhile, you're on your fourth consecutive hour of "Bridgerton" without a hint of eco-guilt. Curious, ain't it?
The Moving Goalposts of Anti-AI Arguments
Let's address the elephant in the room: just months ago, the predominant criticisms were "AI is theft" and "AI is stealing artists' jobs." As those arguments have met pushback and grown harder to sustain, environmental impact has become the new focus.
Why the pivot? Simple: environmental arguments tap into broader societal anxieties and trigger stronger emotional responses than complex debates about intellectual property. It's a calculated rhetorical move designed to maintain moral high ground without requiring consistent logic.
The Selective Outrage Over AI Emissions
A recent Fortune article proclaimed with alarm that “ChatGPT produces CO2 Equivalent of Over 250 Transatlantic Flights Monthly”. Shocking, right? At least, that’s how it was framed.
Let's put this in perspective: those 260 transatlantic flights carry roughly 52,000 passengers in total. Meanwhile, ChatGPT serves millions of people every day while producing those same emissions over an entire month. The comparison actually demonstrates AI's efficiency rather than its wastefulness, but that wouldn't generate the same clicks.
ChatGPT serves over 164 million monthly users, making it remarkably efficient at just 1.59 grams of CO2 per interaction. Even the Fortune article admits that tools like ChatGPT are more energy-efficient than alternatives. This selective presentation of data highlights how statistics are weaponized in the AI debate.
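To see how the per-interaction math plays out, here's a rough back-of-the-envelope sketch in Python. The 1.59 g and 164 million figures come from the paragraph above; the queries-per-user rate is purely an illustrative assumption, not a measured value.

```python
# Back-of-the-envelope sketch of ChatGPT's monthly emissions per user.
# Article figures: 1.59 g CO2e per interaction, ~164 million monthly users.
# QUERIES_PER_USER_MONTH is an illustrative assumption.

G_PER_INTERACTION = 1.59            # grams CO2e per interaction (article figure)
MONTHLY_USERS = 164_000_000         # article figure
QUERIES_PER_USER_MONTH = 30         # assumption: roughly one query per day

total_tonnes = G_PER_INTERACTION * MONTHLY_USERS * QUERIES_PER_USER_MONTH / 1e6
per_user_grams = G_PER_INTERACTION * QUERIES_PER_USER_MONTH

print(f"Total monthly emissions: ~{total_tonnes:,.0f} tonnes CO2e")
print(f"Per user per month:      ~{per_user_grams:.0f} g CO2e")
# Under these assumptions a month of casual use is on the order of 50 g CO2e,
# a tiny fraction of the emissions attributed to a single transatlantic seat.
```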
The pattern of focusing on aggregate numbers while ignoring per-use efficiency appears repeatedly in anti-AI environmental arguments. It’s like condemning public transit for its total emissions while ignoring how many cars it keeps off the road.
Streaming Vs. AI: The Comparison No One Asked For

When you hit play on a Netflix show, a complex chain of data transmission is set into motion.
According to a report by the International Energy Agency (IEA), streaming a high-definition video for one hour can consume between 0.3 and 0.6 kWh of electricity. That's without factoring in:
Video resolution: 4K content uses significantly more data than HD or SD.
Device type: Streaming on a smart TV consumes more power than on a smartphone.
Duration: Binge-watching for hours compounds energy use.
Network infrastructure: Older networks are less energy efficient.
A 10-hour binge of a 4K series can consume up to 6 kWh—equivalent to running a refrigerator for two days. Now imagine 100 million users doing the same every weekend.
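For a sense of scale, here's the same arithmetic as a quick sketch. The per-hour figure is the upper end of the IEA range quoted above, and the 100-million-viewer scenario is the hypothetical from this paragraph, not a measured audience.

```python
# Streaming-energy arithmetic using the IEA range quoted above (0.3-0.6 kWh/hour).
KWH_PER_HOUR = 0.6              # upper end of the IEA estimate, i.e. 4K-ish viewing
BINGE_HOURS = 10
WEEKEND_VIEWERS = 100_000_000   # illustrative scenario from the text

binge_kwh = BINGE_HOURS * KWH_PER_HOUR                  # ~6 kWh per binge
total_gwh = binge_kwh * WEEKEND_VIEWERS / 1_000_000     # kWh -> GWh

print(f"One 10-hour binge:             ~{binge_kwh:.0f} kWh")
print(f"100 million users, same binge: ~{total_gwh:,.0f} GWh in one weekend")
```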
Meanwhile, AI energy consumption has two phases that critics conveniently blur together:
Training phase: High energy cost, but infrequent.
Inference phase: Low energy cost (~0.0003 kWh per query), happens millions of times a day.
Yes, training large models like GPT-4 uses a lot of energy—once. Your decision to rewatch a series generates new energy consumption with each viewing. That AI model you’re criticizing? It was trained once, and now serves millions with remarkably efficient inference.
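To make the per-use gap concrete, here's a tiny sketch dividing the binge estimate from the previous section by the per-query inference figure above; both numbers are the article's, the arithmetic is just illustration.

```python
# How many ChatGPT queries fit inside one 10-hour binge?
KWH_PER_QUERY = 0.0003   # inference-phase figure quoted above
BINGE_KWH = 6.0          # 10-hour 4K binge estimate from the streaming section

queries = BINGE_KWH / KWH_PER_QUERY
print(f"One binge is roughly {queries:,.0f} ChatGPT queries")   # ~20,000 queries
```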
“But AI Uses Water!”—Yeah, And So Does Everything Else

Ah yes, the water argument. A favorite among those who just discovered what a data center is.
Yes, data centers use water for cooling – between 18,000 and 550,000 gallons daily. This requires scrutiny and improvement as technology evolves.
But let’s make a fair comparison to other technology-related water usage that serves individual consumers.
Watching an hour of Netflix: ~0.2 gallons (IEA)
Daily smartphone usage: ~3 gallons (Fast Company)
Running ChatGPT for a day of regular use: ~1 gallon (Microsoft)
Meanwhile, your daily direct water consumption includes:
One toilet flush: 1.6 gallons (EPA WaterSense standards for modern toilets)
Running a dishwasher: 6 gallons per cycle (Energy Star certified models)
Washing machine: 15-30 gallons per load (Energy Star ratings for high-efficiency models)
If you want to learn more about your water footprint, check out watercalculator.org.
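To line the water figures up in one place, here's a small sketch that simply prints the numbers cited above, sorted from smallest to largest; nothing here is new data, only the figures already quoted.

```python
# Water figures quoted above, in gallons per activity, smallest to largest.
water_gallons = {
    "One hour of Netflix (IEA)":                 0.2,
    "ChatGPT, a day of regular use (Microsoft)": 1.0,
    "One toilet flush (EPA WaterSense)":         1.6,
    "Smartphone use, per day (Fast Company)":    3.0,
    "One dishwasher cycle (Energy Star)":        6.0,
    "One laundry load, low end (Energy Star)":  15.0,
}

for activity, gallons in sorted(water_gallons.items(), key=lambda kv: kv[1]):
    print(f"{activity:45s} {gallons:5.1f} gal")
```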
Critics fixate on AI's water usage while ignoring that all digital technologies consume water for cooling, including the platforms they use to berate AI (Facebook, Bluesky, Reddit, etc.). This outrage appears suspiciously well-timed and inconsistently applied across similar technologies.
If you're genuinely concerned about the water footprint of technology, the solution is not to abandon AI but to push for more water-efficient cooling and clean energy at every data center: the ones behind your streaming services, social media, and cloud storage included.
The Psychology of Blame: Why It Feels Good to Hate AI
Psychologists call it "responsibility deflection"—the tendency to focus intensely on others' environmental impacts while minimizing scrutiny of our own. It's why people driving SUVs might criticize your plastic straw usage. The anti-AI environmental argument provides a perfect vehicle for this cognitive bias, allowing critics to position themselves as eco-warriors while continuing their energy-intensive digital lives.
This selective outrage isn't just hypocritical—it's a documented psychological defense mechanism. By fixating on AI's environmental impact, critics create a moral permission structure that excuses their own technological footprints. It's far easier to blame a new technology than to examine decades of personal digital consumption habits.
The Plot Twist: AI Might Actually Save More Energy Than It Uses
While you're still leaving your gaming PC running overnight, AI researchers have been dramatically improving efficiency. GPT-4 requires roughly a tenth of the energy per output token of its predecessor. AI hardware continues to become more specialized and efficient, with some newer models requiring 1/100th the energy of earlier versions for similar performance.
Here's where the conversation gets awkward for the AI doomers: artificial intelligence actively reduces global energy consumption through:
Optimizing power grids to reduce waste by up to 10%
Reducing manufacturing defects and associated resource waste
Improving logistics to cut transportation emissions
Enhancing building energy efficiency through smart systems
Accelerating climate research that would take humans decades
That AI helping to optimize your city's traffic flow might save more carbon than it produces. The algorithm improving solar panel positioning might generate more clean energy than the dirty energy it consumes. Inconvenient narratives, I know.
The Balanced View: AI's Actual Environmental Footprint
Of course AI has environmental impacts. So does every technology. Some AI applications absolutely deserve scrutiny, particularly cryptocurrency-related ones and unnecessary AI features in products that don't benefit from them. Data centers should absolutely transition to renewable energy sources and improve water recycling systems.
But context matters, and so does proportion. The outrage directed at AI's environmental impact rarely matches its actual footprint, especially compared to legacy systems it often replaces.
A Call for Consistent Principles
If environmental impact genuinely motivates your concern about AI, consider extending that principle consistently across your digital life:
Calculate your comprehensive digital carbon footprint (a rough sketch follows this list)
Evaluate the efficiency of all your technology use
Compare AI energy consumption to alternatives accomplishing similar tasks
Acknowledge AI applications that contribute to environmental solutions
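For the first item on that list, here's a deliberately crude footprint sketch. The streaming and ChatGPT figures reuse numbers quoted earlier in this post; the grid carbon intensity and the idle-PC draw are illustrative assumptions that vary widely by region and hardware.

```python
# A crude weekly digital-footprint estimator. Per-activity energy figures
# reuse numbers from earlier sections; the rest are labeled assumptions.

KG_CO2E_PER_KWH = 0.4   # assumed grid average; look up your local intensity

# activity: (count per week, kWh per unit)
weekly = {
    "HD streaming hours":   (10, 0.45),     # midpoint of IEA 0.3-0.6 kWh/hour
    "ChatGPT queries":      (100, 0.0003),  # inference figure quoted above
    "Gaming PC idle hours": (20, 0.15),     # assumption: ~150 W idle draw
}

total_kwh = sum(count * kwh for count, kwh in weekly.values())
print(f"Weekly energy:    ~{total_kwh:.1f} kWh")
print(f"Weekly emissions: ~{total_kwh * KG_CO2E_PER_KWH:.1f} kg CO2e")
```

Swap in your own counts; the point is that the AI line barely registers next to the others.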
Technology criticism serves an important purpose when applied consistently and proportionally. When arguments shift without acknowledgment and apply standards selectively, they reveal more about resistance to change than commitment to principles.
FAQ: For the Curious, the Skeptical, and the Outraged
1. Does AI really use less energy than streaming?
Per use, yes. Training a model is a one-time cost; streaming is a repeated energy drain. Your 10-hour binge of "The Witcher"? Way more energy than asking ChatGPT 100 questions.
2. Why do people blame AI for environmental issues?
Because it’s easier than blaming themselves. It’s a psychological coping mechanism that lets people feel morally superior without changing their habits.
3. Is AI doing anything to help the environment?
Absolutely. From optimizing energy grids to reducing waste in manufacturing, AI is actively contributing to sustainability efforts.
4. Can I reduce my own digital footprint?
Yes. Try:
Watching in HD instead of 4K
Turning off autoplay
Using energy-efficient devices
Deleting unused cloud files
Not blaming AI for everything
5. Should I stop using AI tools?
Only if you’re also ready to stop streaming, gaming, browsing, and using cloud storage. Otherwise, maybe just use them responsibly and stay informed.
Learn More
If you want to learn more about sustainable technology and reducing your carbon footprint, start with some of these sources in addition to the linked information above.
What is Sustainable Technology?