Artificial intelligence consumes too much energy and water and produces too much pollution for any ethical person or organization that cares about the planet to use it.

The Earth has a new threat: massive computer data centers.

A recent study predicted that these data centers — which already consume about 1.5% of global electricity — will double their energy usage in the next five years, to about 945 terawatt hours a year. To put that in perspective, a single terawatt hour could power all of California for a week and a half.

The culprit in this massive increase in energy consumption? Generative artificial intelligence (AI).

That’s one of the reasons President Trump has pushed for more coal mining in the United States: to feed AI’s ever-consuming maw. Coal is bad enough, but many AI data centers are already powered by diesel or methane, which spew pollution into nearby communities — often poor communities of color already facing environmental injustice.

And that’s the primary reason why The Revelator has banned its writers from using AI. Given AI’s enormous power and water usage — and the resulting greenhouse gas emissions, noise pollution, and asthma-causing air pollution — there is simply no ethical way for an environmental writer to use ChatGPT, Google Gemini, or any of the other AI “writing” services.

Similarly, we won’t be using AI to generate art for any of our stories. Image generation requires even more power than text-based output.

Of course, AI’s energy consumption and pollution problems are just part of the story. There are plenty of other reasons to avoid it.

For one thing, there’s the accuracy of these tools — or lack thereof. I think we’ve all seen “facts” spit out by generative AI that illustrate just how little their output can be trusted. For an obvious example, just look to the Trump administration’s likely use of AI to levy tariffs on penguins. And that’s just the start: A recent report found that the more powerful AI becomes, the more it tends to fabricate answers (a process the industry worryingly calls “hallucination”).

And getting back to ethics, most of these AI language models have been “trained” using books and articles from real writers, who did not consent to having their works digitized and plugged into massive databases, let alone receive any compensation for their contributions. That’s another huge reason to avoid them.

And despite that “training,” the so-called “writing” pumped out by these AI systems flat out sucks, to put it mildly. We’ve gotten dozens of AI-penned submissions over the past few months, usually lackluster “essays,” but some people even attempt to pass off AI output as journalism. It’s obvious as soon as I start reading them: The sentences in these pieces usually work, at least grammatically, but everything feels mechanical and paint-by-numbers.

So there you go: It’s settled. No AI writing at The Revelator, and any writer submitting AI-generated articles will find themselves immediately blacklisted. It’s what’s best for our readers, and it’s what’s best for the planet.

(That said, we’re always looking for real writers to contribute to our pages. Read our submission guidelines and pitch us.)

But let’s take it further: If you’re reading this and you care about the planet, I encourage you to avoid generative AI, too. No AI articles, emails, school papers, artwork, research, reports, press releases, editing assistance — nothing. Given the environmental cost and the preponderance of inaccuracies, there’s simply no ethical way to use these tools for environmental work, or even for fun.

There is an exception, though: I think it’s fair game to use AI for science that requires the analysis of gigabytes or terabytes’ worth of data. Humans can’t process that amount of information, and they need help — especially given the immediate need for data-driven solutions to climate change and the extinction crisis.

And ironically, there might be one other exception: using AI to figure out how to get AI to use less power.

That doesn’t get around the other ethical issues of generative AI, and it won’t negate the evils that have already been done — or will be done — with AI. But it’s a start.

We’d love to hear your thoughts on this — but please don’t submit an AI-written comment or op-ed, OK?

Previously in The Revelator:

2025 From A to Z

John R. Platt is the editor of The Revelator. An award-winning environmental journalist, he has written for Scientific American, Audubon, Motherboard, and numerous other magazines and publications. His “Extinction Countdown” column has run continuously since 2004 and has covered news and science related to more than 1,000 endangered species. He is a member of the Society of Environmental Journalists and the National Association of Science Writers. John lives on the outskirts of Portland, Ore., where he finds himself surrounded by animals and cartoonists.