AI and the environment: what’s the real cost of technological innovation?

By Simon Morrison · 13 min reading time

The widespread adoption of generative Artificial Intelligence (AI) is ushering in a new digital era. Alongside the excitement about the positive aspects, there’s also growing concern about the negative environmental impacts of AI. How much damage is this new technology doing to our planet, and what, if anything, can be done about it?

Rapid AI growth is driving demand for hyperscale data centres, increasing energy and water use and raising sustainability concerns.
(Source: © Joachim - stock.adobe.com)

The term ‘disruptive’ gets thrown around a lot in the tech world. Most of the time, it’s just empty marketing hyperbole.

But not all the time. When it comes to large language models (LLMs) and generative Artificial Intelligence (AI), the term ‘disruptive’ hardly seems adequate.

OpenAI released an early version of ChatGPT for public use on November 30th, 2022.1 Pundits have breathlessly described ChatGPT and its counterparts, such as Anthropic’s Claude, Google Gemini, and Microsoft Copilot, as the most groundbreaking innovation since the printing press, the wheel, or even the discovery of fire.

At first glance, it’s easy to dismiss these claims as more marketing guff. But there’s no doubt that AI has the potential to transform society completely.

Discussions surrounding the pros and cons of AI technologies are becoming increasingly heated. The environmental impacts of AI are described either as negligible and overblown, or as an immediate and severe threat to our planet.

To further complicate the issue, accurate information surrounding AI’s energy and resource consumption is difficult to come by. Tech companies are going to great lengths to shield critical environmental data from scrutiny. Scholars and analysts argue fiercely over how to calculate energy and water usage.

The battle lines have been drawn, the acolytes for both sides are becoming entrenched, and there is scant room for nuance or reasoned debate.

This article attempts to take a balanced overview of the environmental impacts of AI.

AI is driving the demand for hyperscale data centres

While the environmental impact of our digital infrastructure has long been a cause of concern, the AI boom is pushing these issues to the forefront.

In 2021, there were 8,000 data centres in operation worldwide.2 At the time of writing, there are 11,102 operational data centres, almost half of them located in the United States.3 Analysts estimate that worldwide demand for data centres will triple by 2030 and that AI workloads will make up 70% of this demand.4

Legacy data centres don’t have the capacity to meet the massive computing demands of AI. As governments and companies race to stay at the cutting edge of innovation, hyperscale AI data centres are being built across the globe at a feverish pace.

Typically, a legacy data centre covers about 9,300 square metres. Hyperscale data centres can be as large as 930,000 square metres.5,6 According to analysts, the average amount of land needed for a data centre has increased by 144% since 2022.7

The land use of hyperscale AI data centres can cause significant environmental impacts. Converting woodland or farmland into data centre sites can lead to soil erosion, deforestation, habitat loss, and the reduction of land available for housing or food production.8

Land is a finite resource. As we move forward, the responsible management of land is going to be an important aspect of sustainable AI development.

How much power does AI need?

Data centres are responsible for about 1 to 1.5% of global final electricity demand, around 415 terawatt-hours (TWh).9 That isn’t a huge amount in comparison to other industries.

But the tech sector’s demand for electricity is growing at a rate of about 12% per year and could hit 945 TWh, or even 1,050 TWh, by 2030.10 Data centres’ electricity consumption is growing around four times faster than overall electricity demand. By 2030, data centres are predicted to require more energy than Japan.11 This growth is a direct result of the scaling up of AI.
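As a rough sanity check on these projections, the cited growth rate can be compounded forward. This is a back-of-envelope sketch: treating the ~415 TWh figure as a 2024 baseline and assuming a constant 12% annual growth rate are both simplifications on my part.

```python
# Back-of-envelope projection of data centre electricity demand.
base_twh = 415.0  # current global data centre demand in TWh (cited above)
growth = 0.12     # assumed constant annual growth rate

demand = base_twh
for year in range(2025, 2031):
    demand *= 1 + growth
    print(f"{year}: ~{demand:.0f} TWh")
```

Simple compounding at 12% gives roughly 820 TWh by 2030, short of the 945 TWh figure cited above, which suggests analysts expect growth to accelerate beyond today's rate.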

Why is AI so power hungry? Partly, it’s because training an AI model takes a massive amount of electricity. Exactly how much is unclear. Unhelpfully, AI companies tend to keep this information to themselves.

However, analysts have estimated that it took between 51,772,500 and 62,318,750 kWh of electricity to train GPT-4.12 That’s about 2,000 barrels of oil or enough power to run a major US city for around three days.13

Most AI data centres are located in areas powered by fossil fuels. While the use of renewable energy to power data centres is growing, it simply can't keep pace with the needs of the industry. Somewhat tellingly, the titans of the tech world are now heavily investing in nuclear power. Google, Meta, and Amazon all signed a pledge to triple global nuclear power capacity by 2050.15


Generating AI text, video, or audio in response to a prompt or a query is known as ‘inference’. While training an AI model is energy-intensive, it’s the inference stage that is responsible for 80 to 90% of its computing power needs.14

There’s been a lot of debate about the environmental impacts of individual text-based LLM use. It’s been estimated that an individual ChatGPT text-based query uses about 0.3 Wh of electricity, roughly ten times less than previously thought. The problem is that a huge number of variables need to be considered when quantifying exactly how much energy one prompt uses.

Simple text-based prompts are less energy-intensive than complex queries or generating audio, images, or video. The number of queries made in a single session also matters, as high-frequency users contribute more to overall energy consumption. Advanced LLMs with billions of parameters require greater computational resources and so use more energy.

Obviously, as AI systems scale up and usage increases in both volume and intensity, there will be a corresponding increase in overall energy demand.

The most important question is: what will be the aggregate energy impact of mass LLM use?
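One way to frame it is a back-of-envelope aggregate: take the ~0.3 Wh per-query estimate above and multiply by a global prompt volume. The one billion text prompts per day used below is my own hypothetical assumption, not a sourced figure, and it covers only simple text queries, not image, audio, or video generation.

```python
# Rough aggregate of direct inference energy for text queries.
WH_PER_QUERY = 0.3     # per-query estimate cited above
QUERIES_PER_DAY = 1e9  # hypothetical global prompt volume

annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * 365
annual_twh = annual_wh / 1e12  # 1 TWh = 10^12 Wh
print(f"~{annual_twh:.2f} TWh per year")
```

Even at that volume, direct text inference comes to about a tenth of a terawatt-hour per year, a sliver of the ~415 TWh data centre total, which is why per-query figures alone say little about system-wide impact.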

Right now, the IEA estimates that electricity generation for data centres represents about 1% of global CO2 emissions.16 Although this is a relatively small amount, the IEA also predicts that this will rise to 3% by 2030.17 Given the rapid expansion of the technology, this may be a conservative estimate.

The long-term environmental impact of AI will depend largely on how well governments and companies deal with this new technology’s ever-growing need for electricity.

Is AI really drinking the planet dry?

Data centre servers generate large amounts of heat. In most cases, water cooling systems are used to keep the servers from overheating.

Water cooling systems can be configured either as open-loop systems, where cooling towers remove heat via water evaporation, or as closed-loop systems, where water circulates through the system without coming into contact with the outside environment. Since closed-loop systems recycle the same water over and over, they use less water and have much less impact on the environment. Many of the major tech players, such as Microsoft, are focusing their efforts on developing more efficient and less resource-intensive closed-loop cooling models.

But closed-loop systems also need to be flushed out regularly to prevent the buildup of corrosives. The resulting wastewater is classified as industrial grade and often contains trace metals and treatment chemicals. If not disposed of correctly, this wastewater poses a direct risk to the local environment.

Open-loop systems rely on evaporation to remove heat and can consume millions of litres of water per day. Some of the water used in open-loop systems has to be periodically discharged to remove corrosives and solid mineral deposits. Known as ‘blowdown’, this process releases water with concentrated salts and minerals, biocides, and corrosion and scale inhibitors into the environment. Evaporation also results in higher salinity and contamination levels in the local soil.18,19

Whether they use closed or open loop systems, AI data centres frequently source water from local municipal supplies of potable water. In areas where water is scarce, the increased demand from an AI data centre can be a huge problem.20

In the US alone, approximately two-thirds of new data centres are being located in areas under severe water stress.21 Google had to stop construction of a proposed data centre in Santiago, Chile, due to concerns about its water use.22 Despite the continuation of a 15-year-long drought in Chile, Google is still planning to build other data centres in the same area. Globally speaking, it’s been estimated that 68% of data centres are located near protected areas where biodiversity depends on a supply of clean water.23

How much water does a data centre need? As with electricity, it’s a difficult question to answer with any accuracy.

The water needs of a data centre depend on what cooling system it uses, the average temperatures of its location, the time of year, and the size of the data centre itself. And just like electricity use, the big tech companies are loath to release accurate data on just how much water AI hyperscale data centres use.

Sam Altman has been quoted as saying that one ChatGPT prompt uses just “one fifteenth of a teaspoon” of water.24 He has also publicly called the criticisms of the amount of water consumed by ChatGPT “fake” and “totally insane”.25
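Taking Altman's figure at face value, it's easy to sketch what it implies at scale. This is a hedged back-of-envelope: the one billion prompts per day is my own hypothetical volume, and a US teaspoon of ~4.93 ml is assumed.

```python
# What "one fifteenth of a teaspoon" per prompt adds up to.
TSP_ML = 4.93               # millilitres in one US teaspoon (assumed)
ML_PER_QUERY = TSP_ML / 15  # Altman's quoted per-prompt figure
QUERIES_PER_DAY = 1e9       # hypothetical global prompt volume

litres_per_year = ML_PER_QUERY * QUERIES_PER_DAY * 365 / 1000
print(f"~{litres_per_year / 1e6:.0f} million litres per year")
```

That works out to roughly 120 million litres a year, orders of magnitude below the aggregate estimates discussed in this article. The gap largely reflects what the per-prompt figure leaves out, such as the indirect water used to generate the electricity that powers the data centres.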

Hyperbole aside, some estimates of the water usage of LLMs have indeed been overstated. Most famously, Karen Hao’s book “Empire of AI” claimed that Google’s Chilean data centre would require “more than one thousand times the amount of water consumed by the entire population.”26 This often-quoted figure has since been shown to be false. Analyst Andy Masley led the push to disprove Hao’s figures and has written extensively on why he believes the AI water issue to be fake. Masley takes pains to point out that AI’s water usage is minimal when compared to other industries, or even to the digital tools we use each day, such as streaming services.27

Still, the impact of AI on the world’s water supplies is a serious problem that is worthy of scrutiny. Research shows that US data centres directly consumed 66 billion litres of water in 2023. Indirect use accounted for 800 billion litres.28 Meta’s 2025 environmental report stated that the company’s data centre water use went from 726 megalitres in 2020 to 5,637 megalitres in 2024.29 Google discloses its own water use, but not the water used by data centres operated by third parties. Amazon refuses to disclose its water usage data at all and simply states that it tries to be a "good neighbour".30 The reluctance of the major tech companies to accurately disclose their water usage has been criticised by environmental groups as well as their own shareholders.

The water consumption of AI data centres in 2023 has been estimated at 560 billion litres. A recent peer-reviewed study put the total amount of water consumed by AI systems at between 312.5 and 765 billion litres.31 According to Dr Alex de Vries-Gao, the author of the study, “That is of the same order of magnitude as all bottled water consumed worldwide in a single year.” The study shows that the indirect water use of AI systems has been vastly underestimated and could be up to four times higher than official estimates.32

While there have been criticisms of the methodology used by de Vries-Gao, it’s clear that AI is driving up demand for potable water. According to a report from the UK Government Digital Sustainability Alliance (GDSA), AI use is expected to increase global water usage from 1.1bn to 6.6bn cubic metres by 2027.33

Slaking AI’s thirst for water is going to be one of the most important environmental challenges of the next decade.

Does AI deserve its bad environmental reputation?

Many analysts, such as Andy Masley and Hannah Ritchie, are adamant that the negative environmental impact of personal use of text-based LLMs is negligible when compared to other activities, like playing golf, streaming shows, eating meat, or buying fast fashion.

Leaving aside the fact that every action we take has an environmental cost, Masley and Ritchie do have a point. The negative impacts of AI have been overstated, to some extent. And the damage AI is currently causing to the environment is minimal when compared to some other industries.

But this doesn’t negate concerns about the environmental impact of AI and its rapid growth rate.

No other industry in history has scaled at the rate of AI.34 Within five days of its release, ChatGPT gathered one million users, the fastest adoption rate of any consumer app. In just two months, that figure had reached 100 million users.35 The number of active weekly ChatGPT users was 400 million in February 2025.36 As of early 2026, it’s estimated that there are more than 900 million active users.37 And that’s just ChatGPT.

By the end of 2025, Google’s Gemini had hit 750 million monthly active users.38 Anthropic’s Claude managed only 19 million users within the same period, but its focus is on enterprise users, not the general public.39 Other AI platforms, such as Perplexity AI, Grok, and DeepSeek, are also gaining massive amounts of users with every passing month.

A growing number of people are becoming dependent on AI for personal use. Many people now commonly use AI as their own personal digital assistant for everything from organising schedules to meal ideas to relationship and child-rearing advice.

AI is sending the business world into a frenzy. Conservative estimates show that there are now between 58,000 and 70,000 AI startups, many of which (perhaps even up to 95%) are destined to fail.40

It’s not just ambitious entrepreneurs who are embracing the new technology. An estimated 78% of companies across the globe are building AI into their operations.41 Major multinational companies are shoehorning AI into every possible digital crevice they can find. From logistics to analytics to advertising to recruitment and customer service, AI is being integrated into every operational layer, regardless of whether it’s needed, whether it appeals to the consumer base, or even whether it works.

Just like plastics, it seems AI is everywhere. The volume of AI-generated artwork, music, video, and written content has exploded. While it’s impossible to quantify exactly how much AI content there is, some analysts have estimated that at least 50% of new online articles are AI-generated.42 Over 15 billion images were generated using AI platforms between 2022 and 2024.43 Analysts have stated that by 2030, AI will be present in every facet of our digital interactions.44

On a global scale, the amount of energy and water that AI systems currently use is indeed somewhat negligible in comparison to other industries. And it is certainly possible that AI technologies could be the key to reducing CO2 emissions and combating climate change.

But as adoption rates soar and the need for data centres increases, so too will the negative environmental impacts of AI. Carefully assessing the potentially damaging effects of AI and taking action to reduce any negative impacts is now an essential part of our fight to slow climate change.

As the de Vries-Gao study concluded, without transparent data, there’s no way to identify the most effective opportunities to reduce the climate impact of AI. And as Sam Altman himself has said, the future of AI depends on the development of more efficient ways to generate clean energy.

Realising the potential of AI technology without further harming the planet depends on companies and governments fully committing to transparency, developing more resource-efficient digital infrastructure, and continuing to adopt renewable energy sources.

Whether or not political leaders and the titans of the tech world will make such commitments remains to be seen.


Sources

  • 1 https://openai.com/index/chatgpt/
  • 2 https://www.usitc.gov/publications/332/executive_briefings/ebot_data_centers_around_the_world.pdf
  • 3 https://www.datacentermap.com/datacenters/
  • 4 https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers
  • 5 https://encoradvisors.com/hyperscale-data-center/
  • 6 https://programs.com/resources/data-center-statistics/
  • 7 https://www.techtarget.com/searchdatacenter/feature/The-increasing-concern-of-data-center-land-acquisition
  • 8 https://www.langan.com/news-and-insights/insights/a-guide-to-data-center-sustainability-trends-challenges-and-engineering-solutions
  • 9,10,16,17 https://www.iea.org/reports/energy-and-ai/energy-demand-from-ai
  • 11 https://www.globalplasticaction.org/resources/publications/global-data-centres-expected-to-consume-more-electricity-than-japan-by-2030,-and-other-technology-news-you-need-to-know/753e15cda3498b255d5393e03a41f656d2963fb8
  • 12 https://towardsdatascience.com/the-carbon-footprint-of-gpt-4-d6c676eb21ae/
  • 13 https://n3xtcoder.org/energy-impact-of-ai
  • 14 https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
  • 15 https://www.cnbc.com/2025/03/12/amazon-google-and-meta-support-tripling-nuclear-power-by-2050.html
  • 18 https://www.energy.gov/cmei/femp/best-management-practice-10-cooling-tower-management
  • 19 https://nepis.epa.gov/Exe/ZyPDF.cgi?Dockey=P100TFYV.PDF
  • 20 https://www.forbes.com/sites/monicasanders/2026/03/20/world-water-day-and-the-hidden-water-footprint-of-ai/
  • 21 https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/
  • 22 https://www.datacenterdynamics.com/en/news/google-pauses-chilean-data-center-project-to-rethink-water-use/
  • 23,33 https://sustainableict.blog.gov.uk/2025/09/17/ais-thirst-for-water/
  • 24 https://www.theverge.com/news/685045/sam-altman-average-chatgpt-energy-water
  • 25 https://www.theverge.com/2024/1/19/24044070/sam-altman-says-the-future-of-ai-depends-on-breakthroughs-in-clean-energy
  • 26 https://www.techpolicy.press/decolonizing-the-future-karen-hao-on-resisting-the-empire-of-ai/
  • 27 https://andymasley.com/writing/empire-of-ai-is-wildly-misleading/
  • 28 https://escholarship.org/uc/item/32d6m0d1
  • 29,30 https://www.reuters.com/sustainability/boards-policy-regulation/investors-press-amazon-microsoft-google-water-power-use-us-data-centers-2026-04-06/
  • 31,32 https://www.sciencedirect.com/science/article/pii/S2666389925002788#sec4
  • 34 https://epoch.ai/gradient-updates/after-the-chatgpt-moment-measuring-ais-adoption
  • 35 https://technologychecker.io/blog/chatgpt-statistics
  • 36 https://www.reuters.com/technology/artificial-intelligence/openais-weekly-active-users-surpass-400-million-2025-02-20/
  • 37 https://www.demandsage.com/chatgpt-statistics/
  • 38,39 https://aifundingtracker.com/chatgpt-vs-claude-vs-gemini/
  • 40 https://www.hubspot.com/startups/ai/ai-stats-for-startups
  • 41 https://www.hostinger.com/tutorials/how-many-companies-use-ai
  • 42 https://futurism.com/artificial-intelligence/over-50-percent-internet-ai-slop
  • 43 https://journal.everypixel.com/ai-image-statistics
  • 44 https://epoch.ai/blog/what-will-ai-look-like-in-2030#conclusion
