1. In December, the UN Environment Assembly will debate how to mitigate the mounting environmental cost of AI. As well it might – while the new tech’s pattern-spotting capabilities are being used to do lovely green things like chart emissions of methane and map the destructive dredging of sand, it’s also burning through resources at a terrifying rate.
2. That’s because, to store data and to process and respond to queries, AI relies on vast data centres. These, like all large-scale construction projects, require a lot of concrete and take up a lot of land.
3. In a rare bit of good news, the supply of renewable energy has recently exploded – but not enough, alas, to stop data centres from relying on fossil fuels for the electricity they require (an estimated 1.5% of the entire world’s supply in 2024). As a result, such centres are also thought to account for about 1% of all energy-related greenhouse gas emissions.
4. The demands data centres place on the grid can also lead to a less reliable power supply for those living near them. That means they can’t be sure they can even use the AI – or, come to that, their lights, the fridge and so on.
5. In case you’re thinking that 1.5% of global electricity demand doesn’t sound like very much: some estimates suggest data centres’ power needs are now roughly on a par with the electricity used by the entirety of Austria. And thanks to Ireland’s booming tech industry, AI could account for a third of that country’s energy usage by 2026.
6. Oh, and because data centres generate so much heat, a large one can require as much as 18.9m litres of water to stay cool each day – roughly the same as a town of 50,000 people. One estimate predicts that, by 2027, AI-related infrastructure will consume six times more water than Denmark, or half that used in the UK.
7. To reduce some of these costs, Microsoft is experimenting with putting data centres underwater. Cool! Signs are, alas, that this will raise local water temperatures and reduce oxygen levels in marine environments. Back to the drawing board.
8. It feels worth noting at this point that a quarter of humanity currently does not have access to clean water.
9. Large digital infrastructure also requires mining a vast array of minerals – take a deep breath: silicon, cobalt, copper, aluminium, lithium, manganese, natural graphite, nickel, rare earth elements and gold – with all the physical destruction, energy use, groundwater depletion, soil contamination, deforestation, soil erosion and biodiversity loss that implies.
10. To put all this in context, Jon Ippolito, professor of new media at the University of Maine, has attempted to use publicly available data to calculate the environmental footprint of using AI to find information. “Tell me the capital of France” took 23 times more energy using AI than it did using good old-fashioned search engines. “Tell me the number of gummy bears that could fit into the Pacific Ocean” used 210 times more. That’s because search engines look up the answer, whereas AI tries to work it out from first principles.
11. Research from Sasha Luccioni, of the Université de Montréal, also suggests that generative tasks (those that involve creating new data) require around 10 times as much energy as classification tasks (those that involve processing existing data).
12. As the technology becomes more efficient, all this will get better though, right? Well, possibly not. In a phenomenon known as the Jevons paradox, increases in efficiency often make new technology so much cheaper and easier to use that we end up using more of it – enough to overwhelm any energy savings. That’s what happened with things like lighting, heating and transport technologies – we travel further, or heat or light rooms we are not in, because there’s so little cost to doing so. There’s no reason to imagine we won’t also do this with AI – thoughtlessly using it for basic search queries, say, or taking AI-powered self-driving cars when we could instead just cycle or walk.
13. So the UN Environment Programme is calling on countries to do more to regulate and measure the impact of AI, and demanding that tech companies make their work more efficient. Luccioni and her colleagues have also noted that fine-tuned “task-specific” AI models emit just 3% of the carbon that general-purpose models do – which makes it a shame that so much of the tech industry is focused on the latter.
14. AI isn’t the only reason these data centres are burning through energy, of course. Most of that demand almost certainly comes from other big tech products like social media sites, streaming services and, worst of all, cryptocurrency. It’s just possible that we’re ever so slightly in trouble.
Suggested Reading
AI: What is it good for?
500,000 – Estimated number of data centres in 2012
8 million – Estimated number of data centres today
800kg – Quantity of raw materials required to produce 2kg of digital infrastructure
181 zettabytes – Quantity of new data set to be generated this year (that is, 181 million million gigabytes)
2,000 zettabytes – Quantity of new data estimated to be generated in 2035
15,000 – Number of Google searches you could perform for the same energy cost as generating three seconds of AI video
One year – Length of time you could leave a lightbulb burning for the same energy
