If you want evidence of Microsoft’s progress towards its environmental “moonshot” target, then look closer to the ground: at a construction site on an industrial estate in west London.
The company’s Park Royal data center is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition is at odds with its goal to be carbon negative by 2030.
Microsoft says the center will run entirely on renewable energy. However, building data centers and the servers that fill them means the company's scope 3 emissions – such as the CO2 associated with the materials in its buildings and the electricity people consume when using products such as the Xbox – are more than 30% above their 2020 level. As a result, the company is exceeding its overall emissions target by roughly the same rate.
This week, Microsoft co-founder Bill Gates claimed that AI would help fight climate change because big tech is "seriously willing" to pay extra to use clean sources of electricity in order "to say that they are using green energy".
In the short term, AI has been problematic for Microsoft's green goals. Brad Smith, Microsoft's outspoken president, once called his carbon ambitions a "moonshot." In May, stretching that metaphor to breaking point, he admitted that because of its AI strategy, "the moon has moved." The company plans to spend £2.5 billion over the next three years on growing its UK AI data center infrastructure, and this year it has announced new data center projects around the world, including in the US, Japan, Spain and Germany.
Training and running the AI models that power products such as OpenAI's ChatGPT and Google's Gemini requires a lot of electricity to power and cool the associated hardware, with additional carbon generated by manufacturing and transporting that equipment.
“It’s a technology that is increasing energy consumption,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.
The International Energy Agency estimates that the total electricity consumption of data centers could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the power demand of Japan. Artificial intelligence will result in data centers using 4.5% of global energy generation by 2030, according to calculations by research firm SemiAnalysis.
It means that, amid concerns about AI's impact on jobs and on humanity itself, the environment is emerging as another battleground. Last week, the International Monetary Fund said governments should consider imposing carbon taxes to capture the environmental cost of AI – either a general carbon tax whose scope captures emissions from servers, or other methods such as a specific levy on the CO2 those servers generate.
All the big tech firms involved in AI – Meta, Google, Amazon, Microsoft – are looking to renewable energy sources to meet their climate targets. In January, Amazon, the world's largest corporate buyer of renewable energy, announced it had bought more than half the output of an offshore wind farm in Scotland, while Microsoft said in May it was backing $10 billion (£7.9 billion) in renewable energy projects. Google aims to run its data centers entirely on carbon-free energy by 2030.
A Microsoft spokesperson said: “We remain steadfast in our commitment to meet our climate goals.”
Microsoft co-founder Bill Gates, who stepped down in 2020 but retains a stake in the company through the Gates Foundation Trust, has argued that AI can directly help combat climate change. Additional electricity demand will be matched by new investment in green generation, he said on Thursday, which would more than offset usage.
A recent UK government-backed report agreed, stating that "the carbon intensity of the energy source is a key variable" in calculating AI-related emissions, although it adds that "a significant part of the training of AI globally still relies on high-carbon resources like coal or natural gas." The water needed to cool servers is also a problem, with one study estimating that AI could account for up to 6.6 billion cubic meters of water use by 2027 – nearly two-thirds of England's annual consumption.
De Vries argues that the pursuit of sustainable computing power strains the supply of renewable energy, with the result that fossil fuels will pick up the slack in other parts of the global economy.
“More energy consumption means we don’t have enough renewable resources to fuel this growth,” he says.
NexGen Cloud, a UK firm that provides sustainable cloud computing – an industry that relies on data centers to deliver IT services such as data storage and computing power over the internet – says renewable energy is available for AI-related computing if data centers avoid cities and are located near hydro or geothermal sources.
Youlian Tzanev, co-founder of NexGen Cloud, says: "The industry norm has been to build around economic centers rather than renewable energy sources."
This makes it harder for any AI-focused tech company to meet carbon goals. Amazon, the world's largest cloud computing provider, aims to be net zero – removing as much carbon as it emits – by 2040, and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft data centers to train and operate its products.
There are two main ways in which large language models – the technology that supports chatbots like ChatGPT or Gemini – consume energy. The first is the training phase, where a model is fed datasets collected from the Internet and beyond, and builds a statistical understanding of the language itself, which ultimately enables it to provide convincing answers to questions.
The initial energy cost of AI training is astronomical. This keeps smaller companies (and even smaller governments) from competing in the sector unless they have a spare $100 million to throw at a training run. But it is dwarfed by the cost of running the resulting models, a process known as "inference." According to analyst Brent Thill, at the investment firm Jefferies, 90% of the energy cost of AI lies in that inference stage: the electricity used when people ask an AI system to answer factual questions, summarize a piece of text or write an academic essay.
The electricity used for training and inference flows through a large and growing digital infrastructure. Data centers are filled with servers, which are built from the ground up for the specific part of the AI workload they handle. A single training server may have a central processing unit (CPU) barely more powerful than the one in your computer, paired with dozens of specialized graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to race through the vast volume of simple calculations that AI models are made of.
If you use a chatbot, as you watch it type out its response word by word, a powerful GPU is drawing about a quarter of the power needed to boil a kettle. All of this is hosted by a data center, either owned by the AI provider itself or by a third party – in which case it might be called the "cloud", a fancy name for someone else's computer.
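The kettle comparison can be sanity-checked with rough numbers. As an illustration only (the specific figures are assumptions, not from this article): a UK kettle draws roughly 3 kW, while a top-end data-center GPU such as an NVIDIA H100 has a board power of about 700 W.

```python
# Back-of-envelope check of the "quarter of a kettle" claim.
# Assumed figures (illustrative, not from the article):
kettle_watts = 3000   # a typical UK kettle draws ~3 kW
gpu_watts = 700       # board power of a high-end data-center GPU

fraction = gpu_watts / kettle_watts
print(f"GPU draws about {fraction:.0%} of a kettle's power")
```

With those assumptions the GPU lands at just under a quarter of the kettle's draw, consistent with the comparison above.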
SemiAnalysis estimates that if generative AI were integrated into every Google search, this could translate into annual energy consumption of 29.2 TWh, comparable to what Ireland consumes in a year, although the financial cost to the tech company would be prohibitive. This has led to speculation that the search company may start charging for some AI tools.
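The 29.2 TWh figure is consistent with simple arithmetic on published estimates. As a hedged sketch (the per-search figures below are assumptions drawn from public estimates, not stated in this article): roughly 9 billion Google searches a day at about 8.9 Wh per AI-assisted search.

```python
# Back-of-envelope reconstruction of the 29.2 TWh/year estimate.
# Assumed inputs (illustrative): ~9 billion searches/day,
# ~8.9 Wh per AI-assisted search (vs ~0.3 Wh for a conventional one).
searches_per_day = 9e9
wh_per_ai_search = 8.9

wh_per_year = searches_per_day * wh_per_ai_search * 365
twh_per_year = wh_per_year / 1e12   # 1 TWh = 1e12 Wh

print(f"{twh_per_year:.1f} TWh per year")  # ≈ 29.2 TWh
```

The same arithmetic with the conventional ~0.3 Wh figure gives under 1 TWh a year, which is why bolting generative AI onto every search changes the energy picture so sharply.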
But some argue that looking at excess energy for AI is the wrong lens. Instead, consider the energy that new tools can save. A provocative paper in the peer-reviewed journal Scientific Reports earlier this year argued that the carbon emissions of writing and illustration are lower for AI than for humans.
AI systems emit “between 130 and 1,500 times” less carbon dioxide per page of text created compared to human writers, researchers from the University of California Irvine estimated, and up to 2,900 times less per image.
Left unsaid, of course, is what those human writers and illustrators would do instead. Redeploying and retraining them for work in another field – such as green jobs – may be another moonshot.