Amazon (AMZN) is ubiquitous in today’s world, not just for being one of the biggest and most established online marketplaces but also for being among the largest data center providers.
What Amazon is far less known for is running data centers that draw their power directly from a nuclear plant.
Yet that’s exactly what its cloud subsidiary, AWS, did in March, purchasing a $650 million nuclear-powered data center from Talen Energy in Pennsylvania.
On the surface, the deal indicates Amazon’s ambitious expansion plans. But dig deeper, and the company’s purchase of a nuclear power facility speaks to a broader issue that Amazon and other tech giants are grappling with: the insatiable demand for energy from artificial intelligence.
The deal allows AWS to co-locate its rapidly expanding AI data centers next to a dedicated power source, keeping up with the energy demands that artificial intelligence has created.
The strategy is a symptom of an energy reckoning that has been building as AI has been creeping into consumers’ daily lives — powering everything from internet searches to smart devices and cars.
Companies like Google (GOOG, GOOGL), Apple (AAPL), and Tesla (TSLA) continue to enhance AI capabilities with new products and services. Each AI task requires vast computational power, which translates into substantial electricity consumption through energy-hungry data centers.
Estimates suggest that by 2027, global AI-related electricity consumption could rise by 64%, reaching up to 134 terawatt hours annually — or the equivalent of the electricity usage of countries like the Netherlands or Sweden.
This raises a critical question: How are Big Tech companies addressing the energy demands that their future AI innovations will require?
The rising energy consumption of AI
According to Pew Research, more than half of Americans interact with AI at least once a day.
Prominent researcher and data scientist Sasha Luccioni, who serves as the AI and climate lead at Hugging Face, a company that builds tools for AI applications, often discusses AI’s energy consumption.
Luccioni explained that while training AI models is energy-intensive — training the GPT-3 model, for example, used about 1,300 megawatt-hours of electricity — it typically only happens once. However, the inference phase, where models generate responses, can require even more energy due to the sheer volume of queries.
For example, when a user asks an AI model like ChatGPT a question, the request is sent to a data center, where powerful processors generate a response. This process, though quick, uses approximately 10 times more energy than a typical Google search.
“The models get used so many times, and it really adds up quickly,” Luccioni said. She noted that depending on the size of the model, 50 million to 200 million queries can consume as much energy as training the model itself.
“ChatGPT gets 10 million users a day,” Luccioni said. “So within 20 days, you have reached that ‘ginormous’ … amount of energy used for training via deploying the model.”
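Luccioni's back-of-envelope math is easy to reproduce. The short Python sketch below takes the figures quoted above at face value (50 million to 200 million queries consuming as much energy as training, and roughly 10 million queries a day, treating each daily user as one query) and computes how quickly inference energy catches up with the one-time training cost. The numbers are illustrative, not measured values:

```python
# Figures quoted in the article; treated here as rough, illustrative inputs.
QUERIES_EQUAL_TO_TRAINING = (50_000_000, 200_000_000)  # range Luccioni cites
QUERIES_PER_DAY = 10_000_000  # "ChatGPT gets 10 million users a day"

for queries in QUERIES_EQUAL_TO_TRAINING:
    days = queries / QUERIES_PER_DAY
    print(f"{queries:,} queries = one training run -> reached in {days:.0f} days")
```

At the top of the range, serving queries matches the entire training energy budget in 20 days, which is the figure Luccioni cites; at the bottom of the range it takes just 5 days.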
The largest consumers of this energy are Big Tech companies, known as hyperscalers, that have the capacity to scale AI efforts rapidly with their cloud services. Microsoft (MSFT), Alphabet, Meta (META), and Amazon alone are projected to spend $189 billion on AI in 2024.
As AI-driven energy consumption grows, it puts additional strain on already overburdened energy grids. Goldman Sachs projects that by 2030, global data center power demand will grow by 160%, and data centers could account for 8% of total electricity demand in the US, up from 3% in 2022.
This strain is compounded by aging infrastructure and the push toward the electrification of cars and manufacturing in the US. According to the Department of Energy, 70% of US transmission lines are nearing the end of their typical 50- to 80-year life cycle, increasing the risk of outages and cyberattacks.
Moreover, renewable energy sources are struggling to keep pace.
Luccioni pointed out that grid operators are extending the use of coal-powered plants to meet the rising energy needs, even as renewable energy generation expands.
AI upends Big Tech sustainability pledges
Microsoft and Google have acknowledged in their sustainability reports that AI has hindered their ability to meet climate targets. For instance, Microsoft’s carbon emissions have increased by 29% since 2020 due to AI-related data center construction.
Still, renewable energy remains a crucial part of Big Tech’s strategies, even if it cannot meet all of AI’s energy demands.
In May 2024, Microsoft signed the largest corporate power purchase agreement on record with property and asset management giant Brookfield to deliver over 10.5 gigawatts of new renewable power capacity globally through wind, solar, and other carbon-free energy generation technologies. Additionally, the company has invested heavily in carbon removal efforts to offset an industry-record 8.2 million tons of emissions.
Amazon has also made significant investments in renewable energy, positioning itself as the world’s largest corporate purchaser of renewable energy for the fourth consecutive year. The company’s portfolio now includes enough wind and solar power to supply 7.2 million US homes annually.
However, as Yahoo Finance reporter Ines Ferre noted (video above), “The issue with renewables is that at certain times of the day, you have to also go into energy storage because you may not be using that energy at that time of the day.”
Beyond sourcing cleaner energy, Big Tech is also investing in efficiency. Luccioni said companies like Google are now developing AI-specific chips, such as the Tensor Processing Unit (TPU), that are optimized for AI tasks instead of relying on graphics processing units (GPUs), which were originally designed for gaming.
Nvidia claims that its latest Blackwell GPUs can reduce AI model energy use and costs by up to 25 times compared to earlier versions.
For a glimpse of what lies ahead for tech firms that don’t manage energy costs, look no further than Taiwan Semiconductor Manufacturing Company (TSM). TSMC makes more than 90% of the world’s most advanced AI chips and has seen energy costs double over the past year, reducing the company’s margins by nearly a full percentage point, according to CFO Wendell Huang.
In order to more accurately gauge energy demands and reduce future costs, experts say transparency is key.
“We need more regulation, especially around transparency,” said Luccioni, who is working on an AI energy star-rating project that aims to help developers and users choose more energy-efficient models by benchmarking their energy consumption.
When it comes to tech companies’ priorities, always follow the money, or in this case, the investments. Utility companies and tech giants are expected to spend $1 trillion on AI in the coming years.
But according to Luccioni, AI might not just be the problem — it could also be part of the solution for addressing this energy crunch.
“AI can definitely be part of the solution,” Luccioni said. “Knowing, for example, when a … hydroelectric dam might need fixing, [and the] same thing with the aging infrastructure, like cables, fixing leaks. A lot of energy actually gets lost during transmission and during storage. So AI can be used to either predict or fix [it] in real-time.”