AI Infrastructure Power Play
Sponsored by:
NOTE FROM THE EDITOR

Juan Pedro Tomás
August 12, 2025
How AI infrastructure is impacting power grids
The explosive demand for AI is reshaping the energy equation. As hyperscalers and enterprises deploy high-density compute infrastructure to train and run AI models, electricity consumption is surging. Utilities, regulators, and data center operators must now collaborate to expand grid capacity, accelerate clean energy adoption, and manage load demands—all while navigating public scrutiny and sustainability goals.
Included in this trendline:
- AI’s energy surge drives urgent upgrades to grids and cooling systems
- States clash over how energy policy should shape AI infrastructure growth
- AI strains the grid—but also helps optimize and stabilize it
- Power bottlenecks emerge as the key risk to AI’s continued scale
RCRtech Trends cuts through the noise to unpack the biggest shifts shaping your industry. These in-depth reports, crafted by our expert editorial team, give business leaders the insight they need to stay ahead of change and make smarter strategic decisions.
Takeaways:
- AI’s soaring power needs are forcing major infrastructure upgrades
The rise of AI, especially large models and real-time applications, is creating energy spikes that current grids and data centers can’t handle. This is driving urgent investments in smart grids, edge data centers, renewable energy integration and efficient cooling systems to avoid blackouts and reduce emissions.
- Energy policy becomes a battleground for AI growth
As AI infrastructure expands, states and regulators are debating how to manage its energy impact. Some, like Arizona, are reviewing pricing fairness and transparency, while others, like West Virginia, are cutting regulations and promoting fossil fuels to attract investment. These policy choices will shape where and how AI infrastructure grows.
- AI drives energy demand—but also optimizes it
While AI is accelerating electricity consumption, it’s also being used to manage it more intelligently. Utilities are leveraging AI to predict consumption spikes, balance loads and improve efficiency. This dual role makes AI both a challenge and a solution in the race toward a more sustainable energy future.
- Power bottleneck threatens to stall AI’s explosive growth
Power has become the critical bottleneck in AI infrastructure growth, with U.S. data centers expected to add 50 GW of demand annually. Hyperscalers are investing heavily, but without major innovations in energy efficiency and alternative sources like SMRs or repurposed coal plants, AI’s expansion risks hitting a hard limit.

What infra upgrades are needed to handle AI energy spikes?

Juan Pedro Tomás
April 17, 2025
AI workloads don’t just consume energy—they consume it with patterns that are impossible to predict. This is why infrastructure upgrades are key
In sum – what you need to know:
AI energy demand – rapid expansion of AI tech, especially large models and applications, drives unpredictable energy spikes that strain outdated power grids and data centers.
Smart infrastructure – modernizing power grids, strengthening data center systems, and integrating renewable energy storage are critical to managing AI energy sustainably.
AI to the rescue – AI-driven tools are being used to predict demand patterns, optimize grid performance, and prevent energy overloads – to ensure reliable energy delivery.
As artificial intelligence (AI) technologies advance at a rapid pace, so do their energy demands. From training massive AI models like ChatGPT to running real-time applications in industries and homes, AI systems need significant computing power. This increased usage leads to sharp spikes in energy consumption — especially in data centers and power grids. To handle this growing demand, smart infrastructure upgrades are needed to ensure reliable energy delivery, reduce emissions and support the continued expansion of AI.
Why AI needs so much energy
AI systems — especially large language models and generative AI—rely on high-performance computing hardware such as GPUs and specialized chips like TPUs. These machines run continuously, often in large data centers. Training one large AI model can consume as much electricity as hundreds of homes use in a year.
AI also drives energy use beyond data centers. AI-powered applications like autonomous vehicles, smart factories and predictive healthcare run around the clock and require low-latency data processing. That creates additional demand on both local computing systems and the energy grid.
The problem with sudden spikes
AI workloads don’t just consume energy—they consume it in patterns that are impossible to predict. A company might suddenly launch a new AI product, or traffic on an app might surge due to a viral trend. These events can cause short bursts of very high energy usage.
Traditional energy systems are not built to respond instantly to such spikes. If the local power infrastructure is outdated or already strained, these surges can lead to blackouts or reduced performance. That’s why energy systems must evolve to handle AI’s unpredictable demands.
Key infrastructure upgrades needed
Here are some of the most important upgrades that can help manage AI-driven energy spikes:
1. Modernize power grids
A smart grid uses sensors, automation and real-time data to monitor and manage electricity flow. Unlike traditional grids, which are slow to respond, smart grids can detect energy spikes and automatically redirect power where it’s needed.
Smart grids also make it easier to balance supply and demand, avoid overloads, and integrate renewable energy sources.
2. Strengthen data center power systems
Data centers are the backbone of AI computing. Operators need to:
- Install redundant power supplies and battery backups.
- Use intelligent load balancing to distribute computing tasks across multiple centers.
- Upgrade to modular power architectures that scale efficiently with AI demand.
Many centers are also turning to liquid cooling systems to reduce the energy waste of traditional air-based cooling.
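The load-balancing idea above can be sketched in a few lines: route each compute job to whichever facility has the most spare power capacity. This is a minimal illustration, not any operator's actual scheduler; the site names and capacity figures are hypothetical.

```python
# Illustrative sketch of intelligent load balancing across data centers.
# Site names and capacity figures are hypothetical, not from the article.

def pick_site(sites, job_kw):
    """Route a compute job to the site with the most spare power capacity."""
    viable = [s for s in sites if s["capacity_kw"] - s["load_kw"] >= job_kw]
    if not viable:
        raise RuntimeError("no site has enough spare capacity")
    # Choose the site with the largest headroom to keep load spread out.
    best = max(viable, key=lambda s: s["capacity_kw"] - s["load_kw"])
    best["load_kw"] += job_kw
    return best["name"]

sites = [
    {"name": "east", "capacity_kw": 5000, "load_kw": 4200},
    {"name": "west", "capacity_kw": 5000, "load_kw": 2500},
]
print(pick_site(sites, 300))  # routes to "west", the site with most headroom
```

Production schedulers weigh many more factors (latency, data locality, electricity price), but the principle is the same: treat power headroom as a first-class scheduling constraint.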
3. Build more edge data centers
Edge computing brings data processing closer to users. These edge data centers reduce latency and relieve pressure on centralized data centers.
Deploying more edge data centers can help absorb local AI traffic surges and reduce long-distance power and data transfer needs.
4. Integrate renewable energy with storage
AI’s growing energy footprint raises concerns about sustainability. To reduce emissions, infrastructure must rely more on renewable energy, such as solar and wind.
However, renewables are not always available on demand. The solution is battery storage and energy management software that store excess energy when it’s produced and release it when AI demand spikes.
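The charge-when-surplus, discharge-when-spiking logic behind that pairing can be shown as a toy dispatch step. All numbers are illustrative; real energy management software adds forecasting, efficiency losses and price signals.

```python
# Toy dispatch step: charge the battery when renewable supply exceeds demand,
# discharge it when an AI load spike outruns supply. Figures are illustrative.

def dispatch(supply_kw, demand_kw, battery_kwh, battery_cap_kwh, step_h=1.0):
    """Return (new battery state in kWh, unserved load in kWh) for one step."""
    surplus_kwh = (supply_kw - demand_kw) * step_h
    if surplus_kwh >= 0:
        # Store excess renewable output, up to the battery's capacity.
        battery_kwh = min(battery_cap_kwh, battery_kwh + surplus_kwh)
        unserved_kwh = 0.0
    else:
        # Cover the shortfall from storage; anything left is unserved load.
        draw = min(battery_kwh, -surplus_kwh)
        battery_kwh -= draw
        unserved_kwh = -surplus_kwh - draw
    return battery_kwh, unserved_kwh

# Sunny hour: 100 kW supply vs 60 kW demand charges an empty 50 kWh battery.
print(dispatch(supply_kw=100, demand_kw=60, battery_kwh=0, battery_cap_kwh=50))
```

Run over a day of forecast supply and demand, a loop like this is how storage "firms up" intermittent renewables against AI demand spikes.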
5. Use AI to manage energy itself
AI can help manage the very problem it causes. Energy operators are now using AI for:
- Predicting demand patterns in real time
- Optimizing grid performance
- Preventing overloads before they happen
By using AI to analyze usage data and weather conditions, operators can make smarter decisions about energy distribution.
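The prediction-and-prevention idea reduces to: forecast the next demand reading and raise a flag before it nears capacity. The sketch below stands in a trailing average for the forecaster; real utility systems use ML models plus weather data, and all thresholds here are hypothetical.

```python
# Minimal sketch of overload prediction: flag periods where forecast demand
# approaches available capacity. A trailing average stands in for the
# ML forecaster a real utility would use; thresholds are hypothetical.

def forecast_next(history, window=3):
    """Predict the next reading as the mean of the last `window` readings."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def overload_risk(history, capacity_kw, margin=0.9, window=3):
    """True if forecast demand exceeds 90% of available capacity."""
    return forecast_next(history, window) > capacity_kw * margin

demand = [700, 820, 940]  # recent demand samples in kW (illustrative)
print(overload_risk(demand, capacity_kw=900))  # forecast 820 kW > 810 kW -> True
```

When the flag trips, an operator can shed deferrable load, dispatch storage, or reroute work before the spike lands rather than after.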
Conclusion
AI is here to stay—and it will only become more powerful. If infrastructure doesn’t keep up, energy shortages and environmental damage could follow. But with smart investments in power grids, data center design, renewable energy and AI-driven management, we can support AI’s growth sustainably and responsibly. That will require governments, energy companies and tech firms to work together to modernize infrastructure now.

AI infra brief: Power struggles behind AI growth

Juan Pedro Tomás
April 16, 2025
As AI workloads soar and hyperscale data centers multiply across the U.S., lawmakers and regulators are confronting a critical question: who should shoulder the rising cost of powering AI-scale infrastructure?
This week, three key developments underscore the growing political and economic tensions around energy consumption and emissions in the age of AI.
U.S. Senators propose Clean Cloud Act to curb AI and crypto emissions
Senators Sheldon Whitehouse (D-RI) and John Fetterman (D-PA) introduced the “Clean Cloud Act of 2025,” a federal proposal to amend the Clean Air Act and bring crypto miners and AI data centers under direct emissions oversight. The bill would mandate that any facility using more than 100 kilowatts of power report its energy mix and carbon emissions — and cut emissions by 11% per year, with a target of net-zero operations by 2035 powered entirely by renewables. Facilities that fail to comply would face penalties starting at $20 per kilowatt, a fee that could rise if targets aren’t met. The legislation would also require operators to submit annual transparency reports on power use and emission reduction strategies. Read more
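A rough arithmetic sketch shows why the bill pairs the annual cut with a full-renewables target: an 11% reduction compounding each year (assuming a 2025 baseline and a flat schedule, which the article does not spell out) still leaves roughly a third of baseline emissions by 2035.

```python
# Rough sketch: cumulative effect of an 11%-per-year emissions cut over the
# 10 years from 2025 to 2035. Assumes the cut compounds annually from a 2025
# baseline; the bill's exact schedule is not detailed in the article.
remaining = (1 - 0.11) ** 10  # fraction of baseline emissions left by 2035
print(f"{remaining:.0%}")     # about 31% of the baseline would still be emitted
```

Hence the renewables mandate does the remaining work of getting covered facilities to net zero.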
Takeaway: This is the boldest federal attempt yet to align AI infrastructure with climate goals — but it faces stiff opposition in a Republican-led Senate.
Arizona opens formal review into data center energy fairness
Arizona’s Corporation Commission (ACC) has launched a formal investigation into whether data centers — particularly those serving AI and cloud computing — are paying their fair share of the state’s infrastructure costs. With 129 data centers now operating across the state, many benefiting from favorable utility rates and tax incentives, questions are growing over whether residential and small business customers are effectively subsidizing large-scale operators. The ACC will evaluate new pricing models, including time-of-use rates, “behind-the-meter” options, and infrastructure cost-sharing schemes. According to Chair Kevin Thompson, the goal is to ensure fairness in cost distribution without discouraging future investment in Arizona’s growing digital economy. Read more
Takeaway: Arizona joins a growing number of states rethinking how they price energy for digital infrastructure — especially as grid stress and public scrutiny increase.
West Virginia pushes bill to fast-track data centers, backs coal
West Virginia’s legislature passed HB 2014, a bill that fast-tracks the approval process for new data centers and microgrids by eliminating local oversight and green energy requirements. Under the bill, fossil fuels — including coal — can be used to power AI infrastructure without environmental or zoning reviews at the municipal level. The bill is part of Governor Patrick Morrisey’s broader strategy to make West Virginia a destination for AI and heavy industrial investment by emphasizing low-cost, reliable power. Read more
Takeaway: West Virginia’s bet on deregulation and fossil fuels shows how energy policy is emerging as a competitive tool in the race for AI infrastructure.
Big picture
Across the United States, the explosive energy demands of AI are triggering deep regulatory rifts. Some states are introducing emissions mandates, transparency laws and fair-pricing reviews to make AI development more sustainable and accountable. Meanwhile, other states are opting for deregulation and fossil fuel incentives with the aim of attracting investment. As data centers continue to proliferate and grid loads surge, the fundamental questions of who pays — and how clean that power should be — remain unresolved.
What else is powering AI infra today?
SteelDome, Supermicro launch edge-to-cloud AI infra
EU AI funding surges 55% to €3 billion in Q1
Auradine raises $153 million to scale AI, blockchain infra
Nokia and Zayo trial 800Gb/s optical tech to power the next generation of AI-ready data centers
Follow AI Infrastructure Insights on LinkedIn to get more AI infra briefs.

AI to double electricity demand from DCs by 2030: IEA

Juan Pedro Tomás
April 11, 2025
The IEA report predicts that AI processing in the U.S. will need more electricity than heavy industries such as steel, cement and chemicals combined
In brief – why this matters
- AI is set to double electricity demand from data centers by 2030, reaching 945 TWh — more than Japan’s total consumption today, according to the IEA.
- In countries like the U.S., AI will drive nearly half of electricity demand growth, surpassing heavy industries, while also raising concerns about energy security and critical mineral supply.
- AI could help reduce emissions and improve energy efficiency, with the IEA calling for investments in power generation, smarter grids and data center efficiency.
Artificial intelligence (AI) is expected to double global electricity use from data centers by 2030, according to a new report from the International Energy Agency (IEA).
At the same time, AI offers big opportunities to make energy systems more efficient, reduce costs and cut emissions, the report stated.
The IEA’s Energy and AI report predicts that data centers will use around 945 terawatt-hours (TWh) of electricity in 2030 — which is more than Japan uses today. AI is the main reason for this growth, especially as more AI models are trained and used at scale, IEA said.
In countries like the United States, nearly half of all electricity demand growth by 2030 will come from data centers. The report says that by then, AI processing in the U.S. will need more electricity than heavy industries such as steel, cement and chemicals combined.
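The headline "double by 2030" claim implies a steep but quantifiable growth rate. A quick sketch, taking the baseline as half the 2030 figure and a six-year horizon from 2024 (both derived from the "double" claim rather than stated explicitly in the article):

```python
# Implied annual growth rate if data center electricity demand doubles
# between 2024 and 2030. The ~472 TWh baseline is inferred as half of the
# IEA's 945 TWh projection, not a figure quoted in the article.
target_twh = 945.0
baseline_twh = target_twh / 2          # ~472 TWh, implied by the "double" claim
years = 6
cagr = (target_twh / baseline_twh) ** (1 / years) - 1
print(f"{cagr:.1%}")                   # roughly 12.2% per year
```

A sustained ~12% annual growth rate is why the report emphasizes new generation and grid investment rather than efficiency gains alone.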
To meet this demand, a mix of energy sources will be used, with renewables and natural gas playing the biggest roles due to their cost and availability.
“AI is one of the biggest stories in the energy world today – but until now, policy makers and markets lacked the tools to fully understand the wide-ranging impacts,” said IEA executive director Fatih Birol.
The new IEA report warns that AI could also create new energy challenges. Cyberattacks on energy companies have tripled in four years, with AI making them more advanced. However, AI can also help defend against such threats. It may also cause higher demand for critical minerals, which are used in the hardware of data centers, the report said.
Still, AI could help reduce emissions by improving energy use and speeding up innovation in clean technologies like solar panels and batteries, it added.
“With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time,” said Birol. “AI is a tool, potentially an incredibly powerful one, but it is up to us – our societies, governments and companies – how we use it.”
To make the most of AI, the IEA recommends investing in new power generation, building better energy grids and improving data center efficiency. The report builds on work from the IEA’s 2024 Global Conference on Energy and AI and the AI Action Summit.
The IEA also said it will soon launch a new Observatory on Energy, AI and Data Centers, which will gather the most comprehensive and recent data worldwide on AI’s electricity needs, in addition to tracking cutting-edge AI applications across the energy sector.
One of the biggest challenges that AI data centers face is their enormous power consumption. Unlike traditional data centers, which primarily handle storage and processing for standard enterprise applications, AI data centers must support intensive workloads such as deep learning, large-scale data analytics and real-time decision-making.
As AI adoption continues to expand, so does the need for energy-efficient solutions. While power demands are a major challenge, innovations in hardware design, renewable energy and cooling technologies offer promising ways to mitigate their environmental impact.

Powering the AI boom: Energy challenges behind AI growth

Juan Pedro Tomás
April 21, 2025
Energy demand for AI data centers in the U.S. is expected to grow by about 50 gigawatts each year for the coming years, according to Aman Khan, CEO of International Business Consultants
In sum – what you need to know:
AI fuels infrastructure race – Hyperscalers like Microsoft, Google, Amazon and Meta have invested over $200 billion in AI infrastructure in 2024, with plans to exceed $220 billion in 2025, driving massive demand for power-hungry data centers.
Power becomes the bottleneck – AI workloads, especially training, require unprecedented electricity levels. Inference tasks like ChatGPT searches can use up to 30 times more energy than standard searches.
Energy innovation is critical – To meet growing demand, firms are optimizing power usage and exploring diverse sources including renewables, small nuclear reactors, and converted coal plants.
As AI adoption accelerates, the global tech ecosystem is scrambling to scale up its infrastructure. But powering this transformation comes with serious energy challenges, said Aman Khan, CEO of International Business Consultants, in a recent webinar organized by DatacenterDynamics.
Khan emphasized that AI models — particularly those used for training — require massive compute power, which translates into immense power demand. “As we have heard, the ChatGPT search may require 30 times more power to process a normal search in what power it needs in Google search engine,” he said. And that’s just inference workload, he noted, adding that training models demand even more power over extended periods.
He cited a projection for the U.S. market: “Power demand for AI data centers is expected to grow about 50 gigawatt each year for the coming years.” In Europe, he added, “every three to four years, this capacity will be doubling for the data center consumption.”
These demands are already reshaping investment decisions. “Investors are looking for AI training data center sites that are physically close to the power supply. If it is grid or renewable energy, power doesn’t matter,” Khan said. Other focus areas include high-performance computing, cooling systems and distributed AI models at edge data centers.
Khan also outlined major infrastructure investments globally: “All four hyperscalers —Microsoft, Google, Amazon and Meta — they have made Capex investments in AI infrastructure, more than $200 billion in 2024 alone.” He cited specific 2025 forecasts: “Meta has announced about $65 billion, and Alphabet or Google, $75 billion and Microsoft, about $80 billion.”
Adding to that, Khan mentioned the Stargate initiative in the U.S., “which is joined by SoftBank, Oracle and OpenAI, which is about a $500 billion project.”
In Europe, the European Union announced InvestAI, a €200 billion investment fund for AI-related projects. The fund will finance what the EU calls four AI gigafactories across the bloc.
Despite the enthusiasm, Khan flagged serious concerns about energy availability. “There is a major risk that some of the North American investors, they may focus on development of AI training data centers in North America or in Nordic rather than Europe, which will not be a good news for Europe.”
Asked whether new investments or a grid overhaul would be sufficient, Khan responded, “We need an overhaul. We need new approaches. We need both.”
He explained that companies are responding with two main strategies: improving energy efficiency and investing in alternative energy sources. “So if I’m sitting on 500 MW, my first focus is how can I use those 500 MW more effectively via monitoring, optimizing, initiating technologies, technology refresh projects, new solutions or even using AI?”
Alternative energy strategies include wind and solar parks, small modular nuclear reactors (SMRs), power purchase agreements (PPAs), and hydro/geothermal projects. But all come with hurdles. “The challenge we have when it comes to wind and solar parks is the energy storage, as wind and sun are not always there.”
Regarding SMRs, Khan noted, “There are about 20 projects in the U.S. and about the same in Europe,” but regulations are slowing progress. “We’re not talking about tomorrow. It’s going to take a while because of the permit process.”
Companies are also repurposing idle infrastructure. “Data center companies are sometimes evaluating [decommissioned] coal plants or production facilities… trying to transform those [abandoned] sites, [abandoned] locations into data center facilities because the power is available.”