This is the year when proofs of concept and tangible benefits are needed to allay consumer, employee, and policymaker concerns about AI infrastructure.
In sum – what to know:
- Physical vs. Digital Reality: While AI seems digital, its infrastructure is a heavy industrial complex that requires immense investment and up-front sacrifice.
- Flip the energy mix: Electricity accounts for 14% of total energy used in the U.S., but that share could climb past 50% within the next decade.
- Information entropy: AI has to move beyond pattern matching toward true innovation and usefulness.
Some of the largest companies in the United States have recently couched massive layoffs in an AI narrative of operational efficiency, productivity gains, and real-time predictive and prescriptive insights that promise to greatly accelerate decision-making, hyper-personalization, and innovation.
The rapid expansion of AI infrastructure is placing considerable strain on power grids, as well as on consumer sentiment: higher electricity prices, depletion of resources, and concerns about air quality, noise, land use, climate change, and massive layoffs are leading consumers to question the pace at which AI infrastructure buildouts are happening.
In a Bloomberg podcast yesterday with Simplify Asset Management chief analyst Mike Green, there was a compelling discussion about the widening gap between AI’s energy demand and the grid’s ability to scale quickly. “Companies are realizing the reality of the physical world versus the digital world that we tend to think about when talking about the cloud,” Green said. “It’s not an imaginary space, but a heavy industrial complex with massive amount of data centers.”
He was frank in saying that Americans, and data center operators, will have to get accustomed to paying more for electricity, no longer sheltered by the massive offshoring of energy-intensive manufacturing that freed up electricity-generation capacity over the past 25 years. “We are rapidly exhausting capacity because the reality is the United States has underinvested in the infrastructure that is required for the projections for AI…we don’t have the physical infrastructure, and the overall grid is incapable of conducting the balancing needed to smooth things as needed.”
What he means there is that AI data centers have to be dynamic and responsive enough to pause workloads during peak demand hours, and advanced enough to maximize existing capacity. There also has to be a shift toward more localized power, with on-site generation sidestepping the constraints of regional and state grids.
As RCR Tech has been reporting, the largest hyperscalers and companies with the resources to do so are moving toward a “bring-your-own-power” model, such as Microsoft’s nuclear focus with the Crane Clean Energy Center, or Meta’s Meta Compute initiative.
As Green pointed out in the podcast, sourcing electricity to feed machines rather than humans is difficult, because humans are “calorically diverse,” whereas machines require electricity. That is the “gating factor,” especially since AI requires a flip-flop in the energy mix: “electricity accounts for 14% of total energy used in the U.S., but that has to go north of 50% in the space of a decade to expand AI,” he said. That flip will take trillions in Capex.
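The scale of that flip is worth making concrete. A back-of-the-envelope sketch, using only the 14% and 50% shares quoted above and the simplifying assumption (mine, not Green's) that total U.S. energy consumption stays roughly flat over the decade:

```python
# Illustrative arithmetic only; the 14% and 50% shares come from the
# article, and flat total energy use is an assumed simplification.
total_energy = 100.0  # normalize total U.S. energy consumption to 100 units

electricity_today = 0.14 * total_energy   # ~14% of total energy today
electricity_future = 0.50 * total_energy  # >50% share in the AI era

growth_factor = electricity_future / electricity_today
print(f"Electricity supply would need to grow ~{growth_factor:.1f}x")
# ~3.6x in roughly a decade
```

Even under that flat-demand assumption, electricity generation would have to more than triple in about ten years, which is the scale behind the "trillions in Capex" figure.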
Is there a way to capture the magic of AI in the next decade, despite the electricity constraints? According to MIT professor Markus J. Buehler’s concept of “information entropy”, AI has to move beyond pattern matching toward true scientific discovery, but that requires a “Goldilocks zone” of information entropy—a state of “stable magic” that sits between rigid order and chaotic randomness. In other words, too little entropy leads to insufficient exploration, and too much can lead to instability and chaos.
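The "Goldilocks zone" framing can be grounded in the standard Shannon definition of entropy. The sketch below is an illustration of that general idea, not code from Buehler's work; the example distributions are hypothetical:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Three four-outcome distributions illustrating the spectrum:
rigid   = [1.0, 0.0, 0.0, 0.0]      # always the same outcome: 0 bits (rigid order)
uniform = [0.25, 0.25, 0.25, 0.25]  # pure randomness: 2 bits (the maximum here)
mixed   = [0.7, 0.1, 0.1, 0.1]      # structured, but leaves room to explore

print(shannon_entropy(rigid))    # 0.0
print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(mixed))    # ~1.36 — between the two extremes
```

In this framing, the "stable magic" zone corresponds to distributions like `mixed`: enough entropy to explore novel outputs, but enough structure to avoid collapsing into noise.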
To feed that entropy, there will have to be a diversity of energy sources, including fossil fuels, wind, solar, and nuclear. Policy around those energy sources, and around permitting and approvals, will have to catch up to the AI era.
Additionally, financial analysts believe Capex funding will have to shift from equity financing to debt financing, as even the biggest hyperscalers cannot raise the trillions of dollars needed for AI infrastructure on their own.
Also important is the coming fight for resources as the AI infrastructure push clashes with the “Made in America” push, in which manufacturing for “widgets” will be competing for the electricity, materials, and labor currently being funneled to AI.
If large swaths of the public don’t see tangible value coming from the data center buildouts, there will be growing reluctance, or even fierce backlash, against higher electricity prices, as well as against impacts on water, land, noise, air quality, and surging carbon footprints.
Consumers will need a compelling proof-of-concept to quiet their concerns about AI infrastructure costs, like electricity, and about the depletion of resources they, too, need, like water.
As RCR Tech reported, the World Economic Forum (WEF) narrative was one of AI experimentation shifting to a focus on tangible value and functional, real-world change. Many AI visionaries and leaders said 2026 would represent a critical “pivot” from potential to measurable ROI. RCR Tech will report on these trends as they unfold.