The AI grid – build it and they will come?


It’s Friday, after a long week, and I should resist the urge to put pop references into this unwieldy tale; let’s try and hold it together a while longer. Yesterday, in this blurb, we sketched a telco architecture for the future AI economy, geared for some kind of tiered cloud-to-edge inference proposition. Today’s news flow, as RCR has it, asks a simpler question: is this even a market, or just a build-out? Of course it’s a build-out, and the ideas are speculative. But the ideas are what we like; the bricks and mortar, the cables and ducting – those are just marketing numbers.


There is a bit of that today. Nvidia’s ‘AI grid’ – is it even Nvidia’s concept, or is its voice just the loudest in the room (planet)? – is being filled in piece by piece, from cloud to edge, wireline and wireless, via hyperscale interconnects, global longhaul systems, metro transport networks, regional RAN deployments, and private enterprise setups. The whole telecoms sector is turning on a dime; March 2026, when it all came out, reads like an industry-wide move to converge on a single AI-optimised infrastructure layer. Nvidia says jump, and the telco industry digs in its pockets and asks, how much?


For mobile operators, this feels like a unique inter-generational infrastructure overhaul, which will straddle the 5G and 6G eras, and reach way behind the RAN into their backhaul and longhaul fiber systems. Nokia’s announcements about application-specific optical networking reinforce this: the transport network is being re-engineered for AI traffic patterns, not human ones. Sulagna’s write-up from a strategy briefing at OFC is worth reading. Ciena, now as much of a rival to the Finnish firm as Ericsson is (in the AI networking narrative), is also working on the back-end.


It has just set a speed record with Meta (800 Gbps) across 16,608 kilometers of trans-Pacific submarine cable, heralding massive new capacity to shuttle model outputs between data centers and edge domains. So yes, it is a building project. But the Telefónica stuff shifts the conversation from placement of compute to extraction of value – with automation, exposure, slicing, orchestration as monetization levers. Which aligns with yesterday’s papers (can’t help myself), per the discussion about AT&T and Verizon, and telcos as a “middle layer” in the AI supply chain.


Their role is not just to host inference workloads, but to operationalise them – on programmable telco networks, in programmable telco products. The risk for telcos is that the infrastructure gets built faster than the business models; and the risk for everyone is that the use cases don’t show up. (There is a-whole-other narrative about how clever/stupid AI really is in its present form – whether it is clever enough, at least, to build a new global economy on in the medium term.) Build it and they will come (another pop cliché). Are you in, or are you out? Carpe diem, says Jefferson.


The Siemens story, about private 5G, provides a pragmatic enterprise counterpoint to this hurried telco-centric AI grid narrative. The tech isn’t the hero of this story, or any story; it needs to go to work, and work is difficult, idiosyncratic, self-absorbed, conservative. 5G is an enabler; so too is AI – despite the clamour. Use cases count, and physical AI is already alive and kicking. Another thing: on-prem control (and on-prem LLMs) might just knock the tokenomics off its stride. Siemens had an MWC demo with just such a setup; the on-prem directive is clear in the industrial economy.


Security, determinism, reliability; all of that. Which is a different value prop, versus programmable latency tiers in the macro RAN. While telcos are busy building infrastructure, serious industrial adopters want self-contained systems, on their terms – to keep data close and secure, and cheap (versus public toll roads). Of course, most large-scale AI workloads – and parts of all workloads – will rely on telco edge nodes, metro orchestration, longhaul fiber optics (maybe even AI-RAN) for training, scale, collaboration. But it is a reality check – if only about how varied the work is.


James Blackman
Executive Editor
RCR Wireless News

RCR Top Stories

Telefónica eyes AI $$$: Telefónica is pushing transport upgrades to boost AI monetization, targeting autonomous networks by 2030. It is adopting open interfaces and coherent optics for new efficiency, services, and revenue.

Nvidia AI telco grids: Nvidia is working with telcos on distributed AI grid infrastructure to embed high-performance compute directly into regional hubs and switching facilities – to move inference closer to the end-user.

Nokia’s optical pitch: Post its Infinera integration, the Finnish vendor’s target is the data center interconnect space, which it hopes to disrupt with a brand-new line of optical assets for short-reach campus interconnectivity and subsea applications.

AI, minus the theatre: The tech is not the story, stupid; it is a part of a solution, says Siemens. Enterprises have their own issues, and don’t buy the hype anyway; but sometimes 5G helps – and they call their crane supplier (etc), not their telco. 

Ciena sets 800G record: Ciena and Meta have recorded 800 Gb/s over 16,000km on a transpacific cable system, highlighting advances in long-distance optical performance as demand for AI and cloud connectivity increases globally.

In partnership with

AI-Powered Telecom Infrastructure
Supermicro, in collaboration with NVIDIA, delivers AI-powered infrastructure tailored for telcos, enhancing operational efficiency, network management, and customer experiences. Explore now 

Beyond the Headlines

NAM DC trends: The North American data center market is experiencing unprecedented demand, with record-low vacancies (1.4% in primary markets) and rising rental rates. Check out this video interview with CBRE.

Vantage in APAC: Vantage Data Centers explains how AI workloads are increasing rack density, accelerating liquid cooling adoption, and driving larger infrastructure deployments across key markets in the Asia-Pacific region.

Token economics: The message from Nvidia chief Jensen Huang at GTC is that AI is no longer about models or chips, but about monetizing inference at scale – where tokens are the core unit of value and data centers are revenue factories.

Telco AI infra: AT&T, Cisco and Nvidia are working to integrate AI inference into telecom networks, enabling enterprises to process data at the edge and support real-time applications across industrial and connected device environments.

Nvidia in orbit: All about Nvidia’s new Space-1 module, its proposal for an orbital data center for LLMs: with 25 times the compute of the H100, Nvidia is looking to eliminate the latency and expense of downlinking raw datasets to ground stations.

What We're Reading

End of Metaverse: Meta is scaling back Horizon Worlds, shifting focus from VR to AI, with Mark Zuckerberg effectively abandoning the metaverse vision as the company pivots toward AI following heavy losses and limited user engagement.

Busy-mom AI: Red Hat’s AI supremo Fatiha Nar has a good post about a ‘busy mom syndrome’ for shared AI infra, where cloud models suffer performance degradation during peak demand, especially for complex workloads.

GVP in 6 GHz: Federated Wireless will enable geofenced variable power (GVP) devices in the 6 GHz band following FCC approval, supporting higher-power indoor and outdoor use while preventing interference, unlocking new Wi-Fi and IoT cases.

Inference tests: Keysight has launched an AI inference emulation platform to replicate real-world workloads and validate AI infrastructure – helping enterprises test performance, identify bottlenecks, and scale data center deployments.

Bots on the Tyne: The Port of Tyne has trialled autonomous yard robots using sensors and LiDAR to improve container handling and show how AI automation can enhance efficiency, safety, and integration within complex port logistics.

Events

Virtual Program
Explore the technologies, tools, strategies and partnerships powering the next generation of intelligent systems. This is where the backbone of AI innovation takes center stage. Register now 
 
Wi-Fi Forum, January 20th 2026
Join this RCR Wireless News event to understand the current state of Wi-Fi as we examine the myriad evolving use cases and monetization strategies being deployed by industry. Register now

Industry Resources

Webinar, September 18th
The journey to a fully autonomous network – The evolution of network automation and how Amdocs is leading the way

What you need to know in 5 minutes

Join 37,000+ professionals receiving the AI Infrastructure Daily Newsletter

