RCRTech engages industry communities through research-driven content, conversations, and connections. Building on 40+ years of RCR Wireless News excellence, RCRTech delivers trusted insights that inform technology buyers and connect them with the innovators shaping connectivity and compute.



A Bloomberg report noted that the AI data center will be jointly funded by Nvidia and the German operator. In sum – what to know: $1.2 billion joint investment – Nvidia and Deutsche Telekom will co-finance a large-scale AI data center in Germany to accelerate the country’s industrial and digital modernization. SAP to anchor new facility – German software leader …
Digital Realty CEO Andy Power said the company continues to see a robust pipeline of demand from AI-oriented use cases. In sum – what to know: AI drives 50% of bookings – Digital Realty says half its recent deals are tied to AI use cases, underscoring how deeply AI is shaping data-center demand. Urban power constraints persist – Power shortages …
SoftBank announced plans in April to invest $30 billion in OpenAI’s for-profit subsidiary, with $10 billion of that to be syndicated to co-investors. In sum – what to know: SoftBank finalizes $30B OpenAI backing – The board approved a $22.5 billion payment to complete its planned $30 billion investment in OpenAI. Payment tied to corporate restructuring – Full funding depends …
The EIB will work with the European Commission to evaluate project proposals that meet key technical and financial criteria. In sum – what to know: AI gigafactory plan – The agreement sets a framework for financing and advisory support for large-scale AI data centers across Europe. €200 billion InvestAI fund – Includes a €20 billion fund dedicated to building three …
AI compute isn’t one thing. It’s two. Under the umbrella of “AI workloads,” training and inference represent distinct computational worlds with different goals, hardware profiles, and economics. They often get lumped together, but the split matters — especially as it relates to the compute capacities of the data centers that are used for these two different tasks. Understanding the divide …
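To make the split concrete, here is a minimal back-of-the-envelope sketch in Python. It uses the common rule-of-thumb estimates of roughly 6 FLOPs per parameter per token for training (forward plus backward pass) versus roughly 2 for inference (forward pass only); the 70-billion-parameter figure is an illustrative assumption, not tied to any specific model or to the articles above.

```python
# Rough rule-of-thumb FLOP estimates per processed token for a dense
# transformer: ~6 x parameters for training (forward + backward pass),
# ~2 x parameters for inference (forward pass only). Illustrative only.

def flops_per_token(num_params: float, training: bool) -> float:
    """Approximate FLOPs needed to process one token."""
    return (6.0 if training else 2.0) * num_params

num_params = 70e9  # assumed 70B-parameter model (hypothetical example)

print(f"Training:  {flops_per_token(num_params, True):.2e} FLOPs per token")
print(f"Inference: {flops_per_token(num_params, False):.2e} FLOPs per token")
```

Even with this crude estimate, training burns several times more compute per token, and it does so in one sustained, tightly coupled run, while inference spreads its smaller per-token cost across many latency-sensitive requests. That difference is what drives the distinct hardware profiles and data center economics the article describes.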
For decades, compute has scaled faster than memory. Processors can execute more operations every year, but the speed at which data moves in and out of memory has lagged behind. That mismatch, known as the “memory wall,” is now one of the defining constraints in artificial intelligence. AI makes the problem even worse. These days, training and serving large models …
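A simple roofline-style calculation shows why memory bandwidth, rather than raw FLOPS, often sets the ceiling. The peak-compute and bandwidth figures below are assumed placeholder numbers, not any particular accelerator's specifications.

```python
# Minimal roofline-style sketch of the "memory wall": whether a workload is
# limited by compute or by memory bandwidth depends on its arithmetic
# intensity (FLOPs performed per byte moved). All numbers are illustrative.

peak_flops = 1.0e15       # assumed peak compute: 1,000 TFLOPS
mem_bandwidth = 3.0e12    # assumed memory bandwidth: 3 TB/s

# FLOPs the chip could perform per byte it can fetch ("machine balance")
machine_balance = peak_flops / mem_bandwidth

# Example workload: batch-1 LLM decoding, which streams each FP16 weight
# (2 bytes) from memory and performs ~2 FLOPs (multiply + add) with it.
arithmetic_intensity = 2.0 / 2.0   # ~1 FLOP per byte

# Attainable throughput is capped by whichever limit is hit first.
attainable = min(peak_flops, mem_bandwidth * arithmetic_intensity)

print(f"Machine balance:      {machine_balance:.0f} FLOPs/byte")
print(f"Workload intensity:   {arithmetic_intensity:.0f} FLOPs/byte")
print(f"Attainable compute:   {attainable:.2e} FLOPS")
print(f"Share of peak in use: {attainable / peak_flops:.1%}")
```

Under these assumed numbers, the workload can use well under one percent of the chip's peak compute; the rest of the silicon sits idle waiting on memory, which is the memory wall in a nutshell.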
The semiconductor industry is changing quickly, especially as it relates to AI. As AI workloads grow ever more demanding, old monolithic chips are giving way to new chiplet-based designs. But what exactly are chiplets and how will they radically improve performance for AI? Here’s a look. What are chiplets? Chiplets are small, functional blocks of silicon, each optimized for a …
Artificial intelligence has reshaped the semiconductor industry, driving an endless chase for better performance and efficiency. But as transistor scaling slows and Moore’s Law fades, the gains from smaller nodes are running into a wall. Now, packaging is where the real action is. In this new phase, performance breakthroughs aren’t being won by shrinking transistors — but instead by innovating …
Enterprise

