One stop at a time
Author’s note: I’m one of those AuDHD people you encounter in tech, media and tech media. CES and Las Vegas are exhausting, but I wanted to get this written while still having some fun. So, I landed on a format where I write a section, have a drink, take the monorail one stop, then repeat. It neatly combines my fondness for writing, structure, novelty, public transit, and drinking. The journey is tracked in the datelines that open each section. The format led me to some decidedly adequate bars I’d usually never visit, but the format is the format. I hope you enjoy it as much as I did.
MGM Grand, Tap Sports Bar — I’ve been making the annual pilgrimage to the Consumer Electronics Show since around 2013 (I’ve honestly lost count), which means I can track its evolution less by press releases and more by memory. The most vivid contrast is between my first CES and this most recent one.
That first year, I was green and overwhelmed. I remember wandering into a hall filled with row after row of cars outfitted with high-end stereo systems — impressive, loud, and unmistakably consumer-facing. This year, my CES was far more curated and far more abstract, centered almost entirely on the semiconductors that quietly underpin everything from cars and PCs to mobile devices, and now, rack-scale AI systems.
What struck me most wasn’t just that my schedule skewed heavily toward silicon, but that the tone of the show itself has shifted in a productive way. CES is no longer primarily about the things consumers buy; it’s about the systems that make those things possible. AI is the connective tissue. Chips are the physical substrate. And whether the touchpoint is a data center, a robot, or a laptop, every consumer AI experience ultimately traces back to silicon — an NVIDIA Vera Rubin cluster training trillion-parameter models, an Intel Core Ultra Series 3 powering a high-end AI PC, or a Qualcomm Dragonwing IQ10 Series pushing intelligence to an autonomous mobile robot at the far edge.
But if CES isn’t just about consumer devices, why should consumers, or my industry readership for that matter, care about the show? The short answer is that CES hasn’t stopped being a consumer show; it’s moved down the stack and recognized that consumer experience is essentially a property of the infrastructure beneath it. It’s a shift down the layers, from endpoints to the platforms that make those endpoints work. And the focus on AI isn’t the result of fever-pitched hype; AI is front and center because, like the enabling silicon, it cuts across every category.
NVIDIA — new AI system, new economics
Horseshoe, Cabinet of Curiosities — NVIDIA CEO Jensen Huang delivered his Monday keynote at the Fontainebleau, largely playing the hits: AI scaling laws, manufacturing intelligence, agentic AI, physical AI. The marquee announcement was the new Vera Rubin platform, named after the astronomer whose work reshaped our understanding of how galaxies rotate, which is a fitting reference point for a company intent on redefining the gravitational center of AI computing.
Compared to the current-generation Blackwell platform, Rubin delivers a 10x reduction in inference token cost and a 4x reduction in the number of GPUs required to train mixture-of-experts models. The platform reflects NVIDIA’s full-stack approach to AI infrastructure, spanning a Vera CPU, Rubin GPU, NVLink 6 switch, ConnectX-9 SuperNIC, BlueField-4 DPU and Spectrum-6 Ethernet switch.
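To make those multipliers a little more tangible, here’s a quick back-of-envelope sketch. The baseline dollar figure and GPU count below are round numbers chosen purely for illustration; only the 10x and 4x ratios come from NVIDIA’s announcement.

```python
# Back-of-envelope sketch of NVIDIA's claimed Rubin-vs-Blackwell ratios.
# The baseline figures are hypothetical round numbers for illustration only;
# the 10x (inference token cost) and 4x (MoE training GPU count) multipliers
# are the ones NVIDIA cited.

blackwell_cost_per_m_tokens = 2.00   # hypothetical dollars per million inference tokens
blackwell_training_gpus = 100_000    # hypothetical GPU count for an MoE training run

token_cost_reduction = 10            # claimed: 10x cheaper inference tokens
training_gpu_reduction = 4           # claimed: 4x fewer GPUs for the same training job

rubin_cost_per_m_tokens = blackwell_cost_per_m_tokens / token_cost_reduction
rubin_training_gpus = blackwell_training_gpus // training_gpu_reduction

print(f"Inference: ${blackwell_cost_per_m_tokens:.2f} -> ${rubin_cost_per_m_tokens:.2f} per million tokens")
print(f"Training:  {blackwell_training_gpus:,} -> {rubin_training_gpus:,} GPUs for the same MoE run")
```

The absolute figures are mine, not NVIDIA’s; the point is the direction of travel. Cheaper tokens and denser training runs are what “new economics” means in practice.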
“Rubin arrives at exactly the right moment, as AI computing demand for both training and inference is going through the roof,” Huang said. “With our annual cadence of delivering a new generation of AI supercomputers, and extreme co-design across six new chips, Rubin takes a giant leap toward the next frontier of AI.”
What matters here is the role NVIDIA now plays at CES. This is not consumer electronics in the traditional sense, but it is consumer reality-setting. By defining the upper bounds of AI capability and economics, NVIDIA establishes the conditions under which nearly every downstream AI experience exists. CES may no longer revolve around finished products on shelves, but in Huang’s keynote, it was unmistakably about the infrastructure that determines what consumers will eventually experience.
Intel — Core Ultra Series 3, made in America
The Flamingo, Purple Zebra Daiquiri Bar — At CES, Intel launched its Core Ultra Series 3 processors, built on the company’s 18A node at a fabrication facility in Arizona. As an aside, the dual emphasis on Core Ultra Series 3 being built on a leading-edge node and being manufactured in America speaks to this larger CES narrative of moving from consumer devices to the platforms beneath them — in this case, manufacturing itself, with geopolitical and economic implications layered in for good measure.
Intel Core Ultra Series 3 is being used in more than 200 PC designs and is also present in embedded and industrial devices. The lineup includes Core Ultra X9 and X7 processors with Intel Arc graphics; the higher-end SKUs offer up to 16 CPU cores, 12 Xe-cores and 50 TOPS of AI compute from the NPU.
“With Series 3, we are laser focused on improving power efficiency, adding more CPU performance, a bigger GPU in a class of its own, more AI compute and app compatibility you can count on with x86,” Intel SVP and GM of the Client Computing Group Jim Johnson said in a statement.
The big idea here is that reinventing the endpoint, in this case the AI PC, requires reinvention at the platform level. The SoC launch is nominally about new PCs, but it’s really about running inference locally with predictable performance, power efficiency and software compatibility across an established ecosystem. In that sense, Intel’s story fits squarely within CES’s evolving focus: consumers may still buy devices, but their experience is increasingly shaped by the compute architecture beneath the keyboard, not the industrial design above it.
Qualcomm — AI without the cloud
Park MGM, Bavette’s (thank God, somewhere good) — In an effort to capture revenue from new markets while leveraging its core expertise in mobile, Qualcomm is in the midst of a multi-year diversification strategy that has taken it from mobile to PCs, automotive, wearables, extended reality, data center and, now, robotics. At CES, Qualcomm announced new SoC platforms for general-purpose robotics that draw on its heritage in connectivity and high-performance, low-power compute.
The new Qualcomm Dragonwing IQ10 Series is a premium-tier robotics processor purpose-built for humanoid robots and autonomous mobile robots. The architecture supports advanced perception, motion planning driven by vision-language-action and vision-language models, general manipulation and human-robot interaction. Target verticals are industrial, logistics and retail, and out of the gate Qualcomm has partnered with Advantech, APULUX, AutoCore, Booster, Figure, KUKA Robotics, Robotec.ai and VinMotion.
For several years, Qualcomm has discussed AI in terms of a shift from training to inference and the importance of the valuable contextual data generated at the edge. EVP and GM for Automotive, Industrial and Embedded IoT and Robotics Nakul Duggal said in a statement: “As pioneers in energy efficient, high-performance physical AI systems, we know what it takes to make even the most complex robotics systems perform reliably, safely and at scale. By building on our strong foundational low-latency safety-grade high performance technologies…we’re redefining what’s possible with physical AI by moving intelligent machines out of the labs and into real-world environments.”
Qualcomm’s robotics push completes the systems narrative that CES now reflects — the company is focused on what happens when intelligence no longer lives in a data center or behind a screen. This is AI that operates under real-world constraints and delivers value precisely because it can function without constant cloud dependence. In that sense, Qualcomm’s far-edge strategy reinforces the broader point: modern consumer experiences increasingly emerge from invisible infrastructure decisions, long before a human ever touches a finished product.
AMD — betting on black, red and white
As much as I’d like my coverage and narrative framing to be neat and tidy, they’re not; nor is AMD’s presence at CES. AMD’s is a story of convergence — its CPUs, GPUs and accelerators are aimed at the same workloads NVIDIA, Intel and Qualcomm are chasing: training economics, AI PCs, inference and everything in between. The overlap exemplifies the competitive pressure all of these companies are trying to navigate.
AMD’s tagline for CES is “AI everywhere, for everyone.” To that end, the company announced its Helios rack-scale platform “for yotta-scale AI infrastructure,” built on AMD Instinct MI455X GPUs and AMD EPYC Venice CPUs. It also introduced its Ryzen AI platforms for AI PCs and embedded applications alongside the Ryzen AI Halo developer platform. Friendly metric system reminder: tera is a trillion (10^12), exa is a quintillion (10^18) and yotta is a septillion (10^24), so each of those is a million times the one before it.
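And since “yotta-scale” is doing a lot of work in that Helios pitch, here’s a quick sketch of the prefix ladder; the one-exaFLOP system at the end is a round illustrative figure, not a reported spec.

```python
# SI prefix ladder relevant to "yotta-scale": each adjacent named step is 1,000x
# the previous one, so skipping from exa straight to yotta is a factor of a million.
prefixes = {"tera": 10**12, "peta": 10**15, "exa": 10**18, "zetta": 10**21, "yotta": 10**24}

print(prefixes["yotta"] // prefixes["exa"])    # 1,000,000: exa-scale to yotta-scale
print(prefixes["yotta"] // prefixes["tera"])   # 1,000,000,000,000: tera-scale to yotta-scale

# Hypothetical illustration: how many 1-exaFLOP systems it would take to reach one yottaFLOP.
one_exaflop_system = 10**18
print(prefixes["yotta"] // one_exaflop_system)  # 1,000,000 such systems
```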
“At CES, our partners joined us to show what’s possible when the industry comes together to bring AI everywhere, for everyone,” AMD CEO and Chair Lisa Su said in a statement. “As AI adoption accelerates, we are entering the era of yotta-scale computing, driven by unprecedented growth in both training and inference. AMD is building the compute foundation for this next phase of AI through end-to-end technology leadership, open platforms, and deep co-innovation with partners across the ecosystem.”
The point is that platforms don’t fit neatly one on top of the other; they’re porous, and they overlap. AMD’s strategy reflects a market no longer organized around clean architectural diagrams, but around convergence and competition across training, inference and deployment — it’s a good reminder that where AI ultimately runs, and how accessible it becomes to consumers, will be decided by cost, power efficiency and the relentless pressure of overlapping bets.
CES still matters for consumers
Qualcomm CEO Cristiano Amon put it succinctly in an interview with Bloomberg during the show as he discussed the company’s automotive, data center, mobile, PC, robotics and wearables strategy. “Everybody’s playing to win. Everybody is building capacity to win. Will everybody win? Probably not…In the short-term, you could argue maybe there’s over-investment. It’s a natural consequence of everybody playing to win. But in the long-run, I still believe AI is under-hyped…We’re doing things at the edge — the stuff that humans are buying.”
To wrap this up, CES didn’t become less consumer-focused this year. It became more honest about what consumer technology actually is.
Author’s note: I didn’t really make it that far on the monorail before turning around. But that feels appropriate. I saw parts of Vegas I hadn’t seen before, despite having come here a dozen times over more than a decade. It’s a city that doesn’t reveal itself all at once, and rarely in neat layers. Turns out that’s a decent way to think about the silicon ecosystem now shaping AI.