The potential agreement would mark a shift in how Anthropic sources specialized silicon for its models
In sum – what we know:
- Diversifying the silicon supply – Anthropic is in early-stage talks with UK-based Fractile, a notable shift toward sourcing chips from a startup rather than the industry’s largest vendors.
- Addressing the memory wall – Fractile’s architectural approach co-locates memory and compute on the same die using SRAM, aiming to eliminate the back-and-forth bottlenecks that plague traditional GPU inference.
- Economic necessity – With Anthropic’s annualized revenue soaring, reducing inference costs has become a financial imperative, making specialized, high-performance silicon a critical procurement priority to protect margins.
Anthropic is in early-stage talks to buy inference accelerators from Fractile, a London-based chip startup, according to reporting from The Information. On the surface it’s a small deal, but it signals a potentially larger change. Anthropic has, until now, sourced its silicon almost exclusively from the biggest players in the industry. A deal with Fractile would mark a noticeable shift in that procurement strategy.
If the talks progress, Fractile would become Anthropic’s fourth source of AI server silicon, joining Nvidia, Google, and Amazon. The discussions are reportedly still early, and The Information notes they could fall through entirely. Still, the fact that they’re happening at all says something about where Anthropic thinks the inference market is heading, and why a British startup with a fundamentally different chip architecture is suddenly on the radar of one of the most compute-hungry companies in AI.
Anthropic’s current chip supply strategy
Rather than building massive proprietary data centers, Anthropic has opted for a multi-vendor rental model, spreading its workloads across the industry’s largest chip suppliers. Claude runs on Nvidia GPUs. It also runs on Amazon’s Trainium processors through Project Rainier. Anthropic rents additional Nvidia capacity via Microsoft Azure. And recently, it expanded its Google partnership to use Google’s TPUs.
The logic is fairly straightforward — diversification mitigates vendor lock-in and gives Anthropic negotiating leverage across its suppliers. When you’re buying compute at this scale, even modest pricing advantages compound fast.
What makes Fractile interesting is that every one of Anthropic’s existing silicon relationships is with a hyperscaler or mega-cap chip vendor. Fractile is neither.
Who is Fractile?
Fractile was founded in 2022 by Walter Goodwin, an Oxford PhD, and is based in London. Its team is stacked with hardware engineers pulled from Graphcore, Nvidia, and Imagination Technologies — a pedigree that lends credibility to what is otherwise a very early-stage company. Alongside the hardware, Fractile is building its own software stack, which is more or less table stakes for any inference silicon startup hoping to be taken seriously.
On the funding side, Fractile raised $15 million in seed capital co-led by Kindred Capital, the NATO Innovation Fund, and Oxford Science Enterprises. It’s now reportedly in talks to raise another $200 million at a valuation north of $1 billion, with Founders Fund, 8VC, and Accel among the potential investors. Anthropic’s interest, by multiple accounts, has become a key selling point in that round.
Fractile’s technological approach
Fractile is designing an inference chip that co-locates memory and compute on the same die using SRAM, rather than relying on separate DRAM chips. Essentially, it puts the data right next to the compute units that operate on it, eliminating the constant back-and-forth between compute and off-chip memory that bottlenecks GPU inference today.
This is the memory wall problem, attacked at the architectural level rather than through more bandwidth or fancier packaging. And if it works, the performance implications could be significant. Goodwin claimed in July 2024 that Fractile’s chips could run large language models 100 times faster and 10 times cheaper than Nvidia GPUs.
Those are enormous numbers, but to be clear, the claims were based on simulations. Fractile had not manufactured test chips as of mid-2024, and simulation-to-silicon is a big transition. That said, the approach isn’t completely new — Groq and Cerebras have pursued related SRAM-based and near-memory designs, and the industry is clearly moving toward specialized inference architectures that look nothing like traditional GPUs.
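To see why on-die SRAM changes the arithmetic, consider a back-of-the-envelope roofline calculation for the decode phase of LLM inference, where every generated token must stream all model weights through the compute units. The bandwidth and model-size figures below are illustrative assumptions for the sake of the arithmetic, not Fractile's (or Nvidia's) published specs:

```python
# Rough roofline sketch: per-token decode latency when weight streaming
# dominates. In autoregressive decoding, each token requires reading all
# model weights, so latency is typically memory-bandwidth-bound.

def decode_latency_ms(param_bytes: float, mem_bw_gbs: float) -> float:
    """Lower bound on per-token latency from weight streaming alone."""
    return param_bytes / (mem_bw_gbs * 1e9) * 1e3

# Illustrative assumptions (not vendor specs):
params_70b_fp8 = 70e9 * 1   # a 70B-parameter model at 1 byte/weight (FP8)

hbm_bw  = 3_350    # GB/s — roughly HBM-class off-chip bandwidth
sram_bw = 100_000  # GB/s — hypothetical aggregate on-die SRAM bandwidth

print(decode_latency_ms(params_70b_fp8, hbm_bw))   # ~20.9 ms/token
print(decode_latency_ms(params_70b_fp8, sram_bw))  # ~0.7 ms/token
```

The point of the sketch is only that when the bandwidth term in the denominator grows by an order of magnitude or two, the per-token floor drops with it, which is the mechanism behind near-memory inference claims generally.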
Driving the deal
The context behind Anthropic’s interest is hard to separate from the company’s financials. Anthropic’s annualized revenue run rate crossed $30 billion in March 2026, up from roughly $9 billion at the end of 2025. That’s the kind of growth that sounds great in a headline and looks considerably more complicated on a gross margin line.
Inference costs are now a material drag on Anthropic’s margins, and with demand still surging, the pressure to reduce cost-per-token isn’t a nice-to-have — it’s an economic necessity. At current scale, even a fractional improvement on inference economics translates into hundreds of millions of dollars.
The broader market sees the same thing. Nvidia’s $20 billion agreement with Groq in December 2025, followed by the launch of the Groq 3 LPX accelerator, was a clear signal that specialized inference silicon is now a first-class category.
Not an overnight change
None of this is imminent. Fractile’s chips aren’t expected to be commercially ready until around 2027, which lines up roughly with Anthropic’s expanded Google-Broadcom TPU timeline but places any deployment well outside the company’s near-term procurement needs. Volumes and financial terms haven’t been disclosed, and likely haven’t been negotiated.
The risks are obvious. Fractile’s technology remains unproven in commercial production, the leap from simulation to working silicon is nontrivial, and early-stage chip programs slip all the time. But the fact that Anthropic’s interest is already being leveraged as a selling point in Fractile’s $200 million fundraise suggests this isn’t a casual conversation.