China’s AI surge – infrastructure, chips, and an urgent game of catch-up

With electricity capacity set to rise 30% this year and $70 billion in planned investments for 2026, China is building AI infrastructure at scale. Enterprise AI services and domestic chip initiatives are driving the next wave of growth, positioning China as a major global AI infrastructure hub – even as it plays catch-up with the US.

In sum – what to know:

DeepSeek moment – China’s cloud and internet sectors have been “racing” to develop AI since January, when the LLM-powered DeepSeek chatbot went viral.

Infrastructure build – top Chinese internet firms will invest $70 billion next year; electricity capacity for Chinese data centres will jump roughly 30% in 2025.

Domestic focus – hyperscalers like Alibaba, Tencent, and Baidu are shifting toward domestic AI chips, supported by state mandates and energy subsidies. 

The Chinese economy at large woke up to the AI dream a little late, perhaps. Goldman Sachs says the country’s cloud and internet industries have been properly “racing” to develop AI only since January, when DeepSeek rocked the world with a cut-rate LLM-powered chatbot that rapidly climbed the app charts and hobbled US tech stocks – notably AI chip-maker Nvidia, which lost 17% of its value (about $593 billion) in a single trading day. It was a “Sputnik moment” in the AI race, a signal that the centre of gravity might be shifting. China has certainly taken it to heart, and is not looking back.

Power demand from China’s data centres will increase 25% this year (2025), suggests Goldman Sachs. Electricity capacity will jump roughly 30% – from 23 GW to about 30 GW, enough to power around 30 million households – it reckons. Investments by China’s top internet firms will exceed $70 billion in 2026, it says. The point is that the figures show China treating the AI build-out as a large-scale infrastructural push, not simply incremental growth. Figures from analyst firms say the same: the Chinese market is already large and expanding quickly.

A market study by Mordor Intelligence says the Chinese data-centre market will be worth $29.23 billion by the end of 2025, and $56.71 billion by 2030 – a compound annual growth rate (CAGR) of 14.2% over the period. A separate study of AI-optimised data centres by Grand View Research projects that the AI data-centre market in China will grow from $1.08 billion in 2024 to $5.9 billion by 2030 – a CAGR of about 33%. Which sounds like a lot, but is really a drop in the ocean: McKinsey & Company reckons some $6.7 trillion in data-centre capex will be required globally by 2030 to support compute demand, with a significant share focused on AI-capable facilities.
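For readers who want to check those growth rates, a quick back-of-envelope calculation – a sketch using only the start and end values quoted above, not anything taken from either report – reproduces them closely:

def cagr(start_value, end_value, years):
    # Compound annual growth rate between two values over a number of years
    return (end_value / start_value) ** (1 / years) - 1

# Mordor Intelligence: $29.23bn (2025) to $56.71bn (2030)
print(f"China data-centre market CAGR: {cagr(29.23, 56.71, 5):.1%}")   # ~14.2%

# Grand View Research: $1.08bn (2024) to $5.9bn (2030)
print(f"China AI data-centre market CAGR: {cagr(1.08, 5.9, 6):.1%}")   # ~32.7%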

And China has some catching up to do. Statista counts 331 data centres in China – an economy of $18.74 trillion, a country of 1.4 billion people, and a landmass of about 9.6 million square kilometres. The US ($29.18 trillion, 342.8 million people) has 4,165 data centres; even the UK and Germany ($3.96 trillion and $5.01 trillion, respectively), far smaller and denser countries, have more (499 and 487 data centres). Goldman Sachs puts China’s 2026 AI investment ($70 billion) at only about 15-20% of what US hyperscalers plan to spend in 2026.

So China’s AI ambitions may be escalating fast, but they are being built on a relatively sparse physical infrastructure base. Which also explains the urgency behind its rapid data-centre expansion and massive capital expenditure plans. And beyond IT racks and GPUs, China is spending big on the “non-IT” componentry of AI infrastructure – up to RMB 800 billion ($112 billion) by 2030 on power generation and transmission, cooling systems, metals, and construction. Because “without electricity, there is no AI”, as one study observes.

China is also layering in self-reliance and domestic supply-chain ambitions. Goldman Sachs says Chinese hyperscalers like Alibaba, Tencent, and Baidu have traditionally spent 50-75% of their capex on foreign chips. “This ratio is now shifting in favour of domestic producers,” it writes. New state guidance mandates that new state-funded data centres should “only use domestically-made AI chips”, as reported by Reuters and others. Ongoing projects that are less than 30% complete are obliged either to remove their foreign chips or to down tools and quit.

There are related state interventions around power supplies. The Financial Times reports that local governments in regions with dense clusters of data centres – Gansu Province, Guizhou Province, and the Inner Mongolia Autonomous Region – have introduced subsidies that cut electricity fees by up to 50% for data centres using domestic semiconductors, a measure designed, most notably, to further reduce reliance on Nvidia.

The measure is, in part, a response to the fact that Chinese AI chips are more power-hungry, and tech companies have complained about surging energy bills. It takes 30-50% more power to generate the same number of tokens with Chinese chips than with Nvidia’s H20, the chip designed for the Chinese market, the FT writes.

The country’s AI build-out is infrastructure-heavy and supply-chain-aware, then. Historically, it is also domestically oriented. The Goldman Sachs piece says Chinese cloud/AI providers are still mostly inward-facing, serving Chinese enterprises at home; as much as 90-95% of their revenues come from domestic business, it writes. And while the country’s AI chip policy is increasingly China-focused, Goldman Sachs adds, its cloud and internet firms are looking beyond their domestic bases, targeting foreign markets for a new wave of growth.

They are building data centres across Asia, the Middle East, and Latin America, it says, with a new focus on monetisation models built around AI post-training (enterprises customising LLMs with their own proprietary data) and AI inference (applying trained models to day-to-day tasks). These become the basis of recurring-revenue (subscription) models around AI applications for video generation, picture editing, object identification, and so on. Monetisable use-cases are emerging, but Goldman Sachs emphasises that “the number of paid users [for consumer chatbots] is immaterial”.

In other words, China’s AI chatbots might be viral and significant, like DeepSeek, but their financial draw in the consumer realm will not amount to a hill of beans; the real revenue comes from selling AI services to businesses. Which explains the AI infrastructure model as it stands: infrastructure build upstream, industrial monetisation downstream, with consumer apps not the immediate profit driver.

But while the AI momentum in China is evident, there are risks – its total infrastructure spend (ramped up, but still lagging the US), hard regulatory constraints (and the limitations of home-grown chips), plus potential regulatory scrutiny around energy demands. And then there is the ROI challenge, as above, which is the same for everyone; Goldman Sachs says China’s monetisation is growing at a “high growth rate”, but also that it is only in the “early days” of its “revenue story”.

On the other hand, while China still lags the US in terms of infrastructure footprint, investment roadmap, and chip ecosystem, its broad ambitions and central policies – plus its scale and track record – make clear that the Chinese market will develop into a major global AI infrastructure node.
