AMD CEO Lisa Su said the global market for data center chips could reach about $1 trillion by 2030, reflecting how artificial intelligence is redefining the chip industry.

Speaking at AMD’s Analyst Day in New York, Su told investors that data center revenue is set to grow faster than any other part of the company’s business, powered by rising demand for AI computing, networking, and storage systems.

Su called the data center business AMD’s “biggest growth opportunity,” predicting that AI acceleration, cloud computing, and high-performance enterprise applications will collectively drive unprecedented chip demand.

AMD estimates its data center revenue could reach $100 billion within five years, supported by roughly 60 percent compound annual growth in that segment.
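For readers who want to sanity-check the math, a quick compound-growth calculation shows the figures are roughly consistent. The baseline revenue below (about $10 billion) is an illustrative assumption, not a figure from the article:

```python
# Sanity check: does ~60% compound annual growth over five years
# plausibly reach ~$100B? Baseline of $10B is an assumed figure.
baseline_revenue_b = 10.0   # assumed starting data center revenue, $B
cagr = 0.60                 # ~60% compound annual growth (per the article)
years = 5

projected = baseline_revenue_b * (1 + cagr) ** years
print(f"Projected revenue after {years} years: ${projected:.1f}B")
```

Under those assumptions, the projection lands just above $100 billion, so the stated growth rate and the five-year target line up.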

The company’s optimism rests on new AI-focused products such as its Instinct MI400 chips, set for release in 2026, and ongoing partnerships with hyperscale cloud providers deploying large-scale training and inference systems.

AMD has also been acquiring smaller software and infrastructure firms to strengthen its end-to-end AI capabilities.

Positioning Against Nvidia In The AI Race

While Nvidia continues to dominate the AI training-chip market, AMD is carving out space in both training and inference workloads. Su’s trillion-dollar forecast suggests AMD sees structural expansion across the industry rather than a zero-sum game.

With demand for compute doubling nearly every six months across major AI labs, AMD expects multiple winners in a diversified supply landscape.

At the same time, the company is focused on efficiency and scalability, seeking to differentiate on performance-per-watt metrics and integration flexibility for enterprises.

Analysts at TrendForce noted that AMD’s growing partnerships with Meta, Microsoft, and Oracle could provide key competitive leverage as enterprises diversify beyond Nvidia’s CUDA ecosystem.

Broader Industry Implications

The trillion-dollar forecast highlights how AI is driving a new wave of industrial infrastructure comparable to the early internet boom. Data centers are evolving from static compute hubs into AI-native factories, demanding new chip architectures, high-bandwidth memory, and optimized cooling systems.

For AMD, the challenge lies in execution: meeting production targets, securing supply chain stability, and maintaining profit margins while scaling output.

For the broader tech ecosystem, the estimate reinforces that AI hardware will remain one of the defining growth engines of the decade.

Lisa Su’s trillion-dollar projection is ambitious but not implausible. If AI adoption continues at its current pace, data centers will sit at the heart of the global economy, and companies like AMD will be competing to build the silicon that powers them.