According to CRN, AMD reported record third-quarter revenue of $9.2 billion, a 36% year-over-year increase that beat Wall Street expectations by $500 million. CEO Lisa Su highlighted “sharp” sales jumps for both Instinct data center GPUs and EPYC server processors, with the data center segment hitting $4.3 billion. The company also revealed deepening ties with OpenAI through a multi-year agreement to deploy six gigawatts of Instinct GPUs, with the first deployments slated for the second half of 2026. AMD’s combined client and gaming segment surged 73% to $4 billion, with gaming revenue alone jumping 181% to $1.3 billion. Despite the strong results, AMD’s stock dropped more than 3.5% in after-hours trading, possibly because fourth-quarter guidance of $9.6 billion came in below some analysts’ expectations.
The AI Gold Rush Is Real
Here’s the thing: AMD is finally cashing in on the AI boom that’s been dominated by Nvidia. Their Instinct GPU sales are accelerating dramatically, with multiple large cloud providers like Oracle and several smaller players deploying the MI350 series. But the real game-changer is that OpenAI partnership. Six gigawatts of GPU deployment starting next year? That’s massive. And Su isn’t shy about the potential – she’s talking about “over $100 billion in revenue over the next few years” from this deal alone.
What’s interesting is how quickly AMD is moving up the AI food chain. They’ve got MI350 deployments happening now, MI400 development “progressing rapidly,” and that Helios rack-scale platform coming next year. They’re not just selling chips anymore – they’re selling complete AI infrastructure solutions. And with companies like IBM, Cohere, and Character AI already using MI300 series for training and inference, AMD is building serious credibility in the AI space.
Server CPU Domination Continues
While everyone’s focused on AI GPUs, AMD’s server CPU business is quietly crushing it. EPYC processor revenue hit an all-time high, with fifth-gen chips making up nearly half of all EPYC sales. But here’s what really stands out: more than 160 new EPYC-powered cloud instances launched this quarter alone. That brings the total to over 1,350 public cloud instances – nearly 50% higher than a year ago.
Enterprise adoption is exploding too. Large businesses increased their EPYC cloud usage more than threefold year over year, and AMD is signing “large” deals with Fortune 500 companies across telecom, finance, retail, and automotive. Basically, AMD is eating Intel’s lunch while everyone’s watching the AI drama. The next-gen “Venice” processors, built on a 2nm process, are already gaining traction with cloud partners, setting up another strong product cycle for 2026.
PC and Gaming Are Back
Remember when people said the PC market was dead? AMD’s combined client and gaming segment just grew 73% to $4 billion. Desktop CPU sales hit record highs, driven by strong demand for Ryzen 9000 processors. But the commercial segment is where things get really interesting – Ryzen PC sales to businesses grew more than 30% year over year.
Gaming revenue nearly tripled to $1.3 billion, thanks to semi-custom chips for Microsoft and Sony consoles and strong Radeon GPU demand. What’s telling is that OEM laptop sales are surging, indicating real end-customer demand rather than just channel fill. AMD’s restructured partner program, with 40% more funding, is clearly paying off on the commercial side.
The Big Question
So why did the stock drop on such strong numbers? Wall Street might be worried about whether AMD can actually deliver on that $100 billion AI revenue promise. Or maybe they’re skeptical about the embedded segment’s 8% decline. But honestly, when your data center, client, and gaming businesses are firing on all cylinders, who cares about a slight embedded dip?
The real story here is that AMD has transformed from a plucky underdog to a serious AI contender practically overnight. With OpenAI in their corner and server CPU dominance continuing, they’re positioned to capture the next wave of AI infrastructure spending. The question isn’t whether AMD will grow – it’s whether they can grow fast enough to satisfy increasingly hungry investors.
