Amazon is intensifying its AI infrastructure strategy, with AWS CEO Matt Garman and Amazon CEO Andy Jassy outlining a dual approach that combines major model partnerships with aggressive in-house chip development. Amazon’s reported $50 billion investment in OpenAI, alongside its $8 billion backing of Anthropic, reflects what Garman described as a longstanding AWS strategy of simultaneously collaborating with and competing against partners.
Garman indicated that AWS has built experience managing such dynamics, positioning itself as a neutral platform while offering competing first-party technologies. This approach is increasingly central as cloud providers enable model routing across multiple AI systems, letting customers optimize for performance and cost while AWS integrates its own models into the ecosystem.
Jassy reinforced this strategy by highlighting surging demand for Amazon’s custom AI chips, particularly Trainium, with capacity for upcoming generations already largely committed. He positioned AWS silicon as a price-performance alternative in a market historically dominated by Nvidia, while also noting widespread adoption of Graviton processors across enterprise customers.
The company is supporting this expansion with a planned $200 billion capital expenditure in 2026, primarily focused on AI data centers, as Amazon seeks to secure long-term leadership in the rapidly evolving AI infrastructure market.