AWS and OpenAI Forge $38 Billion Strategic Partnership to Power the Next Era of Generative and Agentic AI

Andy Jassy, President and CEO of Amazon, announced the partnership on LinkedIn, highlighting AWS’s “unusual experience running large-scale AI infrastructure securely, reliably, and at scale.”

In the post, Jassy wrote that the new multi-year, strategic partnership will provide AWS's industry-leading infrastructure for OpenAI to run and scale ChatGPT inference, training, and agentic AI workloads. OpenAI will start using AWS's infrastructure immediately, he added, with all of the capacity expected to be deployed before the end of 2026 and the ability to expand in 2027 and beyond.

Amazon Web Services (AWS) and OpenAI have entered a multi-year, strategic partnership that will provide OpenAI with immediate and growing access to AWS’s industry-leading cloud infrastructure. Under the $38 billion agreement, AWS will deliver state-of-the-art compute—including hundreds of thousands of NVIDIA GPUs and the capacity to scale to tens of millions of CPUs—enabling OpenAI to accelerate its frontier research, model training, and inference workloads.

The new deployment, expected to be fully operational by the end of 2026, will serve as a backbone for OpenAI’s most advanced systems, including ChatGPT. AWS’s EC2 UltraServers, built around the latest GB200 and GB300 GPUs, are engineered for low-latency, large-scale AI performance, and designed to support both training and inference with seamless scalability through 2027 and beyond.

“Scaling frontier AI requires massive, reliable compute,” said Sam Altman, co-founder and CEO of OpenAI. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS. “The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”

AWS brings decades of experience operating hyperscale infrastructure securely and reliably, running clusters that exceed 500,000 chips. This collaboration underscores AWS’s growing role as the infrastructure foundation for frontier AI model providers seeking performance, scalability, and security at global scale.

Earlier this year, OpenAI’s open-weight foundation models became available on Amazon Bedrock, giving AWS customers access to OpenAI models for agentic workflows, scientific analysis, and generative applications across industries. OpenAI is already one of the most utilized model providers on Bedrock, with adopters including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health.
