OpenAI and AWS announced a multi-year strategic partnership that enables OpenAI to run its advanced artificial intelligence (AI) workloads on AWS's industry-leading infrastructure, effective immediately.
Under the agreement, which represents a commitment of approximately US $38 billion over the next seven years, OpenAI will gain access to AWS's compute offerings, including hundreds of thousands of state-of-the-art NVIDIA GPUs and the capacity to scale to tens of millions of CPUs. AWS has a proven record of operating large-scale AI infrastructure, with clusters exceeding 500,000 chips in production.
The collaboration pairs AWS's cloud infrastructure leadership with OpenAI's innovations in generative AI, ensuring millions of users continue to benefit from tools such as ChatGPT. Production deployment is already underway, with all capacity targeted to be in place by the end of 2026 and further expansion planned for 2027 and beyond.
The infrastructure AWS is building for OpenAI features Amazon EC2 UltraServers, which cluster NVIDIA GB200 and GB300 GPUs over ultra-low-latency networking. The architecture is designed to support both inference workloads (serving models such as ChatGPT) and next-generation model training with maximum efficiency, performance and scalability.
Commenting on the milestone, OpenAI co-founder and CEO Sam Altman said: “Scaling frontier AI requires massive, reliable compute. Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
Matt Garman, CEO of AWS, added: “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions. The breadth and immediate availability of optimised compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”
Earlier this year, OpenAI’s open-weight foundation models became available on Amazon Bedrock, making these models accessible to millions of AWS customers. OpenAI has already become one of the most popular model providers on Amazon Bedrock, with thousands of customers including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health leveraging OpenAI models for agentic workflows, coding, scientific analysis and mathematical problem-solving.
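For developers, the Bedrock integration means OpenAI's open-weight models can be called through the standard AWS SDK like any other Bedrock-hosted model. The sketch below shows a minimal invocation using boto3's Converse API; the model ID and region are assumptions for illustration, so check the Bedrock model catalog in your account for the exact identifier available to you.

```python
# Minimal sketch: calling an OpenAI open-weight model hosted on Amazon Bedrock
# through the Bedrock Converse API (boto3). Model ID and region are assumptions.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")  # region is an assumption

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # assumed identifier for an OpenAI open-weight model
    messages=[
        {
            "role": "user",
            "content": [{"text": "Outline an agentic workflow for triaging support tickets."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant reply under output.message.content.
for block in response["output"]["message"]["content"]:
    print(block.get("text", ""))
```

The same call pattern works across Bedrock model providers, which is what makes it straightforward for the thousands of existing Bedrock customers mentioned above to adopt OpenAI models for coding, analysis and agentic workflows.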