AWS and OpenAI. Picture: AWS
Amazon Web Services (AWS) and OpenAI have announced a multi-year strategic partnership worth $38 billion, under which OpenAI will run and scale its core artificial intelligence (AI) workloads on AWS infrastructure, starting immediately.
Under the agreement, which will continue to grow over the next seven years, OpenAI gains access to AWS compute comprising hundreds of thousands of state-of-the-art NVIDIA GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads.
Infrastructure
AWS has experience running large-scale AI infrastructure securely and reliably, with clusters topping 500,000 chips.
“Scaling frontier AI requires massive, reliable compute,” said OpenAI co-founder and CEO Sam Altman. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
The rapid advancement of AI technology has created unprecedented demand for computing power.
“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS.
“The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”
AI processing
The infrastructure AWS is building for OpenAI is architected for AI processing efficiency and performance.
Clustering the NVIDIA GPUs, both GB200s and GB300s, via Amazon EC2 UltraServers on the same network enables low-latency communication across interconnected systems, allowing OpenAI to run workloads efficiently.
The clusters are designed to support various workloads, from serving ChatGPT inference to training next-generation models, with the flexibility to adapt to OpenAI's evolving needs.
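The article does not describe how such clusters are provisioned, and the dedicated GB200/GB300 UltraServer capacity in this deal is not something customers set up themselves. As a rough, non-authoritative sketch of the general idea of co-locating GPU instances on the same low-latency network segment, the snippet below uses a standard EC2 cluster placement group via boto3; the group name, AMI, instance type, and counts are placeholders for illustration only.

```python
import boto3

# Minimal sketch: co-locating GPU instances on a low-latency network segment
# using an EC2 cluster placement group. The AMI, instance type, and counts are
# placeholders; the GB200/GB300 UltraServer capacity described in the article
# is provisioned through dedicated arrangements, not through this API call.
ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.create_placement_group(
    GroupName="example-gpu-cluster",   # hypothetical name for illustration
    Strategy="cluster",                # pack instances close together on the network
)

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="p5.48xlarge",        # illustrative GPU instance type
    MinCount=2,
    MaxCount=2,
    Placement={"GroupName": "example-gpu-cluster"},
)
```

The design point the sketch illustrates is simply that instances placed in the same cluster group sit on the same high-bandwidth network, which is what keeps GPU-to-GPU latency low when a training or inference job spans many machines.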
OpenAI
Earlier this year, OpenAI's open-weight foundation models became available on Amazon Bedrock, bringing these additional model options to millions of customers on AWS.
OpenAI has quickly become one of the most popular publicly available model providers on Amazon Bedrock, with thousands of customers, including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health.
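For customers, these models are reached through the standard Amazon Bedrock runtime APIs rather than OpenAI's own endpoints. Below is a minimal sketch using boto3's Converse API; the model ID shown is an assumption and should be checked against the Bedrock model catalog for the region in use.

```python
import boto3

# Minimal sketch: calling an OpenAI open-weight model through Amazon Bedrock's
# Converse API. The model ID is an assumption; confirm the exact identifier in
# the Bedrock model catalog for your region before relying on it.
bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # assumed ID for the open-weight model
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarise the AWS and OpenAI partnership in one sentence."}],
        }
    ],
)

# The Converse API returns the assistant's reply under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```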

