Sam Altman, CEO of OpenAI, speaks during a conversation session with SoftBank Group CEO Masayoshi Son at the Transforming Businesses with AI event on February 3, 2025 in Tokyo.
Tomohiro Ohsumi | Getty Images
In November, after Nvidia’s latest earnings beat, CEO Jensen Huang bragged to investors about his company’s position in artificial intelligence and said of the hottest startup in the space: “Everything that OpenAI does today is powered by Nvidia.”
While it’s true that Nvidia maintains a dominant position in AI chips and is now the most valuable company in the world, competition is emerging, and OpenAI is doing its best to diversify as it pursues a historically aggressive expansion plan.
OpenAI on Wednesday announced a $10 billion deal with chipmaker Cerebras, a relatively new player in the space that is eyeing the public market. It was the latest in a string of transactions between OpenAI and processor companies as the startup seeks the computing power it needs to build large language models and run increasingly complex workloads.
Over the past year, OpenAI has committed to more than $1.4 trillion in infrastructure deals with companies including Nvidia, Advanced Micro Devices and Broadcom, on its way to a $500 billion private market valuation.
As OpenAI races to meet projected demand for its AI technology, it has expressed a desire to secure as much processing power as it can. Here are the key chip deals OpenAI has signed as of January, along with potential partners to watch in the future.
Nvidia
Nvidia co-founder and CEO Jensen Huang speaks during Nvidia Live at CES 2026 on January 5, 2026, ahead of the annual Consumer Electronics Show in Las Vegas.
Patrick T. Fallon | Afp | Getty Images
From the early days of building large language models, before the launch of ChatGPT and the start of the generative AI boom, OpenAI relied on Nvidia GPUs.
In 2025, that relationship reached another level. Nvidia, which had invested in OpenAI in late 2024, announced in September that it would commit $100 billion to support OpenAI as the startup builds and deploys at least 10 gigawatts of Nvidia systems.
A gigawatt is a measure of power, and 10 gigawatts is roughly equivalent to the annual energy consumption of 8 million U.S. households, according to a CNBC analysis of Energy Information Administration data. Huang said in September that 10 gigawatts would equate to 4 million to 5 million GPUs.
“It’s a giant project,” Huang told CNBC at the time.
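For readers who want to sanity-check those figures, the back-of-envelope arithmetic below is a rough sketch, not CNBC’s exact methodology. It assumes an average U.S. household uses roughly 10,800 kilowatt-hours of electricity per year (an approximation of EIA data) and treats the 10 gigawatts as continuous, around-the-clock draw.

```python
# Back-of-envelope sketch (not CNBC's methodology) comparing 10 gigawatts of
# continuous draw to U.S. household electricity use. The household figure
# (~10,800 kWh per year) is an assumed EIA-style average, and the GPU count
# uses Huang's 4 million to 5 million figure to back out implied power per GPU.

GIGAWATTS = 10
HOURS_PER_YEAR = 24 * 365                      # 8,760 hours
KWH_PER_HOUSEHOLD_PER_YEAR = 10_800            # assumed average U.S. household usage

# 10 GW running continuously for a year, expressed in kilowatt-hours (1 GW = 1,000,000 kW).
total_kwh = GIGAWATTS * 1_000_000 * HOURS_PER_YEAR

households = total_kwh / KWH_PER_HOUSEHOLD_PER_YEAR
print(f"Equivalent households: {households / 1e6:.1f} million")   # ~8.1 million

# Implied facility power per GPU if 10 GW maps to 4 million to 5 million GPUs.
for gpus in (4_000_000, 5_000_000):
    print(f"{gpus / 1e6:.0f}M GPUs -> ~{GIGAWATTS * 1e9 / gpus / 1000:.1f} kW per GPU")
```

Under those assumptions, the math lands near 8 million households, and Huang’s 4 million to 5 million GPU figure implies roughly 2 to 2.5 kilowatts of facility power per GPU.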
The first phase of the OpenAI and Nvidia project is expected to come online in the second half of this year on Nvidia’s Vera Rubin platform. However, during Nvidia’s November quarterly earnings call, the company said there was “no guarantee” its agreement with OpenAI will move from announcement to a formal contract.
Nvidia’s initial $10 billion investment will be made when the first gigawatt is completed, and the investments will be based on then-current valuations, as CNBC previously reported.
AMD
Lisa Su, Advanced Micro Devices Inc. chairman and chief executive officer, demonstrates the AMD Instinct MI455X GPU during the CES 2026 event in Las Vegas, January 5, 2026.
Bloomberg | Bloomberg | Getty Images
OpenAI in October announced plans to deploy six gigawatts of AMD GPUs over multiple years and multiple generations of hardware.
As part of the deal, AMD granted OpenAI warrants for up to 160 million shares of AMD common stock, which could represent roughly a 10% stake in the company. The warrants vest on milestones tied to both deployment volume and AMD’s share price.
The companies said they plan to produce the first gigawatt of chips in the second half of 2026, and said the deal would be worth billions of dollars, without disclosing an exact amount.
“You really need these kinds of partnerships that bring the ecosystem together, you know, so we can really get the best technologies,” AMD CEO Lisa Su told CNBC at the time of the announcement.
Altman planted the seeds of the deal in June, when he appeared onstage with Su at an AMD launch event in San Jose, California, and said OpenAI planned to use AMD’s latest chips.
Broadcom
Broadcom CEO Hock Tan.
Lucas Jackson | Reuters
Later that month, OpenAI and Broadcom went public with a collaboration that had been in the works for more than a year.
Broadcom calls its custom AI chips XPUs and has so far counted only a handful of customers. But its pipeline of potential deals has generated so much enthusiasm on Wall Street that Broadcom is now worth more than $1.6 trillion.
OpenAI said it is designing its own AI chips and systems, which will be developed and deployed with Broadcom. The companies have agreed to deploy 10 gigawatts of these custom AI accelerators.
In an October release, the companies said Broadcom plans to begin deploying the AI accelerators and networking equipment in the second half of this year, with the goal of completing the rollout by the end of 2029.
But Broadcom CEO Hock Tan told investors during the company’s quarterly earnings call in December that it does not expect much revenue from the OpenAI partnership in 2026.
“We appreciate that this is a multi-year journey that will last until 2029,” Tan said. “I call it an agreement, an alignment of where we’re going.”
OpenAI and Broadcom did not disclose financial terms of the deal.
Cerebras
Andrew Feldman, founder and CEO of Cerebras Systems, speaks at the Collision conference in Toronto on June 20, 2024.
Ramsey Cardi | Sportsfile | Collision | Getty Images
OpenAI on Wednesday announced a deal to deploy 750 megawatts of Cerebras AI chips, which will come online in multiple tranches through 2028.
Cerebras builds large wafer-scale chips that it says can deliver responses 15 times faster than GPU-based systems, according to a release. The company is much smaller than Nvidia, AMD and Broadcom.
OpenAI’s deal with Cerebras is worth more than $10 billion and could be a boon for the chipmaker ahead of a potential debut in the public markets.
“We’re excited to partner with OpenAI, bringing the world’s leading AI models to the world’s fastest AI processor,” Cerebras CEO Andrew Feldman said in a statement.
Cerebras is in dire need of marquee customers. The company in October withdrew its plans for an IPO, days after it announced it had raised more than $1 billion in a funding round. It had filed to go public a year earlier, but its prospectus showed a heavy reliance on a single customer in the United Arab Emirates, Microsoft-backed G42, which is also an investor in Cerebras.
Potential partners
Sam Altman, CEO of OpenAI, speaks during a media tour of the Stargate data center in Abilene, Texas on September 23, 2025. Stargate is a collaboration between OpenAI, Oracle and SoftBank, backed by President Donald Trump, to build data centers and other infrastructure for artificial intelligence across the US.
Kyle Grillot | Bloomberg | Getty Images
So where does that leave Amazon, Google and Intel, which all have their own AI chips?
In November, OpenAI signed a $38 billion cloud deal with Amazon Web Services. OpenAI will run workloads on existing AWS data centers, but the cloud provider plans to build additional infrastructure for the company as part of the agreement.
Amazon is also in talks to invest more than $10 billion in OpenAI, as CNBC previously reported.
OpenAI may decide to use Amazon’s AI chips as part of those investment discussions, but nothing official has been determined, according to a person familiar with the matter who spoke on condition of anonymity because the discussions are confidential.
AWS announced its Inferentia chips in 2018 and the latest generation of its Trainium chips in late 2025.
Google Cloud provides computing power to OpenAI thanks to a deal quietly struck last year. But OpenAI said in June that it did not plan to use Google’s in-house chips, known as tensor processing units, in its products.
Intel is the biggest AI laggard among traditional chipmakers, which helps explain why the company recently received massive investments from the U.S. government and Nvidia. Reuters reported in 2024, citing people with knowledge of the discussions, that Intel had the chance several years earlier to invest in OpenAI and build hardware for the then-new startup, offering it a way to be less dependent on Nvidia.
Intel backed out of the deal, according to the Reuters report.
Intel in October announced a new data center GPU, codenamed Crescent Island, that it said is “designed to meet the growing demands of AI workloads” and offers high memory capacity and energy-efficient performance. The company said customer sampling is expected in the second half of 2026.
Wall Street will hear updates on Intel’s latest AI efforts when the company kicks off tech earnings season next week.
— CNBC’s Kif Leswing, MacKenzie Sigalos and Jordan Novet contributed to this report.


