OpenAI is taking a page out of Big Tech’s playbook by reportedly building its own chips

OpenAI CEO Sam Altman. 

Building custom AI chips has long been the preserve of a select few tech companies — but OpenAI might be about to join the party.

The AI startup is in talks with semiconductor firms Broadcom and TSMC to design and produce its own AI chips, sources told Reuters on Wednesday.

OpenAI has reportedly assembled a team headed by former Google engineers to lead the project and has secured manufacturing capacity with TSMC, the world’s largest contract chipmaker, to build the first chips in 2026.

However, Reuters reported that OpenAI has backed away from ambitious plans to build its own network of chip factories, known as fabs.

OpenAI declined to comment when approached by B-17.

The move to design chips in-house would give OpenAI greater control of its supply chain and the chance to create chips better suited to its needs.

“By working in tandem with Broadcom, OpenAI can design chips that are specifically tailored to power its models, offering more speed and greater energy efficiency,” Kate Leaman, chief market analyst at AvaTrade, told B-17.

“Nevertheless, this collaboration doesn’t just concern efficiency — it’s also about control. Custom chips could result in less dependency on external suppliers and potentially lower costs,” she added.

It’s a play that echoes those made in the past by the likes of Amazon, Google, Microsoft, Meta, and Apple — underscoring how much of a force OpenAI has become since it launched in late 2015.

Tech giants go their own way

Chip design is a venture with high upfront costs, typically making it an option reserved for the biggest tech companies.

Amazon has used its own homegrown central processing unit (CPU) chips, known as Graviton, in its data centers since 2018. Amazon Web Services executive Rahul Kulkarni told B-17 this month that over 90% of AWS' biggest customers now use Graviton chips.

Earlier this year, Google unveiled Axion, its first custom Arm-based CPU, which the company says is designed to run data-center workloads, including AI training and serving, more efficiently.

Google has also been building tensor processing units (TPUs), chips designed specifically for AI workloads, since 2015, and renting them out via its cloud service. Apple even used Google's TPUs to train the models that power the iPhone's AI features.

Microsoft unveiled its own custom Maia AI chip in November 2023, and Meta followed suit, releasing the latest version of its Meta Training and Inference Accelerator (MTIA) chip in April.

CEO Mark Zuckerberg has vowed to spend heavily on building out Meta’s chip capacity, with the company saying it would spend at least $35 billion on AI infrastructure in 2024.

Chip diversification

In-house chip design comes with advantages that extend beyond customization.

OpenAI's move, which reportedly also involves adding AMD chips to its supply mix, would reduce its dependence on Nvidia, the market leader in AI chips.

Gil Luria, a senior software analyst at investment firm D.A. Davidson, said that the “over-reliance on Nvidia chips has caused bottlenecks for Microsoft, OpenAI, and others and has been extraordinarily expensive.”

Luria added that the AI chip market is too important for companies to rely on one supplier. “That is why Google and Amazon are investing so much in growing their own chip supply, Microsoft has developed new AI chips and evidently OpenAI is considering the same,” he said.

“We’ve seen with Meta and Alphabet that designing your own chip is one way of improving the power of your model,” Edward Wilford, a senior principal analyst at tech consultancy Omdia, told B-17.

“The fact that it makes them perhaps less reliant on Nvidia is certainly a bonus,” he added.

Soaring demand for AI chips

OpenAI’s CEO, Sam Altman, along with other Big Tech CEOs, has been outspoken about the need to increase chip supply.

The Wall Street Journal previously reported that Altman was seeking as much as $7 trillion to boost the world’s chip-building capacity and accelerate progress toward developing advanced artificial intelligence. However, the reported move was met with skepticism from industry leaders.

While it’s unclear how much OpenAI’s reported chip-building push will cost, creating custom AI chips doesn’t come cheap. Pierre Ferragu, an analyst at New Street Research, told The New York Times in January that Google spent an estimated $2 billion to $3 billion in 2023 building a million of its own AI chips.

Designing its own chips will likely put more pressure on OpenAI’s finances. A report from The Information earlier this month found that OpenAI expects to lose $44 billion between 2023 and 2028, and doesn’t expect to turn a profit before 2029.

However, the fast-growing company has deep pockets to pursue its goals, having raised a historic $6.6 billion from investors in early October.
