OpenAI Uses Google AI Chips for Product Power

OpenAI rents Google AI chips to support ChatGPT and other products.

OpenAI has recently begun renting Google’s artificial intelligence chips to power ChatGPT and its other products, according to a source familiar with the matter. Although OpenAI remains one of the largest purchasers of Nvidia GPUs, this marks a notable shift. Traditionally, the company has relied on Nvidia’s chips both for model training and for inference, the computing process in which an AI system makes predictions based on previously learned information.

Earlier this month, Reuters exclusively reported OpenAI’s plan to add Google Cloud services to scale its growing computing demands. This development highlights a surprising collaboration between two major players in the AI sector. While OpenAI is supported by Microsoft, this move signals an expansion beyond its backer’s infrastructure. As OpenAI increases usage of Google’s tensor processing units (TPUs), it becomes clear that the company is exploring alternatives to reduce reliance on Nvidia’s more costly GPUs.

Strategic Shifts and Competitive Implications

For Google, this deal reflects a broader strategy. The company is making its TPUs, previously used only internally, available to external clients. This has attracted not just OpenAI but also companies like Apple, as well as startups such as Anthropic and Safe Superintelligence, both founded by former OpenAI executives. Google’s decision to open its hardware ecosystem has therefore helped it gain traction in a competitive AI market.

Notably, OpenAI’s use of TPUs marks the company’s first significant adoption of non-Nvidia chips. The decision aligns with its broader ambitions, including plans to finalize its first custom chip design later this year. Although this diversification introduces new opportunities, it also comes with limitations. According to The Information, Google is not providing its most advanced TPUs to OpenAI. A Google Cloud employee confirmed this, reflecting the careful balance of cooperation and competition between the two firms.

Cost, Control, and Cloud Ambitions

OpenAI is betting that Google’s TPUs will help reduce inference costs. Since inference is a critical and ongoing part of AI deployment, any improvement in efficiency can have significant financial implications. At the same time, Google continues to strengthen its cloud business by leveraging its vertically integrated AI stack, from hardware like TPUs to cloud infrastructure.

While OpenAI did not respond to Reuters’ request for comment, and Google declined to speak on the matter, the move underscores evolving dynamics in the AI landscape. Transitioning to TPUs could reduce costs, broaden computing options, and pave the way for future independence from third-party chips, though with trade-offs in access and capability.

© 2024 The Technology Express. All Rights Reserved.