OpenAI Might Develop Its Own AI Chips Soon

With the chip shortage still an ongoing problem, tech companies are exploring developing their own chips to guarantee supply. Meta has been giving it a shot since May, and it looks like OpenAI is trying to do the same.


OpenAI Chips for AI Models

The Microsoft-backed AI company is among the fastest-growing AI firms in the industry, providing tools like ChatGPT and DALL-E 3. However, it is not exempt from the problem affecting most companies trying to develop AI technology.

With the chip shortage still making it hard to train AI models, OpenAI is exploring making its own chips, either by designing them internally or by acquiring an AI chip manufacturer to speed things along, as reported by TechCrunch.

OpenAI CEO Sam Altman has reportedly made the effort a priority within the company. If the ChatGPT maker succeeds, it will no longer have to rely on third-party GPU suppliers to develop AI models for its services.

Continuing to buy GPUs could prove expensive for the company, especially since GPU supply has been experiencing its own setbacks. To put the stakes into perspective, that hardware is what keeps OpenAI's services running smoothly.

With the shortage, AI tools and services could face delays and disruptions for their users. Nvidia won't be of much help in that department either, since, according to reports, the hardware giant's best AI chips won't be available again until 2024.

Bernstein analyst Stacy Rasgon estimates that if ChatGPT usage grew to a tenth of the scale of Google Search's traffic, OpenAI would be looking at about $48.1 billion in upfront GPU costs, plus about $16 billion in chips to keep things running smoothly.

It remains to be seen whether the AI company will succeed in the attempt, especially since others have tried without much to show for it. Meta, for instance, has been working on its own chip for months but still relies on third-party chips for its devices.

Read Also: OpenAI and Microsoft May Be Having a Few Partnership Issues

Meta's Attempts at an AI Chip

The social media company is still working on a new chip optimized for AI, called the Meta Training and Inference Accelerator, or MTIA. With it, Mark Zuckerberg aims to create an "opportunity to introduce AI agents to billions of people in ways that will be useful and meaningful."

Meta VP and Head of Infrastructure Santosh Janardhan says it will be the company's "in-house, custom accelerator chip family targeting inference workloads," providing greater compute power and efficiency tailored to Meta's workloads, as per The Verge.

If Meta succeeds in this endeavor, pairing MTIA chips with GPUs would deliver better performance, lower latency, and greater efficiency for each workload. The chip is set to come out in 2025.

Related: OpenAI Launches ChatGPT Enterprise

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
