OpenAI could be looking to cut its operational costs.
OpenAI, the creator of the hugely popular ChatGPT, could be looking at building its own chipsets to power its AI-driven infrastructure. Here's what we know so far.
OpenAI, the Microsoft-backed company behind ChatGPT, is reportedly considering making its own artificial intelligence chipsets and has reportedly even evaluated a potential acquisition target.
As per Reuters, the company is still in the planning stage and has not yet decided to go ahead with the move. But OpenAI has been held back by the high cost of the chips its AI-driven ambitions require, and it has been looking for ways to solve the shortages of those chips.
To that end, Reuters notes that OpenAI is considering partnering more closely with chipmakers such as NVIDIA, and may eventually diversify its suppliers beyond it. OpenAI CEO Sam Altman has previously stated openly that the company finds it tough to source the GPUs needed to run its infrastructure, a market currently dominated by NVIDIA.
With that said, the alleged move may also help OpenAI reduce its operational costs, as running ChatGPT is very expensive. "Each query costs roughly 4 cents, according to an analysis from Bernstein analyst Stacy Rasgon. If ChatGPT queries grow to a tenth the scale of Google search, it would require roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to keep operational," Reuters reported.
Earlier this year, SemiAnalysis told The Information that OpenAI could be burning through $700,000 per day. Moreover, as the company gradually makes GPT-4 mainstream and moves away from GPT-3.5, the LLM that the free version of ChatGPT currently runs on, things could become even more expensive for OpenAI.
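To get a feel for the scale of these figures, here is a rough back-of-envelope sketch of the per-query math. It uses only the Reuters-reported 4-cent cost per query; the Google search volume is an assumption for illustration (a commonly cited estimate of roughly 8.5 billion queries per day), and this does not attempt to reproduce Bernstein's full model behind the $48.1 billion GPU figure.

```python
# Back-of-envelope serving-cost estimate based on the Reuters-reported
# figure of ~4 cents per ChatGPT query (Bernstein's Stacy Rasgon).
# The Google query volume below is an ASSUMPTION for illustration only.

COST_PER_QUERY_USD = 0.04          # Reuters-reported estimate
GOOGLE_QUERIES_PER_DAY = 8.5e9     # assumed, commonly cited estimate

def daily_cost(queries_per_day: float,
               cost_per_query: float = COST_PER_QUERY_USD) -> float:
    """Naive daily serving cost: query volume times per-query cost."""
    return queries_per_day * cost_per_query

# Scenario from the article: ChatGPT at a tenth of Google's scale.
chatgpt_queries = GOOGLE_QUERIES_PER_DAY / 10

print(f"Daily cost:  ${daily_cost(chatgpt_queries):,.0f}")
print(f"Yearly cost: ${daily_cost(chatgpt_queries) * 365:,.0f}")
```

Even this crude estimate lands in the tens of millions of dollars per day, which is consistent in order of magnitude with the multi-billion-dollar annual chip spend Reuters describes.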
Given these pressures, pressing ahead with its own chips could be a natural move for OpenAI. But Reuters notes that it could take years before any such effort comes to fruition.