OpenAI, the company behind ChatGPT, is reportedly taking steps to address the mounting challenges of artificial intelligence (AI) chip shortages and rising costs.
Creating its own custom AI chip would be a move that mirrors the path of big tech companies like Amazon and Google.
The sad state of GPUs
Currently, OpenAI relies heavily on graphics processing units (GPUs), supplied primarily by market leader Nvidia, which holds over 80% of the global market share. GPUs are particularly suited for AI applications because they can run huge numbers of operations in parallel.
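To make that concrete, the parallel workload in question is mostly large matrix multiplication, the core operation of neural networks. The short Python sketch below is only an illustration of that point, assuming PyTorch is installed and a CUDA-capable GPU is present; it has nothing to do with OpenAI's actual software stack.

```python
# Illustration: the core AI workload (matrix multiplication) parallelizes well on a GPU.
# Assumes the `torch` package is installed and, for the GPU path, a CUDA device exists.
import time
import torch

size = 4096
a = torch.randn(size, size)
b = torch.randn(size, size)

# Time the multiplication on the CPU.
start = time.perf_counter()
_ = a @ b
cpu_s = time.perf_counter() - start

if torch.cuda.is_available():
    # Move the same matrices to the GPU and time the identical operation.
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()
    start = time.perf_counter()
    _ = a_gpu @ b_gpu
    torch.cuda.synchronize()
    gpu_s = time.perf_counter() - start
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
else:
    print(f"CPU: {cpu_s:.3f}s (no CUDA device found)")
```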
However, as global demand for AI-driven solutions grows, these critical chips are becoming harder to source. Prices have skyrocketed, and companies like OpenAI are feeling the strain.
CEO Sam Altman hasn’t been quiet about the burden either. ChatGPT, OpenAI’s flagship product, exemplifies the seriousness of the situation: each query it processes costs approximately 4 cents. To put this into perspective, if ChatGPT usage grew to one-tenth the scale of Google Search, OpenAI would need an initial investment of approximately $48.1 billion worth of GPUs, followed by roughly $16 billion per year.
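As a rough sanity check on how that per-query cost compounds, the back-of-the-envelope sketch below multiplies an assumed query volume by the reported ~4-cent figure. The daily query count is an illustrative assumption, not a number from this article, and the result is a serving-cost estimate rather than the up-front GPU purchase figure cited above.

```python
# Back-of-the-envelope: how per-query inference cost compounds at scale.
# The query volume is an illustrative assumption, not a reported figure.
cost_per_query_usd = 0.04          # ~4 cents per ChatGPT query, as reported
assumed_queries_per_day = 850e6    # hypothetical: roughly 1/10 of Google-scale search traffic

daily_cost = cost_per_query_usd * assumed_queries_per_day
annual_cost = daily_cost * 365

print(f"Daily serving cost:  ${daily_cost / 1e6:,.0f} million")
print(f"Annual serving cost: ${annual_cost / 1e9:,.1f} billion")
# This covers only inference serving; the ~$48.1 billion figure above refers to
# the up-front GPU purchases such traffic would require.
```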
To build or not to build: The OpenAI conundrum
Faced with these costs and the potential threat of GPU shortages, OpenAI is considering alternatives. One viable option is to create a custom AI chip tailored to its specific needs. Such a strategic shift would put OpenAI on the same path as Amazon and Google, both of which have pursued chip designs that better meet their operational demands.
But building chips is a formidable challenge. Beyond the technical complexity, there are significant financial implications: industry experts estimate that this effort could cost OpenAI hundreds of millions of dollars annually. And while acquisitions can speed up the process (as Amazon’s purchase of Annapurna Labs in 2015 did), they don’t eliminate the risks inherent in such ventures.
Beyond immediate needs
While the immediate concern is addressing GPU shortages and costs, the impact of this move extends beyond operational efficiency. Microsoft, one of OpenAI’s major investors, has been running a giant supercomputer with 10,000 Nvidia GPUs for the company since 2020. That setup can only carry OpenAI so far, because advancing AI models demand ever more computing power.
In addition, the AI chip market is dominated by a handful of players, above all Nvidia. Diversifying its chip sources, or owning its own chips, would increase OpenAI’s flexibility, reduce dependencies, and enable more customized and efficient AI solutions.
Solving the mystery of processing power
To understand the difficulty, it helps to appreciate how much computing power AI models consume. Training and fine-tuning them requires vast resources; OpenAI’s generative AI technology already runs on a supercomputer with 10,000 GPUs. These models are not only large but also continually changing: OpenAI’s models have grown significantly in recent years and are likely to keep doing so.
Every increase in parameter count raises the computational power required, and with it the cost. When a model reaches 175 billion parameters, as is the case with the model behind ChatGPT, the computational demands become enormous. The primary costs are acquiring the GPUs themselves, plus ongoing power, maintenance, and upgrades.
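To see why 175 billion parameters is "enormous" in practice, a quick memory estimate helps: just storing the weights in 16-bit precision takes hundreds of gigabytes, far more than any single GPU holds. The sketch below is a generic estimate under those stated assumptions, not a description of OpenAI's actual deployment.

```python
# Why a 175B-parameter model needs multiple GPUs just to hold its weights.
# Generic estimate; not a description of OpenAI's actual serving setup.
params = 175e9              # parameter count of a ChatGPT-class model
bytes_per_param = 2         # 16-bit (fp16/bf16) weights
gpu_memory_gb = 80          # a top-end data-center GPU, e.g. an 80 GB Nvidia A100/H100

weights_gb = params * bytes_per_param / 1e9
gpus_needed = -(-weights_gb // gpu_memory_gb)   # ceiling division

print(f"Weights alone: ~{weights_gb:.0f} GB")
print(f"Minimum GPUs just to hold the weights: {gpus_needed:.0f}")
# Real deployments need additional memory for activations, KV caches, and batching,
# so the practical GPU count is higher still.
```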
OpenAI’s potential foray into developing custom AI chips reflects its proactive approach to industry-wide challenges. While the initial motivation is based on addressing the current GPU shortage and high costs, the long-term implications could be game-changing.
Custom chips could enable more efficient AI models, reduce operating costs, and set a precedent for other AI-driven businesses. But like any pioneering effort, the path ahead is unpredictable and full of hurdles. The next steps OpenAI takes will help define the next stage of AI research.