OpenAI, a prominent AI startup, is seriously considering developing its own artificial intelligence (AI) chips. The move responds to the ongoing chip shortage, which has severely constrained the training of AI models. Surging demand for generative AI has worsened that shortage, limiting the availability of the GPU-based hardware that is fundamental to training advanced models such as ChatGPT, GPT-4, and DALL-E 3.
Discussions about AI chip strategy within OpenAI have been under way for more than a year. In that time, the firm has explored potential solutions to the scarcity, including acquiring an AI chip manufacturer or embarking on an in-house chip design effort. OpenAI's CEO, Sam Altman, has made acquiring additional AI chips a priority in order to sustain the company's operations.
The situation underscores both the soaring cost of running AI hardware and the scarcity of the advanced processors OpenAI's software depends on. One cost analysis estimates that each ChatGPT query costs about 4 cents; if query volume grew to even a fraction of Google's search scale, it would demand a staggering initial investment and substantial annual chip costs.
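The scale of those serving costs can be illustrated with a back-of-envelope calculation using the reported ~4-cents-per-query figure. The daily query volume below is a hypothetical assumption chosen for illustration, not a figure from the article:

```python
# Back-of-envelope estimate of annual inference costs at the reported
# ~$0.04-per-query figure. The query volume is an assumed, illustrative
# number -- not OpenAI data.

COST_PER_QUERY_USD = 0.04              # reported per-query cost estimate
ASSUMED_QUERIES_PER_DAY = 100_000_000  # hypothetical: a fraction of Google-scale search

annual_cost_usd = COST_PER_QUERY_USD * ASSUMED_QUERIES_PER_DAY * 365
print(f"Estimated annual serving cost: ${annual_cost_usd / 1e9:.2f} billion")
# -> Estimated annual serving cost: $1.46 billion
```

Even at this assumed volume, the math yields well over a billion dollars per year before accounting for the upfront hardware purchase, which helps explain why custom silicon is on the table.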
If OpenAI proceeds with developing its own AI chips, it would join a select group of tech giants, such as Google and Amazon, that seek control over the chip designs central to their operations. The undertaking would nonetheless be a major strategic initiative and a substantial investment, potentially costing hundreds of millions of dollars annually.