Decentralized Computing Networks Could Ease AI's GPU Shortage, Messari Says


    The boom in artificial intelligence is pushing the chip manufacturing industry to its limits, leading to a shortage of GPUs, the fundamental processing units that power machine learning (ML) models.

    A decentralized computing network could offer an off-the-shelf solution, according to cryptocurrency research and data specialist Messari.

    Growing demand and GPU requirements

    A new report from Messari examines the challenges faced by chip makers such as Nvidia, which is struggling to keep up with demand in the wake of AI mania. The high cost and limited availability of chips raise concerns about the future deployment of AI applications.

    The AI industry relies on GPUs, which are “essential for training and querying ML models,” Messari said. Surging sales have outpaced manufacturers' capacity, leaving the chips in short supply.

    However, there may be light at the end of the tunnel, as solutions may already exist in the form of distributed computing networks.

    “Distributed computing networks offer a promising solution by connecting entities with idle computing power and alleviating GPU starvation,” Messari tweeted Wednesday.

    A number of cryptocurrency computing projects could step in to meet that demand.

    On the model training and fine-tuning side, Messari points to Gensyn and Together. On the model inference side, the projects Messari highlights include Giza, ChainML, Modulus Labs, and Bittensor.

    More general-purpose computing networks include Akash, Cudos, iExec, Truebit, Bacalhau, and Flux.

    Harnessing the power of idle GPUs would ease demand for high-end chips, lower costs, and improve accessibility for AI developers, Messari said.

    Lots of Chips

    A recent report from research firm TrendForce revealed that ChatGPT may require more than 30,000 Nvidia GPUs to efficiently process its training data.

    TrendForce's estimate is based on the processing power of the Nvidia A100, a graphics card priced between $10,000 and $15,000. Thanks to the high demand driven by ChatGPT, Nvidia could see revenue reaching $300 million.
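As a back-of-the-envelope check on those figures (taking the 30,000-card count and the $10,000–$15,000 per-card range at face value, both from the TrendForce report cited above):

```python
# Rough revenue estimate: 30,000 A100-class GPUs at $10k-$15k per card.
num_gpus = 30_000
price_low, price_high = 10_000, 15_000

revenue_low = num_gpus * price_low    # low end of the range
revenue_high = num_gpus * price_high  # high end of the range

print(f"${revenue_low:,} to ${revenue_high:,}")
# -> $300,000,000 to $450,000,000
```

The $300 million figure quoted in the report corresponds to the low end of the price range; at $15,000 per card the same order would be worth $450 million.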

    The demand for GPUs in AI is growing exponentially as ML models become more complex, requiring larger parameter models and increased computational power. The advent of transformers and their application to language modeling has further increased computational requirements, doubling demands every three to six months.
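To put the "doubling every three to six months" claim in perspective, a doubling period can be converted into an annualized growth factor (a small illustrative sketch; the function name is ours, not from the report):

```python
# Convert a doubling period (in months) into growth over one year.
# Doubling every p months means demand grows by 2 ** (12 / p) per year.
def growth_over_year(doubling_period_months: float) -> float:
    return 2 ** (12 / doubling_period_months)

print(growth_over_year(3))  # doubling quarterly -> 16.0x per year
print(growth_over_year(6))  # doubling twice a year -> 4.0x per year
```

In other words, the cited doubling rate implies computational requirements growing somewhere between 4x and 16x every year.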

    Political Tensions and GPU Supply Constraints

    A recent blog post examining distributed computing in AI and ML suggests that political tensions also contribute to constraining GPU supply. Semiconductor production relies on a complex stack of mechanical, physical, chemical, logistical, and commercial factors.

    Taiwan accounts for 63% of the semiconductor foundry market and is a hub of the global supply chain. However, geopolitical tensions between the United States and China bring uncertainty and potential threats to the semiconductor industry, highlighting the need to diversify supply chains.

    The blog also notes that cloud providers such as AWS, GCP, and Azure offer GPU rentals, but that these struggle with pricing and availability.

    The continued strain in relations between the United States and China therefore presents a significant opportunity for distributed computing networks.

