    Nvidia Fuels the Cut-Throat AI Race With Its $10k A100 Chip

Nvidia has jumped on the AI bandwagon, with its A100 chips powering most of the industry’s applications in the race for AI supremacy. Tech giants such as Microsoft and Google are racing to incorporate AI technology into their search engines, but the likes of Stability AI and OpenAI are already ahead.

OpenAI released ChatGPT in November to immediate fanfare, with other tech companies trying to emulate it or come up with rival products. Efforts like this require immense processing power, and Nvidia’s A100, one of the most important tools in the AI industry, has helped supply it.

    Also read: US Copyright Office says AI-generated images cannot be copyrighted

    Today’s AI Operating Systems

Nathan Benaich is an investor who publishes a newsletter focused on AI.

    According to New Street Research, Nvidia owns 95% of the total market for graphics processors that can be used for machine learning.

    “Nvidia AI is essentially the operating system for today’s AI systems,” Nvidia CEO Jensen Huang said during an earnings call with analysts on Wednesday.

The A100 is well suited to the machine learning models powering tools like ChatGPT, Bing AI, and Stable Diffusion. It can perform many simple calculations simultaneously, which is important for training and running neural network models.
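The parallelism described above can be illustrated with a toy example: a neural-network layer is dominated by large matrix multiplications, i.e. huge numbers of simple multiply-adds that can all run at once. A minimal NumPy sketch (a CPU stand-in for what a GPU like the A100 does at far larger scale; the sizes and names here are purely illustrative):

```python
import numpy as np

# A toy fully connected layer: y = x @ W + b.
# One matrix multiply bundles millions of independent multiply-adds,
# exactly the kind of work a GPU executes in parallel.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 512))   # batch of 64 input vectors
W = rng.standard_normal((512, 256))  # layer weights
b = np.zeros(256)

y = x @ W + b   # ~64 * 256 * 512 ≈ 8.4M multiply-adds in a single call
print(y.shape)  # (64, 256)
```

Training a large language model repeats operations like this trillions of times, which is why hardware that parallelizes them matters so much.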

While companies like Google and Amazon are developing their own chips specifically designed for AI workloads, the report’s compute index shows that AI hardware remains heavily concentrated on Nvidia.

As of December, more than 21,000 open-source AI papers said they used Nvidia chips. Most researchers included in the report used the V100, which Nvidia first released in 2017.

    Nvidia’s future is definitely AI

Nvidia is best known for its graphics processing units (GPUs) and derives most of its revenue from these dedicated chips. The company designs and sells GPUs for gaming, cryptocurrency mining, and professional applications, as well as chip systems for use in vehicles, robotics, and other tools.

Overall sales fell 21%, according to the fourth-quarter earnings report released last Wednesday. Investors nevertheless pushed the stock up about 14% the next day, largely on the strength of the AI business.

According to the earnings report, the AI chip business (the data center segment) grew 11% to $3.62 billion during the fourth quarter. The company attributed the growth to US cloud service providers buying more of its products.

The stock is up 65% for the year ending February 23. Huang said AI is at an “inflection point,” with companies of all sizes buying chips to develop machine learning software.

    “The versatility and capabilities of generative AI are forcing companies around the world to develop and deploy AI strategies,” said Huang.

“Activity around the AI infrastructure that we built, and inferencing using Hopper and Ampere to run large language models, has just gone through the roof in the last 60 days,” he added.

    AI as a service is underway

The company also announced that it will launch a cloud-based AI-as-a-service offering. This will allow small businesses to rent Nvidia’s processing power to train AI models, including the kinds that power ChatGPT.

Brett Simpson, co-founder of Arete Research, told Yahoo Finance Live that the opportunity goes well beyond ChatGPT.

“At the end of the day, the growth in AI is much stronger than in traditional computing. It goes far beyond chatbots,” Simpson said.

Meanwhile, Daniel Howley, technology editor at Yahoo Finance, said, “For now, Nvidia’s future is directly tied to AI.”

Bullish sentiment on the stock

The hype around AI, ChatGPT, and Nvidia’s ability to monetize both has drawn positive remarks from Wall Street giants such as Goldman Sachs.

Goldman Sachs said it had been “wrong” to sit on the sidelines in anticipation of a setback in the company’s fundamentals, upgrading the stock from “neutral” to “buy” in a note on Thursday.

“Accelerating AI development and adoption across hyperscalers and enterprises will strengthen the company’s leadership position, as customers with a sense of urgency rely on scalable, readily available solutions,” Goldman said.

Nvidia’s A100 was first introduced in 2020, before the more expensive H100 arrived in 2022.

Nvidia says the H100 is among the first of its data center GPUs to be optimized for transformers, the increasingly important architecture underpinning many modern AI applications.
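“Transformers” here refers to the attention-based architecture behind models like ChatGPT. Its core operation, scaled dot-product attention, is again dominated by matrix multiplications, which is why it maps so well onto GPU hardware. A minimal NumPy sketch (shapes and names are illustrative, not Nvidia’s or any model’s actual implementation):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # how relevant each key is to each query
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V             # weighted mix of the value vectors

rng = np.random.default_rng(1)
Q = rng.standard_normal((8, 16))   # 8 tokens, head dimension 16
K = rng.standard_normal((8, 16))
V = rng.standard_normal((8, 16))
out = attention(Q, K, V)
print(out.shape)                   # (8, 16)
```

Both the `Q @ K.T` and `weights @ V` steps are plain matrix multiplies, the workload the H100’s transformer-oriented hardware is built to accelerate.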

    Nvidia says it wants to speed up AI training by over a million percent. Talk about warp speed.
