
Nvidia debuts next-generation Blackwell AI chip at GTC 2024


On Monday, Nvidia (NVDA) CEO Jensen Huang took the wraps off the company’s highly anticipated Blackwell graphics processing unit (GPU) at its annual GTC conference in San Jose, Calif.

Blackwell is the successor to Nvidia’s already highly coveted H100 and H200 GPUs and, according to the company, the world’s most powerful chip. The H100 and H200 have become the go-to GPUs for AI applications, helping rocket Nvidia’s data center revenue higher over the past few quarters.

In its latest quarter alone, the company reported data center revenue of $18.4 billion. To put the segment’s growth into perspective, Nvidia reported annual revenue of $27 billion for all of 2022.

“For three decades we’ve pursued accelerated computing with the goal of enabling transformative breakthroughs like deep learning and AI,” Huang said in a statement.

“Generative AI is the defining technology of our time. Blackwell GPUs are the engine to power this new industrial revolution. Working with the most dynamic companies in the world, we will realize the promise of AI for every industry.”

Like the prior Hopper generation, Blackwell will be available as a standalone GPU. Alternatively, two Blackwell GPUs can be paired with Nvidia’s Grace central processing unit (CPU) to create what the company calls the GB200 Superchip.

That setup, the company says, will offer up to 30x the performance of the Nvidia H100 GPU for large language model inference workloads while cutting energy consumption by up to 25x. Those energy savings are an important part of the story.

Nvidia customers, including Microsoft (MSFT), Amazon (AMZN), Google (GOOG, GOOGL), Meta (META), and Tesla (TSLA), are currently using or actively developing their own in-house AI chips as alternatives to Nvidia’s offerings. Part of the reason is to avoid paying the tens of thousands of dollars each of Nvidia’s chips is estimated to cost. The other is that Nvidia’s chips are especially power-hungry.

By talking up its energy savings with the Grace Blackwell Superchip, Nvidia is speaking directly to its customers’ concerns.

Nvidia says Amazon, Google, Microsoft, and Oracle (ORCL) will be among the first companies to start offering access to Blackwell chips through their cloud platforms.

In addition to the Blackwell GPU and GB200 Superchip, Nvidia also debuted its DGX SuperPOD supercomputer system. The DGX SuperPOD is made up of eight or more DGX Grace Blackwell 200 (GB200) systems, each of which contains 36 GB200 Superchips connected to run as a single computer. Nvidia says customers can scale the SuperPOD up to support tens of thousands of GB200 Superchips depending on their needs.
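For a sense of scale, here is a rough back-of-the-envelope tally of a baseline eight-system SuperPOD, using only the figures cited above (two Blackwell GPUs and one Grace CPU per GB200 Superchip). It is illustrative math, not an official Nvidia spec sheet.

```python
# Back-of-the-envelope tally of a baseline DGX SuperPOD configuration,
# using only the figures cited in this article (illustrative, not an
# official Nvidia specification).

DGX_GB200_SYSTEMS = 8        # "eight or more" DGX GB200 systems per SuperPOD
SUPERCHIPS_PER_SYSTEM = 36   # 36 GB200 Superchips per DGX GB200 system
GPUS_PER_SUPERCHIP = 2       # each GB200 pairs two Blackwell GPUs...
CPUS_PER_SUPERCHIP = 1       # ...with one Grace CPU

superchips = DGX_GB200_SYSTEMS * SUPERCHIPS_PER_SYSTEM
blackwell_gpus = superchips * GPUS_PER_SUPERCHIP
grace_cpus = superchips * CPUS_PER_SUPERCHIP

print(f"GB200 Superchips: {superchips}")      # 288
print(f"Blackwell GPUs:   {blackwell_gpus}")  # 576
print(f"Grace CPUs:       {grace_cpus}")      # 288
```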


The DGX SuperPOD also gets a new liquid-cooled, rack-scale architecture, meaning the system is cooled by fluid circulating through a series of pipes and radiators rather than by fan-based air cooling, which can be less efficient and more energy-intensive.

The Blackwell GPU and GB200 Superchip will be Nvidia’s new top-of-the-line offerings for AI training and inference, which means they’ll be in high demand the moment they hit the market.

But competitors AMD (AMD) and Intel (INTC) aren’t sitting idly by; they are almost certainly working on their own chips in hopes of eventually catching Nvidia.

Daniel Howley is the tech editor at Yahoo Finance. He’s been covering the tech industry since 2011. You can follow him on Twitter @DanielHowley.

