Nvidia Unveils 'H200' with Advanced AI Capabilities


Chipmaker Nvidia has introduced the H200 chipset, which will boost performance for large language models (LLMs) and handle bigger AI systems.

The new chip is a significant upgrade from the H100, offering nearly double the performance along with expanded capabilities. Nvidia has indicated the upgraded chip will start rolling out in 2024.

The specifics

The new chip supersedes the current H100 chipset with notable upgrades, especially to its high-bandwidth memory, a key element in determining data processing speed.

The H200's headline change is its memory: it is the first GPU to ship with HBM3e technology, carrying 141 GB of memory at 4.8 terabytes per second, far exceeding the H100's 80 GB of capacity.
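As a rough sanity check on the figures above, the capacity jump can be computed directly. This is a minimal sketch using only the numbers quoted in this article; the H100's memory bandwidth is not stated here, so only capacity is compared.

```python
# Memory figures as quoted in the article.
h200_memory_gb = 141
h100_memory_gb = 80
h200_bandwidth_tb_s = 4.8  # HBM3e, per the article

# Relative capacity increase of the H200 over the H100.
capacity_ratio = h200_memory_gb / h100_memory_gb
print(f"H200 memory capacity: {capacity_ratio:.2f}x the H100")  # → 1.76x
```

That works out to roughly a 76% increase in on-package memory, which is the basis for the larger-model claims Nvidia makes for the chip.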

According to Nvidia, a bigger memory space and high-speed processing elements are meant to enable better and faster responses from AI services.

Nvidia did not name a memory supplier for the new chipset, but suppliers such as Micron Technology may have provided the memory, according to English Jagran.

As the AI race continues, the H200 chip was designed to address high demand for better efficiency and stronger capabilities.

Big names in the pipeline

Microsoft Azure, Oracle Cloud Infrastructure, Google Cloud, and Amazon Web Services make up the initial cloud service providers that will offer access to H200-based chips in the second quarter of 2024, in addition to specialty AI cloud providers Lambda, CoreWeave, and Vultr.

“When you look at what’s happening in the market, model sizes are rapidly expanding… It’s another [example] of us continuing to swiftly introduce the latest and greatest technology,” Dion Harris, head of data center product marketing at Nvidia, was quoted as saying.

According to the chipmaker, the new H200 will deliver a “performance boost that will include almost doubling the inference speed on Meta’s Llama 2.”


The significance

The new H200 comes amid US export restrictions on high-end chips, which could impact the chip-making giant.

This matters because an estimated 20% to 25% of Nvidia’s data center revenue comes from China. The company has paused new AI chip orders from the country and redirected its systems to other markets.

Following the announcement of the H200 chipset, Nvidia’s shares jumped by as much as 1.5%, reaching $486 on Nov. 13, which was $12 shy of its all-time high of $502.

On a year-to-date basis, the stock has jumped by over 250%. The company’s growth trajectory has also remained robust: it projects $16 billion in revenue for its fiscal third quarter, roughly a 170% increase over the previous year.

Nvidia controls about 80% of the global market for AI chips. Companies like ChatGPT maker OpenAI and Stability AI rely heavily on Nvidia technology.
