Nvidia GTC: First manufacturers incorporate new H100 "Hopper" GPU accelerator

As part of the Nvidia GTC conference, two server manufacturers, Supermicro and Asus, presented new products built around Nvidia’s new H100 Tensor Core GPUs. Both companies are targeting the AI market. However, the new GPU servers are not yet available, and benchmarks, prices and delivery dates are still missing.

Supermicro has announced a total of 20 new configuration options with which the US manufacturer is expanding its Nvidia-certified server portfolio. According to Supermicro, the new servers are optimized for the H100 Tensor Core GPU, which Nvidia announced in spring and whose architecture is named Hopper.

Supermicro announces a wide range of rack modules with heights from 1U to 8U, including models for both the SXM form factor of the H100 and the PCIe variant. The latter are also delivered with Nvidia AI Enterprise, a cloud-native software suite for AI applications.

Supermicro is expanding its portfolio and incorporating H100 GPUs.

Supermicro CEO Charles Liang promises that customers can expect up to a 30x increase in AI inferencing performance “in certain AI applications” compared to previous GPU accelerator generations. In addition, an innovative airflow design in the new GPU servers is said to reduce fan speeds and thus power consumption, noise levels and total cost of ownership. However, these savings are unlikely to offset the additional power draw of the Hopper GPUs, which is around 75 percent higher than that of the Ampere predecessor.

Delivery of the new server generation has already begun as part of an early access program. On general availability, however, the manufacturer remains silent; the same applies to prices.

At the conference, Asus is primarily focusing on marketing AI success stories such as “multiple groundbreaking records” in AI benchmarks. Nevertheless, this manufacturer also announced in a press release that it intends to use the H100 Tensor Core GPUs, including software support from Nvidia AI Enterprise, in the future.

As far as technical details are concerned, Asus is even more reserved than its competitor Supermicro and only promises “unprecedented performance, scalability and security for every data center”. The new Asus servers are expected to be available before the end of the year, although neither technical data nor prices are known yet.

In May of this year, Nvidia had still spoken of its H100 GPUs becoming available from third-party manufacturers in early 2023.

In addition, Asus says it has been working closely with Nvidia since the announcement of the Grace CPU in order to offer the manufacturer’s new ARM processor in its own servers. However, Asus does not want to reveal any details until next year.


(jvo)
