Nvidia is building the most powerful supercomputer ever for AI

in #science · 4 years ago

Nvidia and the University of Florida are jointly building an AI supercomputer for academia that will deliver 700 petaflops of AI computing power.

The new system will extend the capabilities of the university's current HiPerGator machine with Nvidia's DGX SuperPOD architecture. It is worth emphasizing that between 2012 and 2018, the amount of compute used in the largest AI training runs grew more than 300,000-fold.

Experts indicate that the compute used for such training doubles roughly every 3.5 months, far outpacing the rate predicted by Moore's law.
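
As a rough back-of-the-envelope illustration of that gap, the sketch below compares the growth implied by each doubling rate. The 72-month window and the 24-month Moore's-law doubling period are assumptions for illustration, not figures from the article:

```python
# Rough comparison of compute growth under the two doubling rates mentioned above.
# Assumptions: a 72-month window (2012-2018) and a ~24-month Moore's-law doubling period.

months = 72                    # 2012 -> 2018
ai_doubling_months = 3.5       # doubling period cited for large AI training runs
moore_doubling_months = 24.0   # classic ~2-year Moore's-law doubling period

ai_growth = 2 ** (months / ai_doubling_months)
moore_growth = 2 ** (months / moore_doubling_months)

print(f"Growth at a 3.5-month doubling time: ~{ai_growth:,.0f}x")
print(f"Growth at a 24-month doubling time:  ~{moore_growth:,.0f}x")
```

Strictly, a 3.5-month doubling sustained over six years implies growth well beyond the 300,000x figure quoted above, so these numbers should be read as orders of magnitude; the point is that the gap versus a 24-month doubling is enormous.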

This demonstrates the power of the technology and the enormous interest of companies and universities in such systems. Nvidia says the upgraded HiPerGator will allow students and young scientists to advance AI research.

They will be able to run computations supporting breakthrough projects in automotive, medicine and environmental protection.

The University of Florida intends to train up to 30,000 students in AI by 2030.

We are talking about a real breeding ground for specialists who will push the development of artificial intelligence sharply forward. As part of the upgrade, HiPerGator will gain 140 DGX A100 systems powered by 1,120 NVIDIA A100 Tensor Core GPUs, paired with 4 petabytes of DDN storage and linked by 15 kilometers of optical cable.
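
A quick sanity check on those figures (using nothing beyond the numbers quoted above) shows that the GPU count works out to exactly eight A100s per DGX system, which matches the DGX A100 description below:

```python
# Consistency check of the HiPerGator upgrade figures quoted above.

dgx_systems = 140    # DGX A100 systems added to HiPerGator
total_gpus = 1120    # NVIDIA A100 Tensor Core GPUs in total

gpus_per_system = total_gpus / dgx_systems
print(f"A100 GPUs per DGX A100 system: {gpus_per_system:.0f}")  # -> 8
```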

Each DGX A100 features eight 7 nm A100 Tensor Core GPUs, providing 320 gigabytes of combined GPU memory and the latest high-speed Mellanox HDR 200 Gbps interconnects.

According to Nvidia, each A100 GPU packs 54 billion transistors, and a single DGX A100 system delivers 5 petaflops of AI performance.
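
Those per-system numbers also line up with the headline figure for the whole machine: 140 DGX A100 systems at 5 petaflops each give the 700 petaflops quoted at the top of the article. A simple tally, using only numbers already given above:

```python
# Aggregate AI performance implied by the per-system figure quoted above.

dgx_systems = 140          # DGX A100 systems in the upgraded HiPerGator
petaflops_per_system = 5   # AI petaflops per DGX A100, per Nvidia

total_petaflops = dgx_systems * petaflops_per_system
print(f"Total AI performance: {total_petaflops} petaflops")  # -> 700
```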

Recall that last year Nvidia also unveiled what it billed as the world's smallest AI supercomputer, the Jetson Xavier NX: it measures just 70 by 45 millimeters and impresses with its capabilities.

It is designed to serve as the brain of autonomous robots, cars and Internet of Things devices.

The single-board module carries a low-power version of the Tegra Xavier chip, 8 GB of LPDDR4x memory, 16 GB of eMMC 5.1 storage, a gigabit network interface and a USB 3.1 controller.
