The Nvidia DGX-2 isn't the first off-the-shelf Nvidia server to be aimed at AI. That honour goes to the DGX-1, built around a pair of Intel Xeon processors and Nvidia's own AI-optimised Tesla V100 Volta-architecture GPUs. The DGX-2 continues that approach, but instead of eight Tesla V100s joined using Nvidia's NVLink bus, it packs 16 of these magnificent GPUs connected by its more scalable NVSwitch technology. According to Nvidia, this configuration allows the DGX-2 to handle deep learning and other demanding AI and HPC workloads up to ten times faster than its smaller sibling.
Although announced at the same time, the bigger DGX-2 has taken a further six months to show up. One of the first to make it to the UK was installed in the labs of Nvidia partner Boston Limited, which asked if we'd like to take a look. We did, and here's what we found.
As well as performance, size is a big differentiator with the DGX-2, which has the same crackle-finish gold bezel as the DGX-1 but is physically a whole lot bigger, weighing in at 154.2kg (340lbs) compared to 60.8kg (134lbs) for the DGX-1, and consuming 10 rack units rather than three.
It's also worth noting that the DGX-2 draws a great deal more power than its little brother, requiring up to 10kW at full tilt, rising to 12kW for the recently announced DGX-2H model (about which more shortly). The picture below shows the power arrangements needed at Boston to keep this little monster happy. Cooling, likewise, will need careful consideration, especially where more than one DGX-2 is deployed or where it's installed alongside other equipment in the same rack.
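To put those power figures in cooling terms, here's a rough back-of-the-envelope sketch. It assumes essentially all electrical input ends up as heat in the rack, and uses the standard conversion of 1W ≈ 3.412 BTU/hr; the function name is ours, not anything from Nvidia's documentation.

```python
# Rough cooling-load estimate for DGX-2 deployments.
# Assumption: effectively all electrical draw is dissipated as heat,
# so cooling load (BTU/hr) ≈ watts x 3.412 (standard conversion factor).

def cooling_load_btu_hr(power_kw: float) -> float:
    """Convert a sustained electrical draw in kW to a cooling load in BTU/hr."""
    return power_kw * 1000 * 3.412

# One DGX-2 at full tilt (10kW) versus the 12kW DGX-2H.
print(round(cooling_load_btu_hr(10)))  # 34120
print(round(cooling_load_btu_hr(12)))  # 40944
```

In other words, a single fully loaded DGX-2 needs roughly as much cooling as a small air-conditioning plant can supply, which is why rack planning matters when several are co-located.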
Delivering that power is a set of six hot-swap, redundant PSUs which slide in at the rear of the chassis, along with the various modules that make up the rest of the system. Cooling, meanwhile, is handled by an array of ten fans located behind the front bezel, with room on either side for 16 2.5in storage devices in two banks of eight.
Nvidia includes eight 3.84TB Micron 9200 Pro NVMe drives as part of the base configuration, equating to just over 30TB of high-performance storage. This, however, is primarily for local data, with additional storage on the main motherboard for the OS and application code. It also leaves eight bays empty to add more storage if needed. Beyond that, the DGX-2 bristles with high-bandwidth network interfaces to connect to yet more capacity and to build server clusters if required.
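The quoted capacity is easy to verify. A quick sketch of the arithmetic, using only figures stated above (eight 3.84TB drives fitted, 16 bays in total):

```python
# Check the quoted storage figures for the DGX-2 base configuration.
drives_fitted = 8      # Micron 9200 Pro NVMe drives supplied as standard
capacity_tb = 3.84     # per-drive capacity in TB
total_bays = 16        # two banks of eight behind the front bezel

base_tb = drives_fitted * capacity_tb
print(base_tb)  # 30.72 -> "just over 30TB", as quoted

# If the eight empty bays were populated with the same drives:
max_tb = total_bays * capacity_tb
print(max_tb)  # 61.44
```

So filling the remaining bays with matching drives would roughly double local storage to around 61TB, before counting any capacity reached over the network interfaces.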