NVIDIA Blackwell GPUs deliver powerful performance with greatly reduced power consumption

2024-12-26 04:56
Nvidia says that training a 1.8-trillion-parameter model now requires only 2,000 Blackwell GPUs drawing 4 megawatts of power, whereas the same job previously took 8,000 Hopper GPUs and 15 megawatts. In addition, Nvidia has launched the GB200 AI superchip, which it claims delivers 30 times the performance of Hopper on large language model inference workloads while reducing cost and energy consumption by up to 25 times.
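A quick back-of-envelope check of the training figures quoted above (the inputs are NVIDIA's reported claims, not independent measurements) shows what the cluster-level numbers imply per GPU:

```python
# NVIDIA's claimed cluster sizes and power draws for training
# a 1.8T-parameter model (figures as reported, not measured).
blackwell_gpus, blackwell_mw = 2_000, 4    # Blackwell cluster
hopper_gpus, hopper_mw = 8_000, 15         # Hopper cluster

gpu_reduction = hopper_gpus / blackwell_gpus      # 4.0x fewer GPUs
power_reduction = hopper_mw / blackwell_mw        # 3.75x less total power

# Per-GPU draw implied by the claims, in kilowatts
blackwell_kw = blackwell_mw * 1_000 / blackwell_gpus   # 2.0 kW per GPU
hopper_kw = hopper_mw * 1_000 / hopper_gpus            # 1.875 kW per GPU

print(f"{gpu_reduction:.2f}x fewer GPUs, {power_reduction:.2f}x less power")
print(f"Implied per-GPU draw: Blackwell {blackwell_kw:.2f} kW "
      f"vs Hopper {hopper_kw:.3f} kW")
```

The interesting takeaway is that each Blackwell GPU actually draws slightly *more* power than a Hopper GPU under these figures; the overall 3.75x power saving comes from needing a quarter as many GPUs for the same training job.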