In 2024, the mainstream H100 is equipped with 80GB of HBM3. By 2025, mainstream chips such as NVIDIA's Blackwell Ultra and AMD's MI350 are expected to carry up to 288GB of HBM3e, roughly tripling per-unit HBM usage. With demand in the AI server market remaining strong, overall HBM supply is expected to double by 2025, which will in turn drive demand for CoWoS packaging and HBM.